Dataflow Gen2 - Create tables in other data warehouse schemas
When using Dataflow Gen2 to create new tables in a Data Warehouse, I would like the ability to select which warehouse schema the table will be created in.
Today, Dataflow Gen2 can only create new tables in the dbo schema.
Make it easier to alter columns in Lakehouse tables
I wish there were an easier and more efficient way to make changes to existing columns in a Fabric Lakehouse table.
Currently, if we want to alter or drop existing columns in a table, the best option for me is to use overwriteSchema. However, this requires me to rewrite all th...
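For context, the overwriteSchema workaround mentioned above looks roughly like this in a Fabric notebook (a sketch only; the table and column names are made up, and `spark` is the session a Fabric notebook provides):

```python
# Current workaround: read the whole table, transform it, then rewrite
# everything with overwriteSchema - a full table rewrite just to drop or
# retype one column.
from pyspark.sql import functions as F

df = spark.read.table("my_lakehouse_table")  # hypothetical table name

df = (
    df.drop("obsolete_column")  # drop an existing column
      .withColumn("amount", F.col("amount").cast("decimal(18,2)"))  # change a type
)

(
    df.write
      .mode("overwrite")
      .option("overwriteSchema", "true")  # allow the schema to change on overwrite
      .saveAsTable("my_lakehouse_table")
)
```

The pain point is that the whole table is rewritten even when only the schema metadata needs to change.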
Lakehouse connector - Lakehouse.Contents()
Today, the Lakehouse connector in Power BI Desktop has two options: create a live connection to the default semantic model, or connect to the SQL Analytics Endpoint.
I think there should be a third option in the connector, which uses Lakehouse.Contents().
Or ...
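For reference, Lakehouse.Contents() is already usable when written by hand in the Power Query editor. A hedged sketch of what such a query can look like (the workspace, lakehouse, and table names are placeholders, and the exact navigation-field names may differ from what is shown here):

```
let
    Source = Lakehouse.Contents(),
    Workspace = Source{[workspaceName = "MyWorkspace"]}[Data],
    MyLakehouse = Workspace{[lakehouseName = "MyLakehouse"]}[Data],
    MyTable = MyLakehouse{[Id = "MyTable", ItemKind = "Table"]}[Data]
in
    MyTable
```

The request is simply to surface this path as a first-class option in the connector UI instead of requiring hand-written M.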
Access and write OneLake data on paused capacity
If I have a pipeline, notebook, Dataflow Gen2, Power BI semantic model, etc. in a workspace attached to an active capacity, then these items should also be able to interact with OneLake data that resides in workspaces assigned to paused capacities.
After...
Lakehouse.Contents() documentation + connector
Hi,
I wish there were some documentation on how to connect to a Lakehouse from Power BI Desktop in Import mode.
I want to connect directly to the Lakehouse (not the SQL Analytics Endpoint) in Import mode.
More complete lineage view in Fabric / Purview
A complete lineage view, including for example:
- across workspaces
- all Fabric items
- data which gets "pushed" (written) from an item to a destination
- data connections which are written in code (parameterized)
- etc.
Alter table drop column
Support for ALTER TABLE DROP COLUMN in Fabric Data Warehouse.
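For reference, this is the standard statement as it exists in other SQL engines today (table and column names are illustrative):

```sql
ALTER TABLE dbo.Sales DROP COLUMN LegacyDiscount;
```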
Managed maintenance of Lakehouse tables
I want Fabric to handle the maintenance of my Lakehouse tables for me.
I don't want to have to create and orchestrate Notebooks or Spark Job Definitions in order to maintain my tables.
For example, I want Fabric to manage the optimization and vacuuming of my t...
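Today this kind of maintenance has to be scripted and scheduled manually, e.g. via Delta Lake commands in a notebook (table name and retention period are illustrative):

```sql
-- Compact small files into larger ones
OPTIMIZE my_lakehouse_table;

-- Remove old, unreferenced files; 168 hours = 7-day retention
VACUUM my_lakehouse_table RETAIN 168 HOURS;
```

The ask is for Fabric to run this kind of routine on a managed schedule, without user-built orchestration.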
Lakehouse Table Optimization Control Panel
We would like a user interface (control panel) where we can schedule Lakehouse table maintenance operations like OPTIMIZE and VACUUM, including the option to set the retention period for VACUUM.
We would like to set table maintenance settings and schedule at the Capa...
Managed maintenance of Lakehouse tables
We would like Fabric to manage Lakehouse table optimization and maintenance for us.
E.g. OPTIMIZE and VACUUM.
This way, we won't need to set up and orchestrate complex Notebooks or Spark Job Definitions for maintaining and optimizing our tables.