OneLake file explorer - Support for multiple accounts without crashing
Hi,
I think it is important that OneLake file explorer supports multiple accounts without crashing.
In my experience, when I start my machine, access the file explorer with a particular account, then log out and try to access it with another account, the tool fal...
Reporting the column that causes a "String or binary data would be truncated" error
Hi,
recently, while working with a Dataflow Gen2, I got the following error:
"Mashup Exception Data Source Error Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Microsoft SQL: String or bi...
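Until the error message itself reports the column, one possible workaround is to scan the source rows against the destination column length limits to find the value that would be truncated. A minimal sketch in plain Python, assuming you can export the rows and know the destination `varchar` lengths (the column names and limits below are hypothetical examples, not anything Fabric exposes):

```python
# Hypothetical workaround sketch (not a Fabric API): given the destination
# column length limits, scan the source rows to find which column holds a
# value longer than its limit.

def find_truncating_columns(rows, limits):
    """Return {column: longest offending value} for string values exceeding limits."""
    offenders = {}
    for row in rows:
        for col, value in row.items():
            max_len = limits.get(col)
            if max_len is not None and isinstance(value, str) and len(value) > max_len:
                # keep the longest offending value seen so far for this column
                if col not in offenders or len(value) > len(offenders[col]):
                    offenders[col] = value
    return offenders

# Example data (hypothetical): "city" is varchar(10) at the destination.
rows = [
    {"city": "Rome", "note": "ok"},
    {"city": "Reggio nell'Emilia", "note": "fits within the limit"},
]
limits = {"city": 10, "note": 50}
print(find_truncating_columns(rows, limits))  # → {'city': "Reggio nell'Emilia"}
```

This only narrows down the culprit on the source side; the idea itself is that the Dataflow Gen2 error should name the column directly.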
Using a lakehouse view as a source for a copy activity
Hi,
currently it isn't possible to use a lakehouse view as a source for a copy activity.
I think such a feature could be very important. Creating a view on a lakehouse would allow some light data transformations before copying from a source lakehouse to a destination lak...
Better management of the New table and Existing table options for a Data destination
Hi, I've run some tests to understand the behaviour of the New table option: if it is specified, the Dataflow Gen2 checks whether the table exists and, if necessary, creates the missing table. This behaviour is good, but a little hidden and not immediately clear; it is...
Complete and functional documentation describing all the Fabric security levels from the data point of view
Hi,
I'd like to have good, concise, and functional documentation describing all the Fabric security levels from the data point of view, from the highest level to the lowest one.
A possible description could present these levels in the following hierarchical order:
Upsert operation for writing into a destination Lakehouse
Hi,
I think it could be very useful to have an upsert operation when a Lakehouse is selected as the destination in an eventstream.
Then, with respect to an event identifier, it would be possible to update existing events and insert new events in the lakehouse.
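To make the requested semantics concrete, here is a minimal sketch of an upsert keyed on an event identifier, in plain Python (Fabric would perform this inside the Lakehouse destination; the `event_id` key and the row shapes are hypothetical examples):

```python
# Sketch of upsert semantics: rows matching on the key are updated,
# the others are inserted. "event_id" is a hypothetical merge key.

def upsert(existing, incoming, key="event_id"):
    """Merge incoming rows into existing rows by key; returns a new list."""
    by_key = {row[key]: dict(row) for row in existing}
    for row in incoming:
        by_key[row[key]] = dict(row)  # update if present, insert otherwise
    return list(by_key.values())

table = [{"event_id": 1, "status": "open"}, {"event_id": 2, "status": "open"}]
events = [{"event_id": 2, "status": "closed"}, {"event_id": 3, "status": "open"}]
print(upsert(table, events))
# → [{'event_id': 1, 'status': 'open'},
#    {'event_id': 2, 'status': 'closed'},
#    {'event_id': 3, 'status': 'open'}]
```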
Thanks
Having a warehouse as an eventstream destination
Hi,
I think it could be useful to be able to write to a warehouse as well, e.g. as a secondary repository to historicize or accumulate source data.
Thanks
Improving the error message when a data type conversion issue occurs while writing to a lakehouse
Hi,
for an eventstream that reads from Azure Event Hubs and writes the events to a lakehouse, I encountered a data type conversion error, and looking at the Runtime logs I saw the following generic error message:
{
"ErrorCode": {
"value": "Warning",
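To illustrate the kind of detail a more useful message could carry, here is a small sketch in plain Python of a conversion step that reports the failing column, the expected type, and the offending value (the event shape and target schema are hypothetical examples, not the eventstream's actual internals):

```python
# Sketch: convert each event field to its target type and raise a
# descriptive error naming the column, expected type, and bad value.

def convert_event(event, schema):
    """Cast each field to its target type, raising a descriptive error on failure."""
    converted = {}
    for col, target_type in schema.items():
        value = event.get(col)
        try:
            converted[col] = target_type(value)
        except (TypeError, ValueError):
            raise ValueError(
                f"Column '{col}': cannot convert value {value!r} "
                f"to {target_type.__name__}"
            )
    return converted

schema = {"temperature": float, "device_id": int}
try:
    convert_event({"temperature": "21.5", "device_id": "abc"}, schema)
except ValueError as exc:
    print(exc)  # → Column 'device_id': cannot convert value 'abc' to int
```

An error of this shape in the Runtime logs would make the root cause immediately visible, instead of a generic warning.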
Showing the columns of a KQL materialized view in the KQL databases explorer
Hi,
I think it could be very useful to see the component columns of a KQL materialized view, once it is created, in the KQL databases explorer.
I can see the columns of a table, but not those of a materialized view.
It would also be very useful to see the related data types. Thanks
Managing or reducing the data retention for an eventstream
Hi, for testing purposes, using Azure Event Hubs as a source, I often need to change something about the composition of the input events: adding or removing a column, changing a column type, modifying a column value, and so on.
For each source change it is necessary to fetch the source events again in ...