Mounika Reddy on 09 Sep 2019 19:48:20
Enable an option to find the size/storage of a dataflow entity in the Power BI service and the Premium Capacity Metrics app.
- Comments (11)
RE: View Dataflow data sizes
We need this option.
RE: View Dataflow data sizes
Definitely needed for an admin to keep track of data size, which would allow them to: decide whether a dataflow needs cleanup, migration, or splitting; and see which tables are heavier than others.
RE: View Dataflow data sizes
Much needed option.
RE: View Dataflow data sizes
Upvote for this idea. Need Dataflow size for both troubleshooting and maintenance purposes.
RE: View Dataflow data sizes
This is a critical feature for large organizations. I hope they will be able to give this information to capacity admins soon.
RE: View Dataflow data sizes
It would be nice if Microsoft provided this option in the settings or through a REST API.
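For context, a minimal sketch of what is available today via the existing "Get Dataflows" REST endpoint, which lists dataflows in a workspace but returns no storage size at all (which is exactly the gap this idea describes). The token acquisition and the placeholder workspace ID are assumptions, not part of the API response:

```python
# Sketch: list dataflows in a workspace with the current Power BI REST API.
# The response includes objectId, name, description, and modelUrl -- no size field.
import requests

ACCESS_TOKEN = "<AAD access token with Dataflow.Read.All scope>"  # assumption: acquired via MSAL or similar
WORKSPACE_ID = "<workspace (group) id>"                           # assumption: your workspace GUID

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/dataflows",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

for dataflow in resp.json().get("value", []):
    # Nothing here tells you how much storage the dataflow consumes.
    print(dataflow["objectId"], dataflow["name"])
```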
RE: View Dataflow data sizes
For Power BI Pro there is a 10 GB usage limit, but you are unable to find out your full usage because dataflow size is not shown. This makes planning impossible: once you hit 10 GB you can't get more data into your account and things just stop working, with no way to plan or check. This should be fixed, or the 10 GB limit on dataflows should not be enforced until Microsoft can show you the size of your dataflows.
RE: View Dataflow data sizes
This is needed because we have many dataflows and are trying to see how much storage each one uses so we can plan accordingly.
RE: View Dataflow data sizes
We need to check how much storage dataflows consume in the CDM Azure Data Lake, especially if you only use the Power BI Pro licensing model and must not exceed 10 GB.
If we cannot check the size in one place in Power BI, we cannot know how large our datasets, dataflows, and other objects are together and how much they consume.
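One partial workaround, sketched below under assumptions: if the workspace is attached to your own ADLS Gen2 account (bring-your-own-storage), you can sum the file sizes under the dataflow's CDM folder yourself. The filesystem name "powerbi" and the folder path are assumptions that depend on your storage configuration, and this does not help with the default Microsoft-managed storage:

```python
# Sketch: sum the size of a dataflow's CDM folder in a customer-owned ADLS Gen2 account.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://<storageaccount>.dfs.core.windows.net"  # assumption: your ADLS Gen2 endpoint
FILESYSTEM = "powerbi"                                         # assumption: filesystem used by Power BI
FOLDER = "<workspace name>/<dataflow name>"                    # assumption: CDM folder path for the dataflow

service = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
fs = service.get_file_system_client(FILESYSTEM)

# Walk the folder recursively and add up file sizes, skipping directories.
total_bytes = sum(
    (p.content_length or 0)
    for p in fs.get_paths(path=FOLDER, recursive=True)
    if not p.is_directory
)
print(f"{FOLDER}: {total_bytes / (1024 ** 3):.2f} GB")
```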
RE: View Dataflow data sizes
We should be able to see at a glance, for each entity, the size and the number of rows.