George Cristian Prifti on 12 Jan 2021 10:15:19
Now that we have this very nice new feature, the only downside is not having a way to trigger multiple dataflows sequentially, on the condition that each one succeeds.
Right now the action initiates the refresh, but it does not wait for the refresh to finish or check its completion status. There are cases where completion is a prerequisite for moving on to the next dataflow or dataset refresh action.
Let's say there are 6 sequential dataflows and a dataset at the end; creating and maintaining 5 different flows would not be the best solution.
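In the meantime, here is a minimal sketch of how this chained behaviour could be approximated outside Power Automate by calling the Power BI REST API directly. It assumes you already have an Azure AD access token with the required Power BI scopes; the endpoints follow the documented "Refresh Dataflow", "Get Dataflow Transactions" and "Refresh Dataset" operations, while the exact status strings and transaction ordering are assumptions, so treat it as a starting point rather than a definitive implementation.

```python
import time
import requests

API = "https://api.powerbi.com/v1.0/myorg"


def refresh_dataflow_and_wait(token, group_id, dataflow_id,
                              poll_seconds=60, timeout_seconds=3600):
    """Trigger a dataflow refresh and block until it reaches a terminal status."""
    headers = {"Authorization": f"Bearer {token}"}

    # Kick off the refresh (notifyOption is expected in the request body).
    resp = requests.post(
        f"{API}/groups/{group_id}/dataflows/{dataflow_id}/refreshes",
        headers=headers,
        json={"notifyOption": "NoNotification"},
    )
    resp.raise_for_status()

    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        time.sleep(poll_seconds)
        tx_resp = requests.get(
            f"{API}/groups/{group_id}/dataflows/{dataflow_id}/transactions",
            headers=headers,
        )
        tx_resp.raise_for_status()
        transactions = tx_resp.json().get("value", [])
        if not transactions:
            continue
        # Assumption: the transaction with the latest startTime is the one we triggered.
        latest = max(transactions, key=lambda t: t.get("startTime", ""))
        status = latest.get("status", "")
        # Assumption: anything other than "InProgress" (e.g. Success / Failed) is terminal.
        if status and status != "InProgress":
            return status
    raise TimeoutError(f"Dataflow {dataflow_id} did not finish within the timeout")


def run_chain(token, group_id, dataflow_ids, dataset_id):
    """Refresh dataflows one after another; stop on failure; refresh the dataset last."""
    for dataflow_id in dataflow_ids:
        status = refresh_dataflow_and_wait(token, group_id, dataflow_id)
        if status != "Success":
            raise RuntimeError(f"Dataflow {dataflow_id} finished with status {status}")
    # All upstream dataflows succeeded -> refresh the dependent dataset.
    requests.post(
        f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes",
        headers={"Authorization": f"Bearer {token}"},
    ).raise_for_status()
```

One script (or one flow calling these endpoints) replaces the 5 separate flows described above, since the wait-and-check logic lives in the polling loop.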
- Comments (3)
RE: Power Automate Dataflows Refresh Action
https://powerbi.microsoft.com/en-us/blog/announcing-dataflows-power-automate-connector-public-preview/
"Both a Power BI Dataset and Dataflows can get data prepped by another dataflow, leveraging that dataflow as a data source. One reason for refreshing multiple dataflow sequentially is explained in our documentation about separating complex dataflows into multiple dataflows. When the Dataflow that acts like a data source completes its run successfully, a downstream, or dependent Dataset and/or Dataflow should be refreshed ASAP to reduce time from when Data is available to when it can be used or further processed. When a dataflow fails, the dependent dataset or dataflow does not need to be refreshed, to conserve resources and also be ready to be refreshed as soon as the upstream dataflow is fixed and its refresh is successful."
This post is highly misleading. There should be an option for the "Refresh a dataflow" action to wait until the dataflow refresh has either succeeded or failed.
RE: Power Automate Dataflows Refresh Action
Agree, we need to be able to logically control refreshes based on success or failure to better optimise our refresh schedules.
RE: Power Automate Dataflows Refresh Action
I have two Dataflows which are processed in parallel. Both feed into one dataset. I would like to trigger the dataset refresh after the second Dataflow has completed. At the moment I can only solve this by running the two Dataflows and the dataset refresh sequentially, but I lose time with that approach. Hence a conditional trigger which checks for completion of a set of dataflows would be needed.
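For the parallel case, a small variation of the earlier sketch works: start both dataflow refreshes at the same time, wait for each to reach a terminal status, and only then refresh the dataset. This reuses the hypothetical refresh_dataflow_and_wait helper from the example in the original post, so the same assumptions about endpoints and status values apply.

```python
from concurrent.futures import ThreadPoolExecutor

import requests

API = "https://api.powerbi.com/v1.0/myorg"


def refresh_parallel_then_dataset(token, group_id, dataflow_ids, dataset_id):
    """Refresh several dataflows concurrently; refresh the dataset only if all succeed."""
    # Start all dataflow refreshes at once and wait for every one to finish.
    # refresh_dataflow_and_wait is the helper sketched in the original post above.
    with ThreadPoolExecutor(max_workers=len(dataflow_ids)) as pool:
        statuses = list(pool.map(
            lambda df_id: refresh_dataflow_and_wait(token, group_id, df_id),
            dataflow_ids,
        ))

    if all(status == "Success" for status in statuses):
        # All upstream dataflows completed successfully -> refresh the dependent dataset.
        requests.post(
            f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes",
            headers={"Authorization": f"Bearer {token}"},
        ).raise_for_status()
    else:
        raise RuntimeError(f"Not all dataflows succeeded: {statuses}")
```

This avoids serializing the two dataflows just to get a reliable trigger point for the dataset refresh.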