
When Microsoft introduced Power BI dataflows at the end of 2018 to separate no/low-code cloud ETL from BI modeling and visualization, many people were initially confused about how refreshes would work. Contrary to what might seem like the intuitive behavior, dataflow and dataset refreshes are separate and unconnected. It does make sense to tie them together, though, since you'll most likely want to refresh a dataset right after its source dataflows have themselves been updated.
I wrote a high-level outline of how to do so with the Power BI APIs and Power Automate (then Microsoft Flow) in the Power BI forums a while ago. A few months later, Microsoft added a Power Automate action to trigger a dataset refresh, but the dataflow equivalent remains to be seen. You therefore still need to register a Power BI app and create a Power Automate custom connector for dataflows, following the procedures explained in these entries:
- Chris Webb: Calling The Power BI Export API From Power Automate, Part 1: Creating A Custom Connector
- Konstantinos Ioannou: Refresh PowerBI dataset with Microsoft Flow
The main benefit of using a custom connector is easier authentication; if the API were anonymous, we could just use the HTTP request Power Automate action. You can follow the steps above almost verbatim for dataflows, whose refresh POST call is documented here. But bear in mind two specific points when building your custom connector:
- You need to add RefreshRequest: "y" to the body of your request definition. Otherwise the test will work in the custom connector settings, but the actual calls from a flow will fail.
- Optionally, you can add NotifyOption: "MailOnCompletion" as well to get an email notification upon refresh failure or success.
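Outside Power Automate, the same call the custom connector makes can be sketched in Python against the documented refreshes endpoint. The group ID, dataflow ID, and AAD token below are placeholders; note the RefreshRequest: "y" body field from the points above:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_dataflow_refresh_request(group_id, dataflow_id, token):
    """Build the POST request for the dataflow refreshes endpoint.

    RefreshRequest: "y" must be in the body; without it, the connector's
    built-in test succeeds but actual calls from a flow fail.
    """
    url = f"{API_ROOT}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"
    body = {
        "RefreshRequest": "y",
        # Optional: get an email when the refresh completes or fails.
        "NotifyOption": "MailOnCompletion",
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Firing the refresh requires a valid Azure AD access token:
# req = build_dataflow_refresh_request("<group-id>", "<dataflow-id>", "<token>")
# urllib.request.urlopen(req)
```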

Note that custom connectors are now found under Data in the Power Automate sidebar, not under Connectors (that would make too much sense), and no longer under the Settings wheel as you'll see in posts from 2018/2019.
Some goodies you can add to your flow to better integrate refreshes into your users' workflow:
- Let your users trigger the whole refresh process from a Power Automate button / link, put a refresh button right in your Power BI reports, or set up a schedule.
- If your dataflows or datasets need to be routed through the data gateway, and said gateway runs on an Azure VM, you can start and stop that VM via Power Automate actions. There's no native Power Automate action to check a VM's current status, though.
- Wrap things up with notifications by email, to Teams, or otherwise; there's a slew of competing Power Automate actions for that. If you want to be thorough and accurate, you can put these notifications in a separate flow triggered by parsing the refresh notification emails set up in the custom connector, add error handling, etc.
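Since there's no native action for the VM's status, one workaround is to call the Azure Resource Manager instance-view endpoint yourself, from an HTTP action or a small script. This is a sketch under assumptions: the subscription, resource group, VM name, and api-version are placeholders, and the `PowerState/...` status code format is from the documented instance-view schema:

```python
ARM_ROOT = "https://management.azure.com"

def build_instance_view_url(subscription_id, resource_group, vm_name,
                            api_version="2023-07-01"):
    """URL for the ARM instance-view call, which reports VM power state."""
    return (
        f"{ARM_ROOT}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Compute/virtualMachines/{vm_name}"
        f"/instanceView?api-version={api_version}"
    )

def extract_power_state(instance_view):
    """Pull the 'PowerState/...' code out of the parsed JSON response.

    Returns e.g. "running" or "deallocated", or "unknown" if absent.
    """
    for status in instance_view.get("statuses", []):
        code = status.get("code", "")
        if code.startswith("PowerState/"):
            return code.split("/", 1)[1]
    return "unknown"
```

In a flow, you could put the equivalent HTTP action before the start/stop step and branch on the extracted state, so you only start the VM when it's actually deallocated.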
From the screenshot opening this entry, you may guess that I hardcoded the group and dataflow IDs in my custom connector calls, having looked them up manually from the dataflows' URLs. That's not a scalable or maintainable process. I assume one could use the Get Dataflows call, dump the results into a spreadsheet, and iterate over the group/dataflow value pairs in the flow, but I haven't gotten around to doing so yet.
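That enumeration idea could be sketched as follows: call Get Dataflows once per workspace, then flatten the responses into (group, dataflow) pairs to loop over. The `objectId` field name comes from the documented Get Dataflows response schema, but treat it (and the helper below) as an assumption, not the post's actual implementation:

```python
def dataflow_refresh_targets(groups):
    """Flatten per-group Get Dataflows responses into (group_id, dataflow_id)
    pairs that a loop in the flow (or a script) could refresh one by one.

    `groups` maps a workspace (group) ID to the parsed JSON body of
    GET /v1.0/myorg/groups/{groupId}/dataflows, whose `value` array
    lists each dataflow with an `objectId` field.
    """
    pairs = []
    for group_id, response in groups.items():
        for dataflow in response.get("value", []):
            pairs.append((group_id, dataflow["objectId"]))
    return pairs
```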
Finally, if you feel adventurous, you can hack your way into retrieving a dataflow's refresh history.