Hey there Power BI fans! Wouldn’t it be nice if you could refresh all of your datasets across all of your workspaces with a single click of a button? This tutorial will show you how to make this happen! With this technique, you will be able to refresh every single one of your datasets by executing a Python script housed in a single Power BI dataset. Before we get started, let’s see the magic in action 😊

The meat of the logic necessary to perform this technique will be contained in our Python script. The script needs to accomplish several tasks. First, we must authenticate through the Power BI REST API to ensure we are who we say we are. Second, we retrieve a list of the workspaces available to us. Finally, we retrieve the list of datasets contained in each workspace and kick off a refresh for each dataset. We’ll come back to this logic when we write the script; for now, I just wanted to explain the general flow.
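To make that flow concrete, here is a minimal sketch of the three steps. The four helper names are hypothetical stand-ins for the real API calls in the full script later in the post:

```python
def refresh_everything(get_token, list_workspaces, list_datasets, start_refresh):
    """High-level flow: authenticate once, then refresh every dataset in
    every workspace. All four callables are placeholders for real API calls."""
    token = get_token()                       # 1. authenticate via the REST API
    for workspace in list_workspaces(token):  # 2. enumerate visible workspaces
        for dataset in list_datasets(token, workspace):
            start_refresh(token, workspace, dataset)  # 3. kick off each refresh
```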

If you’re wondering what the Power BI REST API is or what it can be used for, take a look at the documentation (https://docs.microsoft.com/en-us/rest/api/power-bi/). In short, you can use it to perform a large number of functions programmatically using HTTP requests. For example, if I want to retrieve a list of all of the Power BI datasets in “My Workspace,” there is an API endpoint for that information.
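For instance, the “Get Datasets” endpoint for “My Workspace” lives at https://api.powerbi.com/v1.0/myorg/datasets. A quick sketch of calling it with the requests library (the token is a placeholder you would obtain through authentication, and the sample payload below is made up to mirror the shape the API returns):

```python
import requests

DATASETS_URL = 'https://api.powerbi.com/v1.0/myorg/datasets'

def get_my_datasets(access_token):
    # One GET with a bearer token; the interesting data comes back
    # under the 'value' key of the JSON body.
    headers = {'Authorization': f'Bearer {access_token}'}
    response = requests.get(DATASETS_URL, headers=headers)
    response.raise_for_status()
    return response.json()['value']

# Abridged, made-up example of the JSON body the endpoint returns:
sample_body = {'value': [{'id': 'abc-123', 'name': 'Sales'},
                         {'id': 'def-456', 'name': 'Inventory'}]}
dataset_names = [d['name'] for d in sample_body['value']]
```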

Before doing anything else, we need to register our future tool to use the Power BI API. Navigate to https://dev.powerbi.com/apps and fill out the form as follows. First, sign in to the proper Power BI account. Then give your application a name and select “Native” for the application type. Finally, check the boxes next to “Read all workspaces” and “Read and write all datasets” and click Register.

One more housekeeping item… Even though we have registered our application, we still need to grant it permission to use the API. Navigate to https://portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationsListBlade and select your newly created app. Navigate to the “API permissions” screen on the left-hand side and click the “Grant admin consent” button.

Nice, now let’s write our Python script. I’m not going to walk through each line individually, but feel free to reach out to me via https://bielite.com/#contact if you need a deeper explanation of the logic.


import adal
import requests
import json
import pandas

# Authenticate against Azure AD to get a token for the Power BI REST API.
authority_url = 'https://login.windows.net/common'
resource_url = 'https://analysis.windows.net/powerbi/api'
context = adal.AuthenticationContext(authority=authority_url,
                                     validate_authority=True,
                                     api_version=None)
token = context.acquire_token_with_username_password(resource=resource_url,
                                                     username='YOUR_POWER_BI_EMAIL',
                                                     password='YOUR_POWER_BI_PASSWORD',
                                                     client_id='YOUR_CLIENT_ID')

access_token = token.get('accessToken')

header = {'Authorization': f'Bearer {access_token}'}

# Retrieve every workspace (group) the signed-in account can see.
groups_request_url = 'https://api.powerbi.com/v1.0/myorg/groups'
groups_request = json.loads(requests.get(url=groups_request_url, headers=header).content)
groups = [d['id'] for d in groups_request['value']]

# For each workspace, list its datasets and kick off a refresh of each one.
for group in groups:
    datasets_request_url = f'https://api.powerbi.com/v1.0/myorg/groups/{group}/datasets'
    datasets_request = json.loads(requests.get(url=datasets_request_url, headers=header).content)
    datasets = [d['id'] for d in datasets_request['value']]
    for dataset in datasets:
        refresh_url = f'https://api.powerbi.com/v1.0/myorg/groups/{group}/datasets/{dataset}/refreshes'
        r = requests.post(url=refresh_url, headers=header)

# Power BI requires the script to return a DataFrame, so hand back an empty one.
final = pandas.DataFrame()

The import statements at the top pull in the packages necessary to run our code. Of these, only json ships with Python by default; adal, requests, and pandas must be installed separately, and Power BI’s Python integration also expects the pandas and matplotlib libraries to be installed. Check out my quick video on running Python scripts in Power BI for quick instructions (https://www.youtube.com/watch?v=EmALxO7V7y0). The authentication block at the top of the script produces an “access_token” to be used by our subsequent API calls. Note that you must insert your client id, Power BI email, and Power BI password in place of the placeholder strings to make this work for you. The groups request then returns the list of available workspaces. Next, the nested loop walks through each workspace, retrieves the available datasets, and kicks off a refresh of each individual dataset. Finally, the last line creates an empty pandas DataFrame. This only needs to be done because Power BI needs a DataFrame to be returned for the query to run.
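One small improvement worth considering: the script throws away the response from each refresh POST. The refresh endpoint answers 202 Accepted when a refresh is successfully queued, so a couple of tiny helpers (my naming, not part of the original script) make failures visible:

```python
def build_refresh_url(group_id, dataset_id):
    # Same endpoint the script posts to, built with an f-string.
    return (f'https://api.powerbi.com/v1.0/myorg/groups/{group_id}'
            f'/datasets/{dataset_id}/refreshes')

def describe_refresh(status_code):
    # 202 means Power BI queued the refresh; anything else is worth logging.
    # (A 429, for example, can show up if you send requests too quickly.)
    return 'queued' if status_code == 202 else f'failed with HTTP {status_code}'
```

Inside the inner loop you could then write `print(dataset, describe_refresh(r.status_code))` to see which refreshes actually queued.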

At this point, we actually have a fully working solution if you want to run this code in a Python interpreter. In the next couple of steps, we are going to house this Python script in a Power BI file so that we can run this code directly from Power BI Service. Open up Power BI Desktop, create a new query using a Python script, and insert our code from above.

Click OK and let the script run. When the dialog box opens, select “final” to load the empty DataFrame from the end of our script to satisfy Power BI’s need for data. By the time it completes, the script will have actually refreshed all of your datasets for you! Simply close and apply your changes, save your file, and publish it up to Power BI Service.

The only task remaining to ensure that we can kick off this Python script on PowerBI.com is to install a Power BI Data Gateway in Personal Mode. This is extremely important as Python scripts can only be run on Power BI Service through the use of a personal gateway at the moment. To download and configure the gateway, follow this documentation: https://docs.microsoft.com/en-us/power-bi/service-gateway-personal-mode.

With that taken care of, all I have to do is kick off a refresh of my dataset containing the Python script! You might get a one-time error asking you to provide credentials for the Python script data source. Simply clicking the sign-in button alleviates the issue. And now for the final reveal… I will click the refresh button one more time and watch the magic unfold!

All of our datasets have been refreshed! Navigating to another workspace, I can see that these datasets were refreshed too!

Now you have a single Power BI dataset that can kick off the refresh of every other dataset that you have access to across all of your Power BI workspaces, even datasets and workspaces that have yet to be created. If you enjoyed this trick, make sure you subscribe to the BI Elite Blog and BI Elite YouTube channel. If you are in need of professional Power BI, DAX, and Power Query training, make sure to check out our training page for premium courses well below market value 😊

Parker Stevens

Parker is the creator of the BI Elite YouTube channel, a community of over 30,000 students learning Power BI, DAX, and Power Query. He is a Microsoft Data Platform MVP for his work with Power BI and continues to provide elite Power Platform training courses to help data analysts, BI developers, and citizen dashboard creators reach their goals. Parker is committed to producing high-quality training content that is also extremely cost-effective, so that as many users as possible can benefit from it. Training courses are located at https://training.bielite.com/
