If you are a Power BI admin, you have probably had one (or several) occasions when you needed to know what is happening in your Power BI tenant. You have probably heard about the Power BI REST APIs, and particularly the Get Activity Events API. You might have experimented with calling the API in the Microsoft documentation pages with the Try It feature. It is a very handy way to see what information a particular API returns and which components are needed for the call. Unfortunately, the Try It feature is broken at the moment, but the Power BI team is working on a fix.
Why use Power Automate?
The API returns data only from the past 30 days. That is quite a short period, so the goal is to automate a process that stores the data somewhere. For this you have different options, some of which are described in this JustB smart’s blog post. The problem is that many of these options require PowerShell and/or more advanced Azure components, which might be too much for many citizen-dev admins. Although this flow requires some expressions (aka code), it is still an easy approach for non-programmers.
The plan is to create a Power Automate flow that runs every morning, gets the events from the previous day, and stores them in Azure Data Lake Storage Gen2 in JSON format.
The flow looks like this:
A service principal is used for authentication in this example. If you are already familiar with this, please continue reading; if not, read my earlier post before continuing.
After authentication and retrieving the access token, we can get to the actual topic: reading the Activity Events API.
The Get Activity Events API differs a bit from the other Power BI REST APIs because of the number of rows it might return. Because of this, some filtering and Do until loops are required; all of this is well documented here and here.
The process is split into two parts.
- The first part is the initial GET request, in which we specify the time range. The range needs to be within the same day! That initial request will not return all rows unless the time range is less than an hour and there are fewer than 5000 to 10000 entities.
- In all other cases, a continuation token is returned to be used in the next call to get the next batch of entities.
That’s basically a summary of what the linked documents above tell you.
So how should that be done in Power Automate?
First GetActivityEvents request
For that, we compose a value for yesterday’s date in yyyy-MM-dd format.
Next, the start and end times for the first call are determined. We start at midnight and end one second before the next midnight. The varEndDate value will be concat(variables('varActivityDate'), 'T23:59:59.000Z').
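Outside of Power Automate, the same date logic can be sketched in Python. This is just an illustration of what the flow's expressions compute; the variable names mirror the flow variables:

```python
from datetime import datetime, timedelta, timezone

# Yesterday's date in yyyy-MM-dd format, like the flow's
# formatDateTime(addDays(utcNow(), -1), 'yyyy-MM-dd') expression
var_activity_date = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%d")

# Start at midnight and end one second before the next midnight,
# keeping both timestamps within the same day as the API requires
var_start_date = var_activity_date + "T00:00:00.000Z"
var_end_date = var_activity_date + "T23:59:59.000Z"
```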
The first call needs to look something like this: https://api.powerbi.com/v1.0/myorg/admin/activityevents?startDateTime='2019-08-31T00:00:00'&endDateTime='2019-08-31T23:59:59' (so the trailing .000Z I have in the example above is not strictly needed).
This first call returns the first set of activity events and a continuation token that is then used in the next call. The format for that is https://api.powerbi.com/v1.0/myorg/admin/activityevents?continuationToken='continuationtokenhere'.
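The two URL shapes above can be sketched in Python. This is a minimal illustration of how the query strings differ between the first call and the follow-up calls, not a complete client:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def build_activity_events_url(start=None, end=None, continuation_token=None):
    """Build the Get Activity Events URL: the first call carries the time
    range, follow-up calls carry only the continuation token."""
    if continuation_token is not None:
        query = {"continuationToken": f"'{continuation_token}'"}
    else:
        query = {"startDateTime": f"'{start}'", "endDateTime": f"'{end}'"}
    return BASE_URL + "?" + urlencode(query, safe="'")

# First call with the time range:
first = build_activity_events_url(start="2019-08-31T00:00:00", end="2019-08-31T23:59:59")
# Follow-up call with the token from the previous response:
follow_up = build_activity_events_url(continuation_token="continuationtokenhere")
```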
Following requests with the continuation token
First, we need to parse the JSON from the first call and extract the continuation token into a variable. Here we use a formula to check whether there is a continuation token; if the token is empty, we set the variable to null. This is used later in the Do until loop.
Note that activityEventEntities might be a null array even though a continuation token is still returned. This happens because the call is automatically split into per-hour queries: if there are no events during the first hour, the set is empty. Knowing this, there should be a minimum of 23 loops for every day.
The loop looks like this: do until varContinuationToken is empty, get the events and parse the JSON, append them to varActivityEvents, and update the continuation token variable if a token is still returned. Make sure to remove all “required” parts in the Parse JSON schema. The activity events are very varied, and there might be hours without a certain type of activity, which could make the flow fail if some content is required.
Set the variable to null if no token is returned.
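The Do until logic above can be sketched in Python. Here fetch_page is a hypothetical helper standing in for the flow's HTTP request and Parse JSON actions; the field names match what the API response contains:

```python
def collect_activity_events(fetch_page, start, end):
    """Mirror the flow's Do until loop: keep calling with the returned
    continuation token until the API stops returning one."""
    events = []
    page = fetch_page(start=start, end=end)       # first call with the time range
    events.extend(page.get("activityEventEntities") or [])
    token = page.get("continuationToken")         # null when there is no more data
    while token:                                  # "Do until token is empty"
        page = fetch_page(continuation_token=token)
        # entities can be empty even when a token is returned, because the
        # API splits the range into per-hour queries
        events.extend(page.get("activityEventEntities") or [])
        token = page.get("continuationToken")
    return events
```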
Save JSON file to Blob Storage
The last step is to save the varActivityEvents content to Blob storage. The content is formatted so that the last comma is removed and a “]” is added, making the file valid JSON.
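That string fix-up corresponds to something like the following sketch, assuming the variable was initialized with "[" and each entity was appended with a trailing comma:

```python
import json

def finalize_json(var_activity_events: str) -> str:
    """Strip the trailing comma left by the last append and close the
    array so the content parses as valid JSON."""
    return var_activity_events.rstrip(",") + "]"

# Example: the variable content after the loop (hypothetical events)
content = '[{"Activity": "ViewReport"},{"Activity": "ExportReport"},'
fixed = finalize_json(content)  # now a valid JSON array of two events
```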
The end result should be a separate JSON file for each day in the blob container folder. The blob name expression here is concat('activityevents_', variables('varActivityDate'), '.json').
Don’t forget error handling!
The final step in my flow posts a message to a Teams channel if, for some reason, the flow fails. You should always include at least light error handling in your flows. You can read more about that in Timo Pertilä’s blog post on error handling. In this case, if we were unable to run the flow for a few days, it is always possible to manually set varActivityDate to an older date (within the 30 days available). Often this type of light error handling is enough.
With the activity event history available, you can create Power BI reports to follow user and usage trends over periods longer than just 30 days, or Power Automate flows that react if unwanted actions happen in your environment. Also note that the data this API provides is already quite extensive, and new content is constantly added. Because of this, loading the data into Power BI might not work without some modifications to the Power Queries.
And remember, if you need help, we’re here too.
Interested in reading our latest Power Platform blog posts?
You’re welcome to subscribe to our email newsletter: Forward Forever Monthly. Max 1 email per month, with a curated list of latest insights and articles on Power Apps, Power Automate, Power BI.