Your Power BI reports and dashboards likely rely on an underlying data source being up to date and following some business rules. For instance, it may be a data warehouse schema that is updated once a day at 3:00 am, collecting data from all 314 branches of your bank.
Your users want to be sure that the information they see is correct and up to date, so you want to be able to respond to any data issue as quickly as possible. You want to be sure that information from all 314 branches has been received and processed by 3:00 am, and if there’s an issue, you need to know about it ASAP. There are many monitoring systems that can inform you of issues with your database processes, and an easy option is Power BI itself, using its data alerts feature.
Power BI’s data alerts work on dashboard tiles and let you get notified when a KPI value crosses a threshold that you define. Think about the example above. You want to know two things: that there has been a refresh in the last 24 hours, and that the refresh includes information from all 314 branches.
My suggestion is to use a small dataset which includes one database view. This view has all the logic inside. For example, one of the fields is calculated (in the view) to compare the current timestamp with the last timestamp of a successful ETL run, and another one counts branches. Here’s a simplified example in T-SQL:
create view vDataChecks as
select
    -- hours since the most recent successful ETL run
    datediff(hour, max(RunTime), getdate()) as [Hours from last RUN],
    -- branches that reported a successful run within the last 24 hours
    sum(case when RunTime > dateadd(day, -1, getdate()) then 1 else 0 end) as [Number Of branches]
from ETLLog;
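For reference, here’s a minimal sketch of the kind of ETLLog table the view above assumes: one row per branch per successful run. The column names are illustrative, so adapt them to whatever your own ETL process actually logs:

create table ETLLog (
    BranchId int not null,      -- hypothetical branch identifier
    RunTime datetime not null   -- timestamp of a successful load for that branch
);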
Again, this is simplified SQL, which you can replace with your own logic based on your own logs. Once you have such a view in place, create a Power BI model that reads from this view using the DirectQuery option. It’s important to use DirectQuery and not import the data into Power BI, for two reasons. First, the data stays up to date, which is crucial when you want to monitor how much time has passed since your last successful run. Second, date comparisons are better done in your database, so that the two timestamps being compared use the same time zone, rather than comparing a timestamp from the database with a timestamp from the Power BI service.
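Before building the Power BI model, you can run a quick sanity check against the view from any SQL client. Note that both getdate() and RunTime are evaluated on the database server, so the comparison never involves the Power BI service’s clock:

select [Hours from last RUN], [Number Of branches]
from vDataChecks;   -- should return a single row with the two KPI values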
Then, create a small report that shows your KPIs (in this example, “Hours from last RUN” and “Number Of branches”). Use a KPI or a Card visual, one for each KPI, and pin these visuals to a dashboard. You’ll then be able to define alerts on these dashboard tiles. Here’s such a tile:
The value in the tile will change whenever your data changes, and in this case, since the tile shows the difference between your logs and the current time, your logs don’t even need to change: the passage of time alone will cause the KPI to change. Defining an alert is simple. When you hover over the tile, you’ll see a three-dot menu at the top right corner. Click on it and a small menu will appear. In the menu, click on the bell icon:
Then a simple form will open, where you can define the threshold. Be sure to tick the “send me email too” checkbox if you want to be notified by email; otherwise, you’ll only see the alert inside Power BI.
This is it. You can repeat this and add as many alerts as you would like.
Hopefully that helps, let me know if you have any questions in the comments below.