It’s taken a week longer than I had hoped, but I’m finally ready for my next installment of my “blog more” effort. As many of you are aware, I’m not a data person. However, there’s no ignoring the fact that applications are either a conduit for or generator of data. So it’s inevitable that I have to step out of my comfort zone from time to time. What’s important is to grab these opportunities and learn something from them.
The most recent example of this was when I had a partner that was faced with the deprecation of the Azure SQL DB Web/Business pricing tiers. Their solution is cost sensitive and when they looked at the Azure portal’s recommendations for the new tiers for their many databases, they had a bit of sticker shock.
With some help from folks like Guy Haycock of the Azure SQL team and a good starting blog post on SQL DB usage monitoring, the partner was able to determine that the high tier recommendations were due to their daily database backup processes. Since the new SQL tiers offer point-in-time restore, my partner didn’t need to do their own backups, and as a result their actual needs were much lower than what the portal suggested, saving them a lot of money.
To identify what they really needed, all the partner did was pull the SQL diagnostic view data into Excel and visualize it there. But I thought there had to be an even easier way to not just render it once, but to produce an ongoing dashboard. To that end, I started looking at Power BI, which is what this post is about.
The Diagnostic Management Views
The first step is to understand the management views and how they can help us. The blog post I mentioned earlier refers to two separate management views: the “classic” sys.resource_stats view, which retains up to 14 days of data in 15-minute averages, and the newer sys.dm_db_resource_stats view, which holds one hour of data in 15-second averages.
For the “classic” view, there are two schemas: the old web/business version and the new schema. The new schema is very similar to the sys.dm_db_resource_stats schema, which is what I’ll focus on for the examples in this blog post. The key difference is that sys.resource_stats is located in the Azure SQL DB “master” database, while sys.dm_db_resource_stats can be found in each individual database.
There are four values available to us: CPU, data IO, log writes, and memory. Each is expressed as a percentage of what’s available at the database’s current tier. This allows us to query the data directly to get details:
SELECT * FROM sys.dm_db_resource_stats
Or, if we want to get a bit more complex, we can bundle values together per minute and chart out the min, max, and average per minute:
SELECT CONVERT(datetime, CONVERT(char, end_time, 100)) AS Clock,
       MAX(avg_cpu_percent) AS MaxCPU,
       MIN(avg_cpu_percent) AS MinCPU,
       AVG(avg_cpu_percent) AS AvgCPU,
       MAX(avg_log_write_percent) AS MaxLogWrite,
       MIN(avg_log_write_percent) AS MinLogWrite,
       AVG(avg_log_write_percent) AS AvgLogWrite,
       MAX(avg_data_io_percent) AS MaxIOPercent,
       MIN(avg_data_io_percent) AS MinIOPercent,
       AVG(avg_data_io_percent) AS AvgIOPercent
FROM sys.dm_db_resource_stats
GROUP BY CONVERT(datetime, CONVERT(char, end_time, 100))
Using the new dm_db_resource_stats view, my experiments consistently got back 256 rows (64 minutes of 15-second samples), which is a nice small result set. This gives you a good “point in time” measure of a database’s resource usage. If you use the classic resource_stats view (which you query at the master database level), you’ll get back a few more rows and will likely want to filter the result set on the appropriate database_name.
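As a sketch of that master-level query (the database name 'MyAppDb' is a placeholder — substitute your own):

```sql
-- Run against the logical server's "master" database.
-- 'MyAppDb' is a hypothetical name; replace it with your database.
SELECT start_time,
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.resource_stats
WHERE database_name = 'MyAppDb'
ORDER BY end_time DESC;
```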
Since the values represent a percentage of the resources available at the current tier, we can determine how close to the tier’s cap our database is operating, and thus when we may want to make adjustments.
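For example, one quick way to gauge headroom is to pull the peak of each percentage over the roughly one hour of data the view retains — if the peaks sit well below 100, the database may fit a lower tier. This is a rough sketch, not a substitute for watching the data over time:

```sql
-- Peak utilization over the view's retained window (~1 hour).
-- Values near 100 suggest the database is pressing against its tier cap.
SELECT MAX(avg_cpu_percent)       AS PeakCPU,
       MAX(avg_data_io_percent)   AS PeakDataIO,
       MAX(avg_log_write_percent) AS PeakLogWrite
FROM sys.dm_db_resource_stats;
```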
Power BI Visualization
With a result set figured out, we’re ready to start wiring up the Power BI visualizations. For simplicity, I’m going to use the online version of Power BI — it’s an online database, so why not an online reporting tool? Power BI online is available for free, so just head over and sign up. And since this is a sort of “getting started with Power BI” article, I’m not going to dive into everything about the portal. If you’re new to it, you may need to poke around a bit or watch a couple of the tutorials to get started.
After logging into Power BI, start by getting connected to your database.
Note: You’ll want to make sure that your SQL Database’s firewall is allowing connections from Azure services.
On the next page, select Azure SQL Database and click “Connect”. Then fill in the server name (complete with the .database.windows.net part) and the database name. But before you click “Next”, select “Enable Advanced Options”.
This is important because Power BI online does not see the system/diagnostic views. So, we’re going to use the queries we were working with above to get our data. Just paste the query into the “Custom Filters” box.
Another important item to point out before we continue is that since we’re using an Azure SQL DB, we can’t have a refresh interval of less than 15 minutes. If you need more frequent updates, you may need to use Power BI Desktop.
With this form completed, click on “Next” and provide the username and password for your database and click on “Sign in”. Since our query is fairly small, the import of the dataset should only take a few seconds and then we’ll arrive at a dashboard with our SQL database pinned to it. Click on the database tile, and we get a new, blank report that we can start to populate.
Along the right side of our report, select the “Line Chart” visualization and then check AvgCPU, MaxCPU, and MinCPU in the field list. Then drag the Clock field to the Axis field. This setup should look something like this:
If done properly, we should have a visualization that looks something like…
From there you can change/resize the visualization, change various properties of it (colors, labels, title, background, etc.) by clicking on the paint brush icon below visualizations. You can even add other visualizations to this same report by clicking away in the whitespace and adding them like so...
It’s entirely up to you what values you want to display and how.
So there you have it
There’s not really much to it. The hardest part was figuring out how to connect Power BI online to the management view. I like that the Power BI desktop tool calls it a “query” and not a “filter” — less confusion there.
If I had more time, it would be neat to look at creating a content sample pack (creation of content packs is not self-service for the moment) for Azure SQL DB. But, I have a couple other irons in the fire, so perhaps I can circle back to that topic on another day. So in the interim, perhaps have a look at the SQL Sentry content pack (Note: This requires SQL Sentry’s Performance Advisor product which has a free trial).
Until next time!