
Monitor Query Performance

How do you get information about how queries are behaving in SQL Server? In this post, we have a look at how to do just that.



Sooner or later, when you're working with SQL Server, someone is going to complain that the server is slow. I've already pointed out the first place you should look when that complaint comes up. But what if they're more precise? What if you know, or at least suspect, you have a problem with a query? How do you get information about how queries are behaving in SQL Server?

It's not enough to know that you have a slow query or queries. You need to know exactly how slow they are. You must measure. You need to know how long they take to run, and you need to know how many resources they use while they run. You need these numbers so that, after you do something to try to help the query, you can tell whether or not you've actually improved performance. To measure query performance, you have a number of choices, and each choice has positives and negatives associated with it. I'm going to run through my preferred mechanisms for measuring query performance and explain why I favor them. I'll also list some of the other mechanisms available and tell you why I don't like them. Let's get started.

Dynamic Management Views

Since SQL Server 2005, Dynamic Management Views (DMV) and Functions (DMF) have been available for accessing all sorts of information about the server. Specifically, there are a few DMVs that are focused on queries and query performance. If you go back through my blog, you can find tons of examples where I illustrate their use. You can also see them at work in commercial tools and free tools. Adam Machanic's sp_WhoIsActive, a free tool, makes extensive use of DMVs. To learn more about DMVs, you can download a free book, Performance Tuning with SQL Server Dynamic Management Views. DMVs are available in Azure SQL Database, Azure SQL Data Warehouse, and all editions of SQL Server.

The information captured by DMVs is an aggregation of all the times the query has been run. This means you can’t find out how long the query ran at 3 PM yesterday. You can, though, see the minimum and maximum time the query took, as well as the average. That view of aggregated behavior is what makes DMVs useful. However, another important point about DMVs is that they only collect information while a query is in memory. As soon as the query leaves the cache (the area of memory where it is stored), so does all the aggregated information about it in the DMVs.
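For example, a quick way to see the most expensive statements currently in cache is to query sys.dm_exec_query_stats along with sys.dm_exec_sql_text. The following is just a minimal sketch, not a finished tuning script; adjust the columns and the ORDER BY to whichever metric you're chasing.

-- Top statements by cumulative elapsed time, with min/max/average values.
-- Everything here is aggregated from the moment the plan entered the cache.
SELECT TOP (10)
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text,
    qs.execution_count,
    qs.min_elapsed_time,
    qs.max_elapsed_time,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;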

You use the DMVs for a general understanding of how a query is behaving. They're not meant for detailed, long-term collections of information about queries. For that, we use other tools.

Extended Events

Introduced in SQL Server 2008, Extended Events (ExEvents) are mechanisms for capturing detailed information about SQL Server and the processes within. One of those processes is how queries behave. I have multiple examples on this blog on using ExEvents. You can't go wrong reading about them on Jonathan Kehayias' blog. Extended events are available in Azure SQL Database and all editions of SQL Server.

When you need to know every query against a database or each time a particular query is called, and all the details associated with the query (reads, writes, duration), ExEvents are the way to go. ExEvents are very lightweight on the server (but not free) and can be filtered so that you capture just the information you need.
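As a rough sketch, a session that captures completed batches and procedure calls for a single database might look like the following. MyDatabase and the output file path are placeholders here; point them at your own database and at a drive with room for the .xel files.

-- Capture completed RPC calls and batches (duration, reads, and writes come
-- with the events), filtered to one database, written to an event_file target.
CREATE EVENT SESSION QueryPerformance
ON SERVER
    ADD EVENT sqlserver.rpc_completed
        (ACTION (sqlserver.sql_text)
         WHERE (sqlserver.database_name = N'MyDatabase')),
    ADD EVENT sqlserver.sql_batch_completed
        (ACTION (sqlserver.sql_text)
         WHERE (sqlserver.database_name = N'MyDatabase'))
    ADD TARGET package0.event_file
        (SET filename = N'C:\PerfData\QueryPerformance.xel');
GO

ALTER EVENT SESSION QueryPerformance ON SERVER STATE = START;

The WHERE clauses are where the careful filtering happens; the tighter they are, the less data you have to wade through later.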

The information is detailed and not aggregated; it's raw. The real issue with capturing this data is the sheer volume you'll be collecting. Testing, and careful filtering to ensure you're not dealing with too much information, is important. Prior to SQL Server 2012, there was no graphical user interface for reading ExEvent data, so you would have been forced to run queries against the XML in which the information is captured. With the tools available in SQL Server Management Studio, this is no longer the case.

You use ExEvents when you need specific and detailed information about a query. ExEvents are not so good for generalized monitoring.

Query Store

Introduced in Azure SQL Database and first released in SQL Server with 2016, Query Store is another mechanism for capturing aggregated information about queries. As before, I have examples of how to work with Query Store on my blog. You can also find quite a bit on it over at Simple-Talk. Query Store is still pretty specialized and only available in Azure and SQL Server 2016, but it is in all editions of SQL Server 2016.

Query Store captures information similar to what is available in the DMVs. However, unlike the DMVs, the information that Query Store captures is kept around, even after a query ages out or is removed from the cache. This persistence makes Query Store very exciting. You do have to choose to turn it on for each database you wish to capture queries for, so it's not automatic like DMVs. But the capture processes are asynchronous, so they should be relatively lightweight for most databases.
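Turning it on is a one-line change per database, and the aggregated numbers come back from a handful of catalog views. Here's a minimal sketch, with MyDatabase standing in for your own database.

-- Enable Query Store for one database.
ALTER DATABASE MyDatabase SET QUERY_STORE = ON;
GO

-- Once data has been captured, pull back aggregated runtime statistics.
USE MyDatabase;
GO
SELECT qt.query_sql_text,
       rs.count_executions,
       rs.avg_duration,
       rs.max_duration,
       rs.avg_logical_io_reads
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
    ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON p.plan_id = rs.plan_id
ORDER BY rs.avg_duration DESC;

The durations are reported in microseconds, and the statistics are bucketed by time interval, so you may see more than one row per plan.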

You use the Query Store when you need to capture query metrics over the long term, don't need detailed information, and aggregations work well for you.


There are other ways to measure query performance. You can use the Profiler GUI, but it has a serious negative impact on the server. You can bring a server down by using it, so it should be avoided.

Profiler can also script out server-side traces, which can be used to monitor your server. However, traces have a much higher impact than ExEvents, and they're on the deprecation list. Microsoft is not adding new trace events for new functionality, so traces become less and less useful with each release. You also can't use trace against Azure.

If you're writing a query and just want to see how long it takes to run, you can use SET STATISTICS TIME ON to capture the execution time. This is a handy way to quickly measure performance. You can also capture reads and writes using SET STATISTICS IO ON, but while this does capture the metrics we need, it adds considerable overhead to the query, skewing the measurement. This is why I stick to ExEvents when I need an accurate measure.
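For a quick, ad hoc check in Management Studio, that looks something like this (the SELECT is just a stand-in for whatever query you're testing):

SET STATISTICS TIME ON;   -- reports parse/compile and execution times
SET STATISTICS IO ON;     -- reports reads and writes, at a cost in overhead
GO

SELECT COUNT(*)
FROM sys.objects;         -- placeholder for the query under test
GO

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;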

People can disagree about the best way to capture query performance. I have my preferences, as you can see. However, I'm fairly certain everyone would agree that it's important to know how to capture performance metrics so you can assert, in a measured fashion, that performance has increased or decreased. You don't want to guess at query performance; you want to know.



Published at DZone with permission of Grant Fritchey, DZone MVB.
