Assertions in JMeter compare the actual results of requests to the expected results. This post explains why, even though there are ways to get by without assertions, it is highly recommended to use them.
When we plan load test workloads, assertions are often left out. The primary reason is that in load or performance testing, the response status code serves as one of the main metrics for analyzing results, so there is little need to verify the response data of each request; what we do need are the response time, latency, and response code. In theory, then, we don't need JMeter assertions, but in practice ignoring them can play havoc with the final results.
A 200 Status Code Doesn't Always Mean Success
In order to illustrate the importance of assertions in JMeter, I want to share a real case which happened to me while load testing a SaaS service.
Before the release, I was asked to perform a load test of the SaaS company's website, which provides services in the tourism industry. The test scenario was not complex—it included a sign-in, some links, and a sign-out. It seemed pretty straightforward, and I wasn't expecting anything to go wrong.
When the JMeter script was ready, I uploaded the JMX file into BlazeMeter and began to run the test. All the requests passed with a 200 status code, and the average response time was under 1,000 ms, which suggested the web application was very fast. Increasing the number of concurrent users and running the load test again did not change the average response time; the application still loaded very quickly, without any errors.
I began to think. On the one hand, it was good that the application was stable; on the other, it was strange that the average response time remained the same.
Before sending the load test report to my client, I decided to double-check the results locally. When all requests are reported as successful, it is difficult to find a root cause, so after a long investigation I decided to add assertions to each request, and I was amazed by what the Assertion Results listener showed. This listener displays the label of every sampler whose assertions were evaluated, and it marked almost 90% of the requests as failed, while in the View Results Tree listener, which shows a tree of all sample responses and lets you view the response of any sample, everything looked fine.
I discovered that the root cause of the discrepancy was the sign-in sampler. The CSV data used for credentials was outdated, but instead of returning a 4xx status code (which would have flagged the request as unsuccessful), the server was redirecting to a maintenance page with a 2xx (successful) status code. The rest of the requests were likewise returning the maintenance page content, and that page was hosted on a CDN. So while we thought we were loading the company's web servers, in reality all requests were being served by a CDN.
As a result of this experience, we can conclude that a successful response status code by itself does NOT always indicate success, and that JMeter assertions can catch this problem.
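In JMX terms, a safeguard against this kind of false positive is a Response Assertion that checks the response body for text that only appears on the genuine page, in addition to the status code. A sketch of such an element as JMeter saves it in a JMX file (the test name and the asserted string are placeholders for your own page's content; note that the property name "Asserion.test_strings", with the missing "t", is JMeter's own historical spelling):

```xml
<ResponseAssertion guiclass="AssertionGui" testclass="ResponseAssertion"
                   testname="Verify signed-in page content" enabled="true">
  <collectionProp name="Asserion.test_strings">
    <!-- Placeholder: text that appears only after a successful sign-in -->
    <stringProp name="0">My Account</stringProp>
  </collectionProp>
  <!-- Check the response body, not just the status code -->
  <stringProp name="Assertion.test_field">Assertion.response_data</stringProp>
  <boolProp name="Assertion.assume_success">false</boolProp>
  <!-- 2 = "Contains" match type -->
  <intProp name="Assertion.test_type">2</intProp>
</ResponseAssertion>
```

With this in place, a maintenance page served with a 2xx status code would have been marked as a failed sample immediately.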
When To Use JMeter Assertions
The above case is not the only example of when JMeter assertions are important to use in load testing applications. For instance, in API performance testing, JMeter assertions are a must-have in any test script.
Here are some recommendations of when to use JMeter assertions:
- With requests that use data from "CSV Data Set Config"
- With HTTP POST/PUT/PATCH methods
- After signing in and signing out
- When 2xx status codes are returned in both positive and negative testing
- In API performance testing
- In functional testing
- With SOAP/XML-RPC requests
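For API performance testing in particular, a Duration Assertion is a lightweight way to turn a response-time budget into a pass/fail criterion. A sketch of the JMX element (the 1,500 ms budget is an arbitrary example value):

```xml
<DurationAssertion guiclass="DurationAssertionGui" testclass="DurationAssertion"
                   testname="Respond within 1.5 s" enabled="true">
  <!-- Fail the sample if the response takes longer than 1500 ms -->
  <stringProp name="DurationAssertion.duration">1500</stringProp>
</DurationAssertion>
```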
Word of Warning: Assertions Consume Memory
The official JMeter documentation advises to "use as few assertions as possible", and the main reason for not adding them to all samplers is resource consumption. Using assertions indiscriminately in high-load testing can cause performance issues or even Out Of Memory errors.
Here is the list of assertions I would suggest you avoid using in your load tests, since each of them parses the whole response:
- XPath Assertion
- XML Assertion / XML Schema Assertion
- HTML Assertion
- Compare Assertion
To measure the impact of the assertions above, I benchmarked JMeter itself, using a JMX file from a "JMeter Performance evolution across versions" test. The test environment:
- JMeter version: 2.13
- Tomcat version: 6.0.39
- Tomcat JVM: -Xmx128M
- JMeter JVM: -Xmx512m
- Java version: "1.7.0_95"
- Ubuntu: 14.04.3 LTS
- JMeter and Tomcat are on the same machine
- No particular OS Tuning
- Simple test plan using Tomcat examples
The Test Plan parameters:
- Ramp-up period: 100 seconds
- Number of Threads: 1500
- Duration: 10 minutes (the test comes to a forced stop at the 10-minute mark)
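Expressed as a JMX Thread Group, those parameters look roughly like this (property names as JMeter writes them when saving a test plan; durations are in seconds, and the test name is a placeholder):

```xml
<ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup"
             testname="Benchmark Users" enabled="true">
  <stringProp name="ThreadGroup.on_sample_error">continue</stringProp>
  <stringProp name="ThreadGroup.num_threads">1500</stringProp>
  <stringProp name="ThreadGroup.ramp_time">100</stringProp>
  <!-- Scheduler enforces the 10-minute forced stop and startup delay -->
  <boolProp name="ThreadGroup.scheduler">true</boolProp>
  <stringProp name="ThreadGroup.duration">600</stringProp>
  <stringProp name="ThreadGroup.delay">7</stringProp>
</ThreadGroup>
```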
- Startup Delay: 7 seconds
1. Benchmark Testing Without Assertions
2. Benchmark Testing With Response Assertions
3. Benchmark Testing with XPath Assertions
As you can see from the charts above, RAM usage is stable across all tests. Analyzing the CPU results, there are no significant changes in the test with Response Assertions, but in the test with XPath Assertions, CPU consumption increased by 10%. The test plan included only 4 samplers and 4 XPath Assertions; adding more assertions would drastically increase CPU consumption and lead to performance issues.
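For reference, an XPath Assertion of the kind used in this benchmark looks roughly like this in JMX; the extra CPU cost comes from parsing the whole response into a DOM for every sample (the XPath query is a placeholder):

```xml
<XPathAssertion guiclass="XPathAssertionGui" testclass="XPathAssertion"
                testname="XPath Assertion" enabled="true">
  <boolProp name="XPath.negate">false</boolProp>
  <!-- Placeholder query: requires the page to have a <title> element -->
  <stringProp name="XPath.xpath">//title</stringProp>
  <boolProp name="XPath.validate">false</boolProp>
  <boolProp name="XPath.whitespace">false</boolProp>
  <!-- Tolerant mode runs the response through an HTML-to-XML cleanup first -->
  <boolProp name="XPath.tolerant">true</boolProp>
</XPathAssertion>
```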
JMeter assertions are a must-have in load and performance testing, especially when a server returns dynamic data with non-standard response codes. Ignoring JMeter assertions might be acceptable when the server returns static plain-text data, but even then extra verification is highly recommended. A side effect of assertions is resource consumption, so you can't add each and every assertion to your load test.
I recommend balancing performance and functionality when writing a test plan in JMeter. Especially in distributed load testing, getting this balance right can save significant time and money.
If you’d like to learn more, please sign up for our free online JMeter training course.