How to Achieve Better Accuracy in Latency Percentiles in JMeter Dashboard

When doing performance tests, accuracy is all important. See how two performance engineers used JMeter to increase the accuracy of their tests.

By Malith Jayasinghe and Viraj Salaka · Dec. 06, 17 · Analysis

Introduction

There are a number of ways to evaluate the performance of a system using the data collected during a performance test. Latency analysis is one such important technique, in which we analyze the behavior of the latency. This analysis can be as simple as calculating the mean latency or latency percentiles, or it can be a rather complex process in which we fit distributions to the data to study the characteristics of the latency distribution.

The "latency percentile" is an important performance metric used to analyze latency. Since it gives the value below which a given percentage of request latencies fall, it can be considered a measure of the quality of service of the application or system being evaluated. For example, if the 99th latency percentile of your system is 5 ms, then 99% of the requests served by the system had a latency below 5 ms. For large datasets, there are methods that estimate latency percentiles rather than compute them exactly, and the accuracy of the results they produce can vary depending on the underlying algorithm and its parameters.
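As a concrete illustration (not JMeter's implementation), a nearest-rank percentile can be computed like this; the latency values are made up:

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the smallest sample value such that at
    least p% of all samples are less than or equal to it. (This is one
    common definition; estimators, including JMeter's, may differ.)"""
    ordered = sorted(samples)
    rank = max(1, math.ceil(p * len(ordered) / 100))
    return ordered[rank - 1]

latencies_ms = [1, 1, 2, 2, 3, 3, 4, 4, 5, 100]  # hypothetical latencies
print(percentile(latencies_ms, 99))  # a single slow request dominates the tail
```

Note how the tail percentiles are sensitive to a handful of slow requests, which is exactly why an estimator that drops samples can report very different values.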

Apache JMeter™ is a great tool designed to load test the functional behavior of applications and measure their performance. At WSO2, we use JMeter to test the performance of most of our products. JMeter has a great set of features, such as the ability to test various protocols/applications/servers, an IDE that allows fast test plan development, dynamic HTML reporting, multithreading and scriptable samplers, and the ability to test with a large number of concurrent users (achieved by running multiple instances of JMeter).

When we run performance tests, we can configure JMeter to create text files containing the results of the test. These files are called JTL files. Since a JTL file contains the latency value of each request, we can use this information for latency analysis. This can be done using the various listeners (e.g. Aggregate Report) already available in JMeter, or by loading the JTL file into statistical software such as R.
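As a sketch of that kind of offline analysis, the exact percentiles can be computed directly from a JTL file saved in CSV format. The file path here is hypothetical; the column name follows JMeter's default CSV header:

```python
import csv
import math

def jtl_percentiles(path, column="Latency", percentiles=(90, 95, 99)):
    """Compute exact nearest-rank percentiles for one numeric column of
    a JTL file saved in CSV format with a header row."""
    with open(path, newline="") as f:
        values = sorted(int(row[column]) for row in csv.DictReader(f))
    return {p: values[max(1, math.ceil(p * len(values) / 100)) - 1]
            for p in percentiles}
```

For example, `jtl_percentiles("results.jtl")` would return a dict mapping 90, 95, and 99 to the corresponding exact percentile values for the whole run.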

JMeter Dashboard

Recently, we started using the JMeter Dashboard to obtain performance results. The JMeter Dashboard can generate graphs and statistics from the JTL file. While analyzing the latency percentile values in the JMeter Dashboard, we noticed that, for certain scenarios we tested, there was a significant difference between the actual (exact) latency percentile values and the percentile values calculated by the dashboard. The exact values were calculated using R (a statistical software package). Interestingly enough, JMeter's Aggregate Report produced the same result as R.

For example, see the following result:

|                                    | Average Latency | 90th Percentile | 95th Percentile | 99th Percentile | Throughput |
|------------------------------------|-----------------|-----------------|-----------------|-----------------|------------|
| Dashboard                          | 87.47           | 170             | 996             | 2296.97         | 5706.3     |
| Exact value (R / Aggregate Report) | 87.47           | 70              | 321             | 2009            | 5706.3     |

Note the following:

There is no difference in the average latency.
There is no difference in the throughput.
The 90th percentile is significantly higher in the dashboard.
The 95th percentile is significantly higher in the dashboard.
The 99th percentile is higher in the dashboard.

The above result was obtained by loading the JTL file of a 10-minute performance test. The total duration of the test was 15 minutes, and the first 5 minutes were the warm-up period. The total number of requests in the test was 3,421,980.

Improving Accuracy in the Latency Percentiles

The way to address the above is to increase the default value of the following property: jmeter.reportgenerator.statistic_window. Note that this property only affects the latency percentile values (it is only used in the PercentileAggregator class, the component that implements latency percentile calculation in JMeter).
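For example, the property can be raised in user.properties, or passed for a single dashboard-generation run with JMeter's -J command-line flag (the file and folder names below are hypothetical):

```properties
# user.properties — read by JMeter at startup.
# The same effect per run:
#   jmeter -g results.jtl -o dashboard-report -Jjmeter.reportgenerator.statistic_window=200000
jmeter.reportgenerator.statistic_window=200000
```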

The following table shows the impact of statistic_window on the results (number of samples = 3,421,980):


|                                           | Average Latency | 90th Percentile | 95th Percentile | 99th Percentile | Throughput |
|-------------------------------------------|-----------------|-----------------|-----------------|-----------------|------------|
| Dashboard: statistic_window=20k (default) | 87.47           | 170             | 996             | 2296.97         | 5706.3     |
| Dashboard: statistic_window=200k          | 87.47           | 81              | 394             | 2057            | 5706.3     |
| Dashboard: statistic_window=500k          | 87.47           | 72              | 355.95          | 2013            | 5706.3     |
| Dashboard: statistic_window=1000k         | 87.47           | 70.9            | 336             | 1993            | 5706.3     |
| Dashboard: statistic_window=3000k         | 87.47           | 71              | 324             | 2017            | 5706.3     |
| Dashboard: statistic_window=10000k        | 87.47           | 70              | 321             | 2009            | 5706.3     |
| Dashboard: statistic_window=3421980       | 87.47           | 70              | 321             | 2009            | 5706.3     |
| Exact value (R / Aggregate Report)        | 87.47           | 70              | 321             | 2009            | 5706.3     |

statistic_window = sample count

When statistic_window = total number of samples, we get 100% accuracy (i.e. the exact values) in the dashboard results.

jmeter.reportgenerator.statistic_window < sample count

When jmeter.reportgenerator.statistic_window < sample count, only the last statistic_window samples in the JTL file are used to calculate the latency percentiles, and this is why we do not get the exact result. The following diagram shows the samples used when statistic_window=20000 (i.e. the default).
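The effect can be sketched as follows (synthetic data; a simple nearest-rank percentile stands in for JMeter's internal estimator): when latencies worsen over the run, a tail-only window overstates the overall percentile.

```python
import math

def p90(values):
    """Nearest-rank 90th percentile of a list of latency values."""
    ordered = sorted(values)
    return ordered[max(1, math.ceil(90 * len(ordered) / 100)) - 1]

# 1800 fast responses followed by 200 slow ones (latencies drift upward).
samples = [10] * 1800 + [100] * 200

windowed = p90(samples[-500:])  # only the last statistic_window samples
exact = p90(samples)            # the whole JTL
print(windowed, exact)          # the windowed estimate far exceeds the exact value
```

Because the window holds only the most recent samples, any drift in latency over the course of the test is amplified in the reported percentiles.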

jmeter.reportgenerator.statistic_window = -1

We can get 100% accuracy (i.e. the exact result) in the latency percentiles by setting this property to -1, which makes the window unbounded.

Further Analysis

As pointed out above, the JTL file we analyzed consists of 3,421,980 samples. We now create multiple JTL files from the original, each consisting of 20,000 samples. For each of these JTL files, we compute the percentile values using the Dashboard report (with the default window size). In this case, the Dashboard report should produce an exact result for each file (since sample count = default window size = 20,000).
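A minimal sketch of this splitting step, using synthetic data and a nearest-rank percentile:

```python
import math

def p90(values):
    """Nearest-rank 90th percentile of a list of latency values."""
    ordered = sorted(values)
    return ordered[max(1, math.ceil(90 * len(ordered) / 100)) - 1]

def chunked_p90s(samples, chunk_size=20000):
    """Split one result set into consecutive fixed-size chunks (like the
    JTL 1 / JTL 2 / JTL 3 split) and return each chunk's 90th percentile."""
    return [p90(samples[i:i + chunk_size])
            for i in range(0, len(samples), chunk_size)]
```

If latencies drift upward over the run, the per-chunk percentiles drift with them, which is the variation visible in the results below.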

(Diagram: the original JTL split into JTL 1, JTL 2, and JTL 3.)
The objective is to investigate the deviation in the latency percentiles when you use different subsets of data from the original dataset. The results are shown below:


|                                                   | Average Latency | 90th Percentile | 95th Percentile | 99th Percentile | Throughput |
|---------------------------------------------------|-----------------|-----------------|-----------------|-----------------|------------|
| Sample 1 (exact value)                            | 87.8            | 72              | 285             | 1966.95         | 3047.85    |
| Sample 2 (exact value)                            | 113.5           | 130             | 614.95          | 2007.97         | 2979.74    |
| Sample 3 (exact value)                            | 134.39          | 170             | 996             | 2296.97         | 2598.75    |
| Exact value using all data (R / Aggregate Report) | 87.47           | 70              | 321             | 2009            | 5706.3     |

We note that there is significant variation in the percentile values among the different samples. In fact, for this particular test, the values become worse as time progresses. The possible reasons for this behavior are:

  • 20000 (default window size) samples are not sufficient to capture the behavior of the full latency distribution.

  • The system has not arrived at a steady state. This means that we need to increase the warm-up period.

  • Other reasons we have not covered.

JMeter Memory

It is worth pointing out that when you increase the statistic_window you may need to increase the memory you allocate for JMeter. In this particular test, we allocated 2GB of heap memory for JMeter.
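For example, one way to do this is shown below (a sketch; the JTL path and output folder are assumptions, and the HEAP variable is honored by recent JMeter startup scripts, while older versions require editing bin/jmeter directly):

```shell
# Give JMeter 2 GB of heap while regenerating the dashboard from an
# existing JTL with an unbounded statistics window.
HEAP="-Xms2g -Xmx2g" jmeter -g results.jtl -o dashboard-report \
    -Jjmeter.reportgenerator.statistic_window=-1
```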

Conclusion

In this article, we discussed the use of latency percentiles as a metric for measuring performance and how to increase the accuracy of the results that appear in the JMeter dashboard. We noted that there is a way to get the exact result (i.e. 100% accuracy): setting jmeter.reportgenerator.statistic_window = -1, i.e. an infinite window. However, when you set this property to -1, you may need to increase the amount of memory you allocate to JMeter, in particular if you have a large number of samples. If there is not enough memory, you can simply raise the default value of this property, which will still improve the accuracy of the results. We also investigated the impact of the window size on accuracy.


Opinions expressed by DZone contributors are their own.
