Applying the Scientific Method to Performance
Need a structured approach to improving the performance of your web apps? Look no further than what you learned in high school science class.
The scientific method is an elegant, yet simple, means of uncovering the truth. In performance, the truths for which we search include:
- How fast is my site?
- What will cause my application to break?
- Where should I focus to get the greatest ROI?
- Will I survive an unexpected (or expected) spike?
- How well am I serving mobile users vs. desktop users?
As with any significant scientific discovery, there is a complex combination of variables that contributes to finding, and sometimes obscuring, the truth.
Great performance teams, perhaps without realizing it, adhere rather closely to the scientific method. One definition of “science” is “the practical application of observation and experimentation to the physical world”. The goal is discovery. Isn’t that what we do when we seek to optimize performance? For performance teams, it’s measuring the experiences of our users, experimenting with architectures, and testing infrastructure — all for the practical application of understanding performance in the digital world.
The scientific method is defined by a series of steps, a framework that provides a common, reproducible approach for all scientists to follow:
1. Ask a Question
Curiosity is likely one of the strongest drivers for those who enter the scientific community. The best performance team members are naturally curious. They look for new and creative ways to improve.
2. Do the Research
Performance engineers, web performance analysts, and performance architects get good at their jobs by understanding what affects performance, and what impact performance has on a user community. They look at the performance of competitive or peer websites to learn what makes them better, or worse. They work hard to understand the full depth and breadth of the application, website, infrastructure, and architecture.
3. Construct a Hypothesis
Scientists start with a hypothesis, a hunch. From that, the scientist develops a theory, which may eventually be established as a scientific law. For example, you may have heard that earlier this year scientists claimed the first direct evidence of the existence of gravitational waves, confirming one of the last unproven tenets of Einstein’s general theory of relativity.
Einstein started with two major hypotheses: that the laws of physics are the same everywhere and that the speed of light is constant. Performance teams may start with the hypothesis that user behavior is impacted by their experiences on your digital platforms. Our theory might then be that better performance on our product pages will result in higher conversion rates, or that faster load times of news pages will expose media site visitors to more ads. The next step is to support or disprove those theories.
4. Test With Experiments
Scientists have collaborated over time to put Einstein’s ideas to the test. When Einstein published his theories of relativity, not everyone was willing to immediately discard two centuries of Newtonian mechanics. So it was time to observe, to test, to measure, and to do so repeatedly.
Scientists, as well as performance teams, don’t trust a single result or a single user interaction to support or disprove a theory. Savvy teams don’t perform a single test in the run-up to Black Friday to assess their readiness. They understand that having more data is good, and that including ALL of the data in an analysis is even better.
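The "trust many measurements, not one" principle is easy to sketch in code: rather than judging readiness from a single load-time sample, summarize the whole distribution. (A minimal illustration; the sample values and the `summarize_load_times` helper are invented for this sketch, not part of any product.)

```python
import statistics

def summarize_load_times(samples_ms):
    """Summarize many load-time measurements instead of trusting one."""
    ordered = sorted(samples_ms)
    # Nearest-rank 95th percentile: the value 95% of samples fall at or below.
    p95_index = max(0, int(len(ordered) * 0.95) - 1)
    return {
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[p95_index],
        "samples": len(ordered),
    }

# One lucky measurement (900 ms) would look fine; the full data set does not.
samples = [900, 1200, 1400, 1500, 1600, 2100, 2300, 2500, 3200, 4800]
print(summarize_load_times(samples))
```

The gap between the best single sample and the 95th percentile is exactly the kind of truth that a one-off test hides.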
5. Analyze Data
The data used by scientists include both quantitative data, the numbers that we measure, and qualitative data, the observed behavior. With SOASTA mPulse, for example, you gather important performance metrics and then correlate those to behavior: what does a real user do on my site under various conditions?
The data scientist is an increasingly common member of the performance team. Given enough data points (the difference between sampling and having all of the data), the data scientist can develop reproducible output that not only proves certain correlations, but also becomes predictive.
6. Draw Conclusions and Communicate Results
It’s well documented that performance has an impact on user behavior, but exactly how much? Conversion impact and What-if analysis in SOASTA mPulse can help you identify where to focus your performance improvement efforts, and the projected financial return. Finding the breaking point of your infrastructure — before it actually breaks — is critical for developing strategies for preventing an outage. A mobile-specific approach may well be needed to service an increasingly important customer segment.
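The arithmetic behind that kind of what-if analysis can be shown in a few lines. This is a deliberately simplified sketch, not the model mPulse uses; every input figure, and the assumed linear uplift per 100 ms saved, is invented for illustration:

```python
def project_monthly_uplift(sessions, conversion_rate, order_value,
                           ms_saved, uplift_per_100ms):
    """Project added monthly revenue from a load-time improvement.

    Assumes a linear relationship: each 100 ms saved lifts the
    conversion rate by `uplift_per_100ms` (an absolute delta).
    """
    new_rate = conversion_rate + (ms_saved / 100) * uplift_per_100ms
    baseline = sessions * conversion_rate * order_value
    projected = sessions * new_rate * order_value
    return projected - baseline

# Hypothetical inputs: 1M sessions/month, 3% conversion, $80 average order,
# and a modeled +0.05 percentage-point lift per 100 ms saved.
extra = project_monthly_uplift(1_000_000, 0.03, 80.0,
                               ms_saved=400, uplift_per_100ms=0.0005)
print(f"Projected extra revenue: ${extra:,.0f}/month")
```

Even a toy model like this makes the conclusion concrete enough to communicate: a specific load-time improvement maps to a specific projected return, which is what earns performance work its place on the roadmap.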
Those are just a few areas of insight to be gleaned from the data. And the ability to communicate the information in compelling ways — such as the globe in mPulse or the correlation of disparate data series in SOASTA CloudTest — is critical to supporting the actions you then take.
Following the Scientific Method Is a Team Game
Hundreds, if not thousands, of scientists participated in supporting Einstein’s theories. It’s not working in a vacuum, but rather the sharing of ideas, that drives advancement.
In performance, collaboration is key. Many people contribute to testing and supporting our readiness. Developers are increasingly common participants in measuring and testing performance. Ops and business users play critical roles, while a strong performance team orchestrates these efforts.
The Digital World, Just Like the Physical World, Is Complex
Conditions change, which requires analysis, repetition, and responsiveness to support your readiness for a big media event, a marketing promotion, the release of the hot new athletic shoe, or even a simple change to the website.
Published at DZone with permission of Dave Murphy, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.