“Fitting” Performance into the Software Development Lifecycle
This is a short excerpt from Chapter 1 of Pro .NET Performance, scheduled to appear in August 2012. I might be publishing a few more of these before and after the book is out. We have an Amazon page and a cover image now!
Where do you fit performance in the software development lifecycle? This innocent question carries the mental baggage of having to retrofit performance into an existing process. Although it is possible, a healthier approach is to consider every step of the development lifecycle an opportunity to understand the application’s performance better—first, the performance goals and important metrics; next, whether the application meets or exceeds its goals; and finally, whether maintenance, user loads, and requirement changes introduce any regressions.
During the requirements gathering phase, you should start thinking about the performance goals you would like to set.
During the architecture phase, you should refine the performance metrics important for your application and define concrete performance goals.
During the development phase, you should frequently perform exploratory performance testing on prototype code or partially complete features to verify that you are well within the system’s performance goals.
During the testing phase, you should perform significant load testing and performance testing to fully validate your system’s performance goals.
During subsequent development and maintenance, you should perform additional load testing and performance testing with every release (and preferably on a daily or weekly basis) to quickly identify any performance regressions introduced into the system.
Developing a suite of automated load tests and performance tests, setting up an isolated lab environment in which to run them, and analyzing their results carefully to make sure no regressions are introduced is a significant, time-consuming investment. Nevertheless, the benefits gained from systematically measuring and improving performance, and from making sure regressions don’t creep slowly into the system, are worth the initial cost of a robust performance development process.
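The kind of automated regression check described above can be sketched in just a few lines. The following C# example is illustrative only—the operation name and the 200 ms budget are assumptions, not something from the book. It warms up the code under test, times several iterations with System.Diagnostics.Stopwatch, and fails with a non-zero exit code when the average exceeds its budget, which makes it easy to wire into a daily or weekly automated run:

```csharp
using System;
using System.Diagnostics;

class PerformanceRegressionCheck
{
    // Hypothetical latency budget for the operation under test.
    const double BudgetMilliseconds = 200.0;

    static void Main()
    {
        // Warm up once so JIT compilation is excluded from the measurement.
        OperationUnderTest();

        // Measure several iterations and average them to reduce noise.
        const int iterations = 10;
        Stopwatch stopwatch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; ++i)
        {
            OperationUnderTest();
        }
        stopwatch.Stop();

        double average = stopwatch.Elapsed.TotalMilliseconds / iterations;
        Console.WriteLine("Average: {0:F2} ms (budget: {1} ms)",
                          average, BudgetMilliseconds);

        // Signal a regression to the automated build/test harness.
        if (average > BudgetMilliseconds)
        {
            Console.Error.WriteLine("Performance regression detected!");
            Environment.Exit(1);
        }
    }

    // Placeholder for the real work being measured.
    static void OperationUnderTest()
    {
        System.Threading.Thread.Sleep(10); // simulated work
    }
}
```

A sketch like this is no substitute for real load testing in an isolated lab, but checking a handful of key scenarios against explicit budgets on every release is often enough to catch a regression the week it is introduced rather than months later.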