The Five Steps Required to Enable Quality@Speed
Speedy software delivery requires planning; learn how to incorporate continuous testing into your DevOps processes for faster delivery of quality software.
Everyone wants higher-quality software, faster. The demands on modern software development teams are immense: increased competition and market pressures, growing functionality and complexity, and higher expectations of product quality, security, and reliability. Agile development methods are often adopted because they promise to be more responsive to change and better at meeting customer requirements.
But agile and, more recently, DevOps are often sold as ways to get software done faster with fewer resources, even though that was never their intention. With as many as 70% of IT projects failing or falling short of their goals, smart development teams are looking to improve their development practices so they can not only succeed with a project but also create a repeatable process for future iterations and products. In this post, we'll look at how to achieve the agility that agile and iterative methods demand while delivering a product that meets and exceeds quality and security goals.
Continuous Testing Is the Answer to Quality@Speed
As it turns out, testing is both the problem and the solution to achieving better quality faster. In an agile process, most development steps can be shrunk to fit reasonably sized pieces of functionality to design and implement; however, integrating the new functionality is risky, and the scope of testing it is often unclear. As I discussed in a previous post, testing is one of the key reasons software teams struggle when adopting agile methods: teams lose the agility they are striving for because they get bogged down in testing too much or too little.
Continuous testing is widely seen as the solution to the issues faced by software teams adopting DevOps and agile development. Wikipedia defines continuous testing as “… the process of executing automated tests as part of the software delivery pipeline to obtain immediate feedback on the business risks associated with a software release candidate.” The definition is straightforward, but implementing continuous testing and optimizing it over time is another matter entirely, and that's what I'll focus on here.
Turning Your Ice Cream Cone Into a Pyramid
The ideal test pyramid defines where it’s best to invest time and effort in a project. In the ideal pyramid, you invest your valuable time and effort in a comprehensive suite of unit tests at the foundation, backed by a layer of API and service tests, with a much smaller number of system- and GUI-based tests at the top.
However, this pyramid is often inverted into what we call the ice cream cone: teams spend too much time and effort on brittle, complex system-level GUI tests that require full functionality to be implemented and integrated, resulting in tests that cannot be executed continuously during the earlier stages of the SDLC. The key to achieving successful continuous testing is to melt the ice cream cone and focus on creating automated unit and API tests that can be executed continuously as developers implement new functionality.
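A minimal sketch of the base of the pyramid, assuming a hypothetical order-pricing function (the names here are illustrative, not from any real project): fast, isolated unit tests like these can run on every build, long before the UI exists.

```python
import unittest

# Hypothetical business logic under test; names are invented for illustration.
def apply_discount(total, percent):
    """Return the total after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(total * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Foundation-of-the-pyramid tests: isolated, deterministic, cheap to run.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    def test_boundary_values(self):
        self.assertEqual(apply_discount(50.0, 0), 50.0)
        self.assertEqual(apply_discount(50.0, 100), 0.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Running a suite like this with `python -m unittest` on every build leaves the smaller set of API and GUI tests to cover only what unit tests cannot.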
The Five Steps to Enable Quality@Speed With Continuous Testing
- Build a foundation of unit tests by automating the process of creating, executing, and maintaining tests. Only by making the work of unit testing easier to create and maintain will development teams adopt project-wide unit testing for all components.
- Avoid relying on late-cycle, brittle, UI-centric tests, which end up being the most time-consuming and expensive to diagnose and fix. Rather than automating every manual testing scenario, invest in a solid foundation of unit and API tests to make sure the architecture behind the UI is sound in the first place.
- Understand code coverage up and down the entire pyramid, along with traceability to requirements and user stories; without it, development teams don’t really know what has and hasn’t been tested. Not understanding test coverage also means not knowing what to test at each level of the pyramid, so even minor changes require so much testing that they bog down the entire process. See my previous post about change-based testing.
- Shift left with service virtualization of application dependencies to enable API testing much earlier in the development lifecycle. Increased automation and earlier detection of bugs, including security, architectural, and performance defects, are critical to success.
- Accelerate agile development with change impact analysis on a per-build basis to understand the risk that each new iteration introduces. The analytics that change impact analysis provides are key to focusing testing on only what absolutely needs to be tested, rather than the shotgun approach used otherwise. Only through smart, data-based decision-making is real continuous testing viable.
How to Start Down the Path to Improvement
Unsurprisingly, the best way to get started is to review the test pyramid and then evaluate where a project currently stands. Is there a solid foundation of automated unit tests that are run on a per-build basis? Are as many product APIs tested with automation as possible? Is virtualization used? Does testing rely on a complex suite of manual UI tests that can’t be run until the system is almost complete? The path to improvement is based on building a proper test pyramid, automation, and data collection and analytics.
The suggested path to success includes the following:
- Adopt test automation for test creation, execution, and management, expanding the current unit test suite to cover as much of the product’s code as is reasonable.
- Use static analysis to analyze the entire code base, including legacy and third-party code, to help detect bugs and security vulnerabilities that testing might miss. Static analysis is also important for enforcing project coding standards.
- Reduce reliance on system-level and UI tests. Although system-level testing is still important and required, it shouldn’t come first, and it is not the time to discover critical architecture, performance, and security issues. Software teams can reduce their reliance on UI and system tests by building a solid foundation of unit and API tests; by following the other recommendations here, much of the system should be well proven before system-level testing begins.
- Leverage service virtualization to allow for automated API testing much earlier in development. Pushing API testing earlier helps discover critical aspects of the system, such as performance and architectural soundness. This is also an important phase for security testing.
- Use data analytics to decide what to test. Focusing the development team on the minimum set of tests to ensure proper coverage at each iteration is the key to bringing agility back to agile development methods.
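Service virtualization in miniature: the sketch below, assuming a hypothetical client that calls a downstream payments API (the URL and names are invented), swaps the live dependency for a canned response so the API test can run long before the real service is available.

```python
import io
import json
import urllib.request
from unittest import mock

# Hypothetical client code that depends on a downstream payments service.
def fetch_payment_status(order_id):
    url = f"https://payments.example.com/status/{order_id}"  # illustrative URL
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["status"]

def test_fetch_payment_status():
    # Virtualized dependency: a canned response stands in for the live service,
    # so the test needs no network and no deployed payments API.
    canned = io.BytesIO(b'{"status": "PAID"}')
    with mock.patch("urllib.request.urlopen", return_value=canned):
        assert fetch_payment_status(42) == "PAID"

test_fetch_payment_status()
```

Dedicated service-virtualization tools simulate richer behavior (latency, error rates, stateful flows), but the principle is the same: the dependency is replaced so API tests shift left.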
The numerous pressures on modern software development teams make it difficult to build products on time and on spec. New development methods such as agile development have helped teams focus on getting the right things built for the customer, but projects are still late and error-prone, with testing being a key aspect of development that continues to plague modern development methods. To gain significant improvements, adopt a solid foundation of automated unit tests and perform API testing early and often via service virtualization. And don't forget that testing outcomes improve greatly with the use of data analytics to drive test management.
Published at DZone with permission of Mark Lambert, DZone MVB. See the original article here.