I wish the term “DevQAOps” sounded as cool as “DevSecOps.” But alas, DevQAOps is hard to say, and DevOps is the term that IT professionals have come to know and love today.
Unfortunately, the term DevOps leaves one key component of software delivery out of the mix: Quality Assurance, or QA. QA is an essential part of DevOps, but it sometimes gets short shrift in the DevOps conversation.
To help give QA its due, in this post, I’d like to point out all of the ways in which QA teams play a vital role in DevOps and continuous delivery workflows. I may not be able to rewrite the DevOps lexicon, but I hope at least to get people to think more about how important QA is in our DevOps-centric world.
QA Shifts Both Left and Right
With DevOps, Dev teams look to shift right, assuming responsibility for the reliability of the applications they build, and not leaving it fully to Ops. Similarly, Ops shifts left in an attempt to influence the features and ideas that are spawned in planning and development cycles. However, between these two, QA teams shift both left and right, and are the key enablers for Dev and Ops teams to collaborate effectively. QA teams have a role in defining which features are prioritized by Dev, and ensuring code is of high quality when deployed.
This elevated view of QA requires not just that QA see itself differently, but that Dev and Ops teams see QA differently, too. Too often, QA takes the blame for bad development practices and for any faults that show up post-release. In a DevOps culture, developers take responsibility for the code they ship. Mark Hrynczak of Atlassian, talking about the future of QA, argues that a great developer is someone who “can capably write and ship high-quality, bug-free code,” and doesn’t insist that QA catch all their mistakes. The biggest change DevOps brings to QA is that quality becomes everyone’s problem, not just QA’s.
DevOps Quickens QA
DevOps involves continuous integration and continuous delivery, which break the software delivery lifecycle down into small chunks that are easier to manage, limited in scope, and released more frequently. The pace of development has quickened, and Ops has to cope with multiple releases every day. All this busyness on either side means that QA also needs to work at a much faster pace. Common friction points between QA and other teams—like “works on my machine,” keeping up with version changes, and unclear product specs—are all compounded by DevOps.
Enabling QA teams to perform in this new culture requires a completely new approach to testing, which also means a different toolset. It takes test infrastructure that’s agile and reliable. That’s the problem that Docker and the container ecosystem are solving today. While Docker shot to fame in developer circles, QA took some time to catch on. But now, Docker’s value in powering testing infrastructure is clear, and QA is no longer on the back foot when it comes to container adoption.
Containers: Consistency Across the Pipeline
DevOps is about maintaining consistency across every stage of the development pipeline. Previously, Dev would throw code over the wall for Ops to deploy, and Ops would often push back, saying the code was not deployment-ready. The way out of this deadlock is to have QA enforce quality right from the start.
For Dev and QA to speak the same language, they need an easy way to share all the configuration needed to run an application. The state of an application when an error occurs is vital for troubleshooting, and in the past that state was captured manually in bug tracking systems. Now, with Docker, Dev and QA can share an application’s configuration and state in a Docker image, which functions like a snapshot of the application.
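As a minimal sketch of what that snapshot looks like in practice (the app, image name, and tag here are hypothetical), a developer can package the application and its exact runtime setup into an image and hand it to QA as-is:

```shell
# Hypothetical example: capture an app plus its runtime configuration
# in a Dockerfile, so QA runs exactly what Dev ran.
cat > Dockerfile <<'EOF'
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
EOF

# Build the snapshot, tagged with the bug it reproduces...
docker build -t myapp:issue-1234 .

# ...then share it through a registry, or export it as a file
# that can be attached to the bug report.
docker save -o myapp-issue-1234.tar myapp:issue-1234
```

QA can load that tarball with `docker load` and see the application in the same state Dev saw it, instead of reconstructing the environment from notes in a ticket.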
This consistency doesn’t just help with troubleshooting. It also helps with improving the reliability of an application, as the same container image that’s built and tested is deployed. This brings confidence at every step of the software delivery lifecycle, and improves user experience.
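One common way to realize “test what you ship” is to promote the tested image by re-tagging it rather than rebuilding. This is a sketch only; the registry name, tags, and test command are assumptions:

```shell
# Promote the exact image that passed tests, rather than rebuilding.
# Registry, image name, and tags below are illustrative.
IMAGE=registry.example.com/myapp

docker build -t "$IMAGE:build-42" .
docker push "$IMAGE:build-42"

# Run the test suite against the built image (test command is hypothetical).
docker run --rm "$IMAGE:build-42" pytest

# If tests pass, re-tag the same bits for the next stage.
# No rebuild happens, so what was tested is exactly what ships.
docker tag "$IMAGE:build-42" "$IMAGE:staging"
docker push "$IMAGE:staging"
```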
However, containerizing applications and infrastructure is not simple. While Docker provides the standard container runtime, there are bigger concerns like replication, failover, and automated deployments. This is where an orchestration tool like Kubernetes has a key role to play.
Container Orchestration for Reliable Deployments
Kubernetes makes it easy to create and manage multiple environments for test, staging, and production. You could run, for example, test and staging environments in the same cluster and isolate them using Kubernetes’ namespaces feature. The benefits are simpler management, reduced costs, and better resource utilization; the end result is faster and better QA processes.
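The namespace approach can be sketched in a few kubectl commands (the namespace names and manifest file here are hypothetical):

```shell
# Isolate test and staging inside one cluster using namespaces.
kubectl create namespace test
kubectl create namespace staging

# Deploy the same manifests into each namespace independently;
# each environment gets its own copies of the Deployments and Services.
kubectl apply -f app.yaml --namespace test
kubectl apply -f app.yaml --namespace staging

# Inspect one environment without touching the other.
kubectl get pods --namespace test
```

Because Services resolve by short name only within their own namespace, the two environments can run side by side without stepping on each other’s wiring.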
To keep up with the fast pace of development, you need to automate builds, testing, and deployment. Leaving automated testing out will only lead to unstable, low-quality releases. Luminis Technologies adopted Kubernetes for container orchestration and saw big improvements in reliability, largely because of how easy Kubernetes makes it to test and ship containers.
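A build-test-deploy chain like the one described can be sketched as a short pipeline script. Everything here is illustrative: the image name, registry, test command, and the `CI_COMMIT_SHA` variable are assumptions, not prescriptions:

```shell
#!/bin/sh
# Sketch of a minimal CI stage chain. Names and commands are hypothetical.
set -e   # stop the pipeline on the first failure

IMAGE=registry.example.com/myapp

docker build -t "$IMAGE:$CI_COMMIT_SHA" .        # 1. build

docker run --rm "$IMAGE:$CI_COMMIT_SHA" pytest   # 2. automated tests gate the release

docker push "$IMAGE:$CI_COMMIT_SHA"              # 3. publish the tested image

kubectl set image deployment/myapp \
  myapp="$IMAGE:$CI_COMMIT_SHA" \
  --namespace staging                            # 4. roll it out to staging
```

The key property is step 2: because the script exits on the first failure, nothing that fails its tests ever reaches the push or deploy steps.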
Running staging and production environments in the same cluster, however, is not ideal, as it creates a whole new set of challenges in ensuring the staging environment doesn’t starve production of resources. Still, the benefits of Kubernetes for test infrastructure are hard to overlook.
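If you do share a cluster, Kubernetes’ ResourceQuota object is the standard guardrail: it caps what a namespace can consume. The namespace name and limits below are purely illustrative:

```shell
# Cap what the staging namespace can consume so it can't starve
# production of resources. Numbers here are illustrative only.
kubectl apply -f - <<'EOF'
apiVersion: v1
kind: ResourceQuota
metadata:
  name: staging-quota
  namespace: staging
spec:
  hard:
    requests.cpu: "4"
    requests.memory: 8Gi
    limits.cpu: "8"
    limits.memory: 16Gi
EOF
```

With the quota in place, pods in staging that would exceed these totals are simply refused, leaving production’s share of the cluster untouched.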
Discussions around DevOps have too often left QA out of the picture, even though QA is essential to faster, higher-quality releases within a continuous delivery pipeline.
QA is in a unique spot where it shifts both left and right to align with Dev and Ops teams to build in quality, reliability, and sustainable speed from the start. To make this happen, QA needs a new kind of infrastructure that’s powered by containers and managed by container orchestration tools. Docker and Kubernetes together give QA teams the wings they need to take flight in this new world of DevOps.