Approaching Agile Testing
As many are aware (or should be), "Agile Testing" is not a completely different testing procedure; rather, it is a testing approach that aligns with the principles of Agile software development.
But how? The most salient aspect is that it emphasizes testing, and close coordination with the end users (or at least with the story owners), throughout the project life cycle.
Additionally, in this context, testing gets involved "as early as possible". We often call this the "shift left" approach, where testing engages as early in the life cycle as possible. However, "testing early" only works as long as the development team and the available infrastructure provide adequate support by delivering a successful build to the testing team each iteration. This is easily achievable with a CI/CD pipeline and a bare minimum of orchestration.
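To make the "shift left" gate concrete, here is a minimal sketch of the idea: a build is handed to testers only after it compiles and its smoke suite passes. All stage and function names are illustrative stand-ins, not tied to any specific CI tool.

```python
# Hypothetical build gate: promote a build to the testing team only if
# every step succeeds. Each step is a callable returning True/False;
# real pipelines would shell out to build/test/deploy commands instead.

def run_stage(name, steps):
    """Run each step in order; stop and report at the first failure."""
    for step in steps:
        if not step():
            return f"{name}: FAILED at {step.__name__}"
    return f"{name}: OK"

def compile_ok():
    return True  # stand-in for the actual build command

def smoke_suite_ok():
    return True  # stand-in for the automated smoke tests

def promote_to_test_env():
    return True  # stand-in for deploying the build for testers

result = run_stage(
    "nightly-build",
    [compile_ok, smoke_suite_ok, promote_to_test_env],
)
print(result)  # nightly-build: OK
```

The point of the gate is ordering: promotion to the test environment is the last step, so testers never receive a build that failed compilation or smoke testing.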
Agile Testing Evolution
As maturity has increased over time, Agile testing has become more integrated throughout each phase of the project life cycle. We see each feature being "fully tested" as it is developed, rather than most of the testing coming at the end of development.
I've worked in situations where each development team includes quality-control (QC) members at a 3:1 ratio: for every three developers, there is at least one member supporting the testing activities. In a strong, experienced, and mature team, there is no demarcation between dev and test; it's just a matter of putting on a different hat, and the same member may act as a developer initially and a tester later on (or vice versa). Once the team composition is set, there is also a need to ensure enough environment/infrastructure availability for defining the build pipeline.
In the early days, before "DevOps" became a buzzword, we used tools like CruiseControl.Net to run continuous integration (CI) and continuous delivery (CD). This not only facilitated good engineering practices but also ensured that testing members received a successfully "smoke tested" build to test, even if it was made available on a mirrored development environment (i.e., not necessarily a test environment). The testing flowed like this:
Now, picture the above flow repeating itself every sprint (i.e., every one to four weeks). It ensures that quality comes first at any cost and prevents a lot of cost overruns in the longer term. But please note: the diagram above represents just a generic flow, not the actual tests being carried out. For example, during "In Test" an integration test may be running, while in a staging environment you may be executing the regression suites, or something different, based on your given context.
Customer Perspective and Beyond
One crucial aspect of testing with an agile mindset is the need to validate each piece of "new functionality" not only from the customer perspective but a level deeper, and to do so during each of the individual cycles illustrated in the testing flow above.
How? Imagine you, as an end user, have asked for new functionality in your webmail account that lets you search across thousands of archived emails. If you get this new functionality and find it working per your expectations, you will be delighted.
That's natural! But while testing, we take it a step further: even before we test the new feature, we need to ensure the core functionalities still work as defined. Am I able to receive, read, and send emails? Only if those are fine do we go further and test the "new" additions. Typically, such "must have" functionalities are encapsulated in a "Smoke Test" and ideally automated, to ensure we have the basics right before we explore the "should have" or "could have" aspects.
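The webmail example can be sketched as a tiny automated smoke test: verify the "must have" basics (send, receive, read) before exercising the new search feature. `MailClient` here is a toy loopback stand-in, not a real mail API.

```python
# Toy mail client: send() loops messages straight into the inbox so the
# smoke test can run self-contained, with no real mail server involved.

class MailClient:
    def __init__(self):
        self.inbox = []

    def send(self, to, subject, body):
        self.inbox.append({"to": to, "subject": subject, "body": body})
        return True

    def read(self, index):
        return self.inbox[index]

    def search(self, term):
        # The "new" functionality under test.
        return [m for m in self.inbox
                if term in m["subject"] or term in m["body"]]

def smoke_test(client):
    """Core checks that must pass before any new feature is tested."""
    assert client.send("qa@example.com", "ping", "hello")  # can send
    assert len(client.inbox) == 1                          # was received
    assert client.read(0)["subject"] == "ping"             # can read
    return True

client = MailClient()
assert smoke_test(client)     # basics first: the "must have" gate
hits = client.search("ping")  # only then exercise the new feature
print(len(hits))              # 1
```

If the smoke test fails, there is no point running the search tests; that ordering is exactly the "basics first" rule described above.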
The testing approach also needs to take regression into consideration, sure! But more importantly: how and when should regression be handled before the actual release? Too early, and it adds too much overhead.
Too late, and we end up with a huge technical and/or defect debt. It therefore requires continuous monitoring, and testing can no longer stay a phase; rather, it blends with development, and "continuous testing" becomes the mantra! This is the only way to ensure continuous progress and eventual success.
As is, testing is no easy job! To make things even more complex, add the mix of a multi-team, cross-location situation, where new requirements are implemented almost every day and K-LOCs are checked in regularly! This situation demands both ad hoc and regression testing, and even a ten-to-twelve-hour day often seems insufficient, which may eventually lead to churn among team members. How do we handle this?
- One simple solution could be to increase the headcount: add more resources to manage the testing effort. Under T&M (time and materials) projects, this might be an interesting proposition. But obviously, the client won't be fond of such an option, and it's not a worthy solution for the vendor either in a fixed-price project.
- A better alternative would be to simplify things by streamlining a few processes so that life becomes relatively simpler for testers. A few important considerations:
- Involve testing & testers at the very beginning of requirement finalization, so that testing members get the maximum possible visibility of the requirements.
- Introduce distributed accountability from a quality perspective, i.e., introduce an in-development test lead, a test-case writing lead, a story owner, and a business analyst (who works in sync with the story owner to define acceptance criteria).
- As the testing team starts working on test-case preparation, involve the customer for review. This helps to ensure the completeness of test cases, and additional review also prevents redundant cases and steps.
- Enforce standard checklist-based acceptance criteria. This could also act as the starting point for the testing team.
- Standardize all nonfunctional quality criteria (e.g., usability, performance, memory usage, security) across the application and get them documented for easy reference. Communicate them to Product Owners so that the criteria are actually referenced in each applicable user story.
- Well-defined dependencies should be marked on each user story so that the corresponding test lead can take due measures at testing (or while defining the testing strategy).
- Use lightweight documentation styles/tools (e.g. simply use the "Description" tab of JIRA or TFS work item to define the acceptance criteria, rather than attaching multiple documents) — this not only helps in saving time but more importantly — helps in easy collaboration and document management.
- Capture test scenarios as part of the required item for exploratory testing (one could create a product backlog item, link individual stories, then identify testing scenarios for each).
- Consider having a testing-specific "retrospective" periodically, in which testers across teams meet to align test activities, share new ideas, and exchange best practices (one major advantage of having all test teams in one room is that communication between the testers tends to sizzle).
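The checklist-based acceptance criteria suggested above can be enforced mechanically: represent the standard checklist as data and verify that every user story addresses each applicable criterion. The criterion and field names below are hypothetical, chosen only for illustration.

```python
# A standard quality checklist applied to every user story. In practice
# this list would come from the team's documented nonfunctional criteria.
STANDARD_CRITERIA = ["functional", "usability", "performance", "security"]

def missing_criteria(story):
    """Return the standard criteria a story has not yet addressed."""
    covered = set(story.get("criteria", []))
    return [c for c in STANDARD_CRITERIA if c not in covered]

# Example story, shaped like a lightweight work-item record.
story = {
    "id": "US-101",
    "title": "Search archived emails",
    "criteria": ["functional", "performance"],
}

print(missing_criteria(story))  # ['usability', 'security']
```

A check like this can run as part of story grooming or a pipeline step, so gaps in acceptance criteria surface before testing begins rather than during it.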
When to Handle "Regression Testing"?
Generally, under an Agile testing approach, each new piece of functionality is continuously tested as the sprint progresses. Typically, towards the end of the sprint, a small window is kept for a short regression test before moving to the next sprint. Agile teams often implement a BVT (build verification test) routine, in which a standard set of verification steps cutting across the application is performed to ensure application stability and core functionality. Ideally, this routine is automated and integrated into the CI pipeline to make the release process even more stringent.
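A BVT routine wired into CI can be as simple as the sketch below: run a set of cross-cutting checks and fail the pipeline stage if any of them fails. The check names are illustrative; real checks would drive the application's actual entry points.

```python
# Hedged sketch of an automated BVT gate a CI pipeline could call after
# each build. Each check is a callable returning True/False.
import sys

BVT_CHECKS = {
    "login works": lambda: True,
    "home page renders": lambda: True,
    "search returns results": lambda: True,
}

def run_bvt(checks):
    """Run every check; return the list of failures (empty = stable build)."""
    return [name for name, check in checks.items() if not check()]

failures = run_bvt(BVT_CHECKS)
if failures:
    print("BVT FAILED:", ", ".join(failures))
    sys.exit(1)  # a non-zero exit code fails the CI stage
print("BVT passed: build is promotable")
```

Note that, unlike a full regression suite, every check runs even after a failure, so the report lists everything broken in the build rather than just the first problem.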
As we stated, agile testing is nothing out of the ordinary. But how we approach testing with an agile mind changes the game altogether. With increasing automation and the advancement of RPA, we are looking at not just automation, but automating automation! Those are all nice and glittering phrases, but the ground rule is: we test it, we modify it, we test it again, and the loop repeats. We can't test enough, and that is how we ensure quality is delivered to our customers.
Opinions expressed by DZone contributors are their own.