It’s Not DevOps If You’re Still Writing Your Tests By Hand

Unless the root causes of testing bottlenecks are resolved, the ultimate goal of delivering quality software that reflects constantly changing user needs will not be achieved.

Since at least the early nineties, tools have existed to assist the management of projects and requirements, and the tests needed to successfully deliver them. More recently, these tools have focused on agile or lean production methods, and DevOps has in turn emerged to automate more of the delivery pipeline.

However, the essential approach to testing often remains the same within these frameworks, and the usual bottlenecks persist. Unless the root causes of these bottlenecks are resolved, the ultimate goal of delivering quality software that reflects constantly changing user needs will not be achieved.

“Garbage in, Garbage out”

The first challenge is knowing exactly how a system is expected to function and what needs to be tested.

Testers now face a constant barrage of user stories and change requests, stored across disparate file types. These are usually gathered unsystematically, so they tend to be incomplete, and testers have little chance of connecting the dots into a coherent picture of the system.

Requirements also tend to rely on ambiguous natural language, which is far removed from the logical steps of a system that need to be developed and tested. This creates a “garbage in, garbage out” scenario, where all the automated DevOps pipeline accelerates is the delivery of defects created by misunderstanding.

Testing Remains Too Slow, and Too Manual

Requirements are, furthermore, stored in static formats, from written user stories to diagrams and process models. This prevents a systematic or automated approach to deriving test cases and test plans, which instead must be created manually.

The first issue with this approach is quality. Even a simple system is likely to have thousands of possible paths through its logic, and manually created tests typically cover only a fraction of the system under test. In fact, our audits of manually created tests have found just 10–20% test coverage to be the norm, leaving the majority of a system exposed to defects.
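To make the scale of the problem concrete, here is a small sketch (with a hypothetical checkout flow, not taken from any audited system) of how quickly independent decision points multiply into paths that hand-written suites cannot realistically cover:

```python
from itertools import product

# A hypothetical checkout flow with a handful of independent yes/no
# decision points in the system's logic.
decisions = ["logged_in", "has_coupon", "in_stock", "express_shipping",
             "gift_wrap", "saved_card", "intl_address"]

# Every combination of branch outcomes is a distinct path to test.
paths = list(product([True, False], repeat=len(decisions)))
print(len(paths))  # 2**7 = 128 paths from just 7 binary decisions

# Hand-writing, say, 20 test cases still leaves most paths untested.
print(f"{20 / len(paths):.0%}")  # roughly 16% path coverage
```

Seven binary decisions already yield 128 paths; real systems have far more decision points, and many are not binary, which is how audited coverage ends up in the 10–20% range.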

The other major issue is time. Writing up test cases in a linear fashion is highly labour-intensive and repetitious: one team we worked with spent 6 hours creating 11 test cases that achieved just 16% coverage. Automating test execution won’t avoid these delays either, as it adds the effort of manually scripting the tests and does nothing to increase coverage.

Testing Cannot Keep Up With Change

Because test assets are manually translated from the requirements, they are not traceable back to them. Change then creates arguably the greatest bottleneck in a DevOps pipeline, as constantly changing user needs must be manually reflected in the test cases.

In practice, this usually involves checking and updating every existing test case. Sometimes new tests are simply piled on top of old ones in an effort to retain coverage, but this leads to rampant over-testing, while “burning” every test case and starting again is not possible given the rate of change.

Dependency analysis is a further issue when requirements exist as unconnected, static user stories: there is typically no reliable or automated way for testers to identify the impact of a change across interdependent components. This is especially problematic in systems carrying technical debt, and numerous high-profile outages caused by ‘routine’ updates reflect the real threat this poses.
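When component dependencies are captured explicitly rather than buried in static documents, impact analysis reduces to graph reachability. The following sketch (with invented component names) shows the idea: invert the dependency graph and walk it to find everything whose tests must re-run after a change:

```python
from collections import defaultdict, deque

# Hypothetical component dependency graph: each component maps to the
# components it depends on.
deps = {
    "checkout": ["pricing", "inventory"],
    "pricing": ["tax_rules"],
    "reporting": ["pricing"],
    "inventory": [],
    "tax_rules": [],
}

def impacted_by(changed, deps):
    """Return every component that transitively depends on `changed`,
    i.e. everything whose tests need re-running after the change."""
    # Invert the graph: dependency -> direct dependents.
    rdeps = defaultdict(list)
    for comp, targets in deps.items():
        for t in targets:
            rdeps[t].append(comp)
    # Breadth-first walk over the inverted graph.
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for dependent in rdeps[node]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

print(sorted(impacted_by("tax_rules", deps)))
# A 'routine' change to tax_rules ripples up to pricing, checkout
# and reporting
```

The point is not the algorithm, which is elementary, but that it only works once dependencies are modelled as data instead of prose scattered across user stories.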

Testing Within a State of the Art DevOps Framework

Traditional testing techniques are evidently not suited to DevOps, and testing needs to become more automated and more systematic if it is going to keep up with changing user needs.

Model-Based Testing offers one way to achieve this: it enables testers to automatically generate optimized tests directly from formal representations of the requirements. This avoids “information hops”, maximizes coverage, and allows the tests to be updated automatically as the requirements change.
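As a minimal illustration of the principle (using a hypothetical login flow, not any particular MBT tool), a requirement captured as a state model lets test cases be derived mechanically as paths through the model. Change the model, regenerate, and the suite stays in step with the requirements:

```python
# Hypothetical login flow expressed as a state model: each state maps
# to the states reachable from it.
model = {
    "start": ["enter_credentials"],
    "enter_credentials": ["valid", "invalid"],
    "valid": ["dashboard"],
    "invalid": ["retry", "locked_out"],
    "retry": ["enter_credentials"],
    "dashboard": [],
    "locked_out": [],
}

def generate_tests(model, state="start", path=None, max_revisits=1):
    """Yield each path through the model as a ready-made test case.
    Loops (e.g. retry) are bounded so generation terminates."""
    path = (path or []) + [state]
    if not model[state]:
        yield path  # terminal state reached: path is a complete test
        return
    for nxt in model[state]:
        if path.count(nxt) <= max_revisits:
            yield from generate_tests(model, nxt, path, max_revisits)

tests = list(generate_tests(model))
for t in tests:
    print(" -> ".join(t))
# Each printed line is one derived test case, e.g.
# start -> enter_credentials -> valid -> dashboard
```

Production MBT tools are far more sophisticated (optimizing for pairwise or risk-based coverage, exporting scripts for execution frameworks), but the core mechanism is the same: tests are a by-product of the model, not a hand-maintained artefact.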

To see this approach in action, please watch Using HP ALM within a state of the art DevOps framework. The webcast, featuring Philip Howard, took place on April 27, at 11 a.m. EDT/4 p.m. BST.

Published at DZone with permission of Tom Pryce, DZone MVB. See the original article here.
