
Isn't it Time That Testing Makes a Shift Right?


With the advent of consumer applications, it was quickly recognized that traditional development practices would not work.


With the advent of consumer applications, it was quickly recognized that traditional development practices would not work. The linear stages of the Waterfall model could not keep up with changing user needs, so that by the time software had fallen over the waterfall, it was either far removed from what the user actually wanted, or another company had got there first.

That’s assuming the user needs had been accurately captured in the first place, which was not common. Users frequently had to settle for second-best software which did not accurately reflect their needs. Today’s experience-driven consumers, by contrast, will simply look for a better alternative offered by another vendor.

Redefining the User Experience

Interest in “Agile” methodologies has continued to rise, in part due to the ability to engage the user earlier and more frequently while responding more quickly to change. Any misunderstanding of the users’ desired functionality can then be identified earlier, and teams can work iteratively to deliver software which accurately reflects changing user needs.

In addition to more regular opportunities for user feedback, the applications themselves are producing more data than ever before, which can be analyzed to identify opportunities for improvement in the software. Meanwhile, DevOps is bringing tools across the entire development lifecycle into closer alignment, so that these large amounts of data can more easily be collated and analyzed.

As a result, analytics has become an increasingly popular and effective aid to decision-making in software development. This topic is tackled in the latest of a series of articles written by testing guru Paul Gerrard.

Testing and Analytics

As Paul notes, analytics does not affect the fundamental goal of testing, which still gathers and analyzes information to inform decision-making. A difference in the discipline which Paul calls “Test Analytics”, however, is that it analyzes the entire lifecycle, and so extends beyond pre-production testing.

Test Analytics places an emphasis on application monitoring, where not only the data produced by pre-production testing is analyzed, but also the data produced within production and operations. The results are then fed back to support testing’s “core product”: the information passed on to stakeholders.

“Shift left, shift right”

To extend beyond pre-production testing, testing needs to “shift right” as well as left, leveraging the insights of production and operations and reflecting these back in test cases and the system requirements.

One way to do this is to leverage the large amount of data which users input into a live system and the large amount of data which comes out. This data can be analyzed, for example using association rule learning, to reverse-engineer how users are using a system in practice.
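As a minimal sketch of what such association rule learning might look like, the following counts which features co-occur within user sessions and surfaces rules that meet support and confidence thresholds. The session data, feature names, and thresholds here are all invented for illustration; a real pipeline would mine production logs at far greater scale.

```python
from itertools import combinations
from collections import Counter

# Hypothetical session logs: each entry is the set of features a user
# exercised during one session of the live system.
sessions = [
    {"login", "search", "checkout"},
    {"login", "search", "wishlist"},
    {"login", "search", "checkout"},
    {"login", "browse"},
]

MIN_SUPPORT = 0.5      # rule must hold in at least half of all sessions
MIN_CONFIDENCE = 0.8   # estimated P(consequent | antecedent)

item_counts = Counter()
pair_counts = Counter()
for s in sessions:
    for item in s:
        item_counts[item] += 1
    for a, b in combinations(sorted(s), 2):
        pair_counts[(a, b)] += 1

n = len(sessions)
rules = []
for (a, b), count in pair_counts.items():
    support = count / n
    if support < MIN_SUPPORT:
        continue
    # Consider rules in both directions: a -> b and b -> a.
    for antecedent, consequent in ((a, b), (b, a)):
        confidence = count / item_counts[antecedent]
        if confidence >= MIN_CONFIDENCE:
            rules.append((antecedent, consequent, support, confidence))

for antecedent, consequent, support, confidence in rules:
    print(f"{antecedent} -> {consequent}  "
          f"support={support:.2f} confidence={confidence:.2f}")
```

With this toy data, the mined rules (for example, "users who reach checkout always logged in and searched first") describe real traversal patterns that test scenarios can then mirror.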

One possible application of these insights in testing is knowledge elicitation. The insights from operations might reveal previously unknown or unknowable aspects of a system's logic, which can then be reflected in more accurate, comprehensive tests. If Model-Based Testing is used, these insights can be reflected iteratively in the model, generating an updated set of tests which more closely reflect the system under test.
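One way to picture this iterative loop: hold the system model as a small state graph, enumerate paths through it as abstract test cases, then add a transition that production analytics revealed and regenerate. The states and the discovered transition below are hypothetical, and real model-based testing tools use far richer models, but the regeneration step is the point.

```python
# Hypothetical state model of the system under test: each key is a state,
# each value the set of states reachable in one user action.
model = {
    "home": {"search", "login"},
    "login": {"home"},
    "search": {"results"},
    "results": set(),
}

def generate_paths(model, start, max_depth=4):
    """Enumerate simple paths from `start`; each path is one abstract test case."""
    paths = []

    def walk(state, path):
        successors = [s for s in sorted(model.get(state, ())) if s not in path]
        if not successors or len(path) >= max_depth:
            paths.append(tuple(path))
            return
        for nxt in successors:
            walk(nxt, path + [nxt])

    walk(start, [start])
    return paths

before = generate_paths(model, "home")

# Production analytics reveal a transition the model was missing:
# users proceed from results to a checkout state nobody had modeled.
model["results"] = {"checkout"}
model["checkout"] = set()

after = generate_paths(model, "home")
```

After the model update, the regenerated suite includes a path exercising the newly discovered checkout flow, without anyone hand-writing the extra test case.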

Another application is in Risk-Based Testing. Live systems data can be analyzed to identify the relative likelihood that certain areas of a system's logic will be exercised during operations. Tests can then be generated to satisfy empirically defined risk thresholds, maximizing the likelihood that defects will be detected and eradicated before the user finds them. This is especially valuable when testing is pressed for time and resources.
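A crude sketch of that allocation idea: weight a fixed test budget by observed production traffic per functional area, so heavily used areas get proportionally more coverage. The hit counts, area names, and budget are made up for illustration; the floor of one test per area is one possible policy, not a prescription.

```python
# Hypothetical production hit counts per functional area, e.g. from access logs.
usage = {"checkout": 9000, "search": 4500, "profile": 450, "admin": 50}

TEST_BUDGET = 40  # total test cases there is time to run

total = sum(usage.values())
allocation = {}
for area, hits in usage.items():
    # At least one test per area; otherwise proportional to observed traffic.
    allocation[area] = max(1, round(TEST_BUDGET * hits / total))

for area, tests in sorted(allocation.items(), key=lambda kv: -kv[1]):
    print(f"{area}: {tests} tests")
```

Here the rarely exercised admin area still gets a single smoke test, while checkout, which carries the bulk of real traffic and therefore the bulk of the risk, absorbs most of the budget.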

Analytics Informs Decisions and Validates Them

Analytics can be used to support more informed decision-making about the direction of software, and also to iteratively define tests which more accurately reflect the desired user experience. Application monitoring and analytics are therefore invaluable tools both for decision-making and for assuring that those decisions have been successfully implemented in development. They can enable organizations to stay ahead of the game, delivering applications which reflect constantly changing user needs.



Published at DZone with permission of
