Want Efficient Automation? Get Over the Over-Checking!
Test automation is a key part of most development processes, but is there such a thing as over-checking? Read on to find out more from a different perspective.
The following article is a guest post to Zephyr from LogiGear. LogiGear is a partner of Zephyr that provides leading-edge software testing technologies and expertise, along with software development services that enable our customers to accelerate business growth while having confidence in the software they deliver.
Efficient test automation is often seen as a technical challenge. Organizations believe that selecting a tool is the most important step in making automation successful. Some understand that adding an experienced automation engineer is also a good idea. In some cases, keywords might be used in the hope that they will make test automation achievable for non-technical people.
These factors help, but in my experience, they are not the key elements for automation success. I don't consider successful automation a technical challenge; it is a test design challenge. Tests need to be well organized and have a clear scope, and their description using keywords should be at the right level of abstraction, with unneeded details hidden in those keywords.
One aspect of this way of reasoning is the checks. Verifying the outcomes of tests is obviously a key reason to have tests in the first place. The tradeoff is that checks can also be a major source of maintenance sensitivity: tests can be unnecessarily impacted by changes in the application under test. Avoiding unnecessary checks is a good idea, even though it is not always readily accepted by everyone.
Apart from being harmful to automation, unneeded checks can also spoil metrics coming from a test execution. Let's say a test has 100 checks, of which only five fit the scope of the test. If those five fail but the other 95 still pass, the result looks like 95% success, while in reality, the success is 0%.
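To see how skewed the numbers get, here is a small illustrative Python sketch using the hypothetical figures from the example above:

```python
# Hypothetical results matching the example: the 5 in-scope checks fail,
# while 95 incidental (out-of-scope) checks pass.
in_scope_results = [False] * 5     # the checks that define the test's purpose
incidental_results = [True] * 95   # checks that merely came along for the ride

all_results = in_scope_results + incidental_results
print(f"reported pass rate: {sum(all_results) / len(all_results):.0%}")            # 95%
print(f"in-scope pass rate: {sum(in_scope_results) / len(in_scope_results):.0%}")  # 0%
```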
Two typical sources of over-checking are "Expected Result" columns in test management tools and on-the-fly checking.
"Expected Result" Columns
Many tools, including even our own TestArchitect, have a mechanism to define tests as a sequence of test steps. The test developer puts the steps in a table, which includes a column for the expected results of the steps. A diligent test developer will typically populate those cells ("why not?"), and the automation engineer will have no choice but to implement them as checks. A majority of these checks, however, may not be needed within the scope of the test, and they become a form of over-checking. Use steps sparingly and only specify checks that are really needed.
We prefer to specify tests as sequences of keyworded lines that we call "actions." In this approach, a check gets its own line, with its own keyword as the action name. This way, checks are more clearly visible as the result of design decisions that follow the scope of the test.
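To make this concrete, here is a minimal sketch in Python of what such an action-based test might look like. The action names, arguments, and values are invented for illustration (in TestArchitect the actions would live in test modules rather than code); the point is simply that the single check is a visible, deliberate line of the test:

```python
# Minimal keyword/action-driven sketch. Each row of the test is one action line,
# and the check is its own explicit action rather than a hidden side effect.
# All action names and values here are illustrative only.

ACTIONS = {}

def action(name):
    """Register a handler function for a keyword (the action name)."""
    def register(func):
        ACTIONS[name] = func
        return func
    return register

@action("enter order")
def enter_order(product, quantity):
    # Navigation/setup only: no checks hidden in here.
    print(f"entering order: {quantity} x {product}")

@action("check order total")
def check_order_total(expected_total):
    actual_total = "30.00"  # a real implementation would read this from the application
    assert actual_total == expected_total, f"expected {expected_total}, got {actual_total}"

# The test itself: a readable sequence of action lines.
# The single check reflects a design decision that matches the scope of the test.
test_order_total = [
    ("enter order", {"product": "widget", "quantity": "3"}),
    ("check order total", {"expected_total": "30.00"}),
]

def run(test):
    for name, arguments in test:
        ACTIONS[name](**arguments)

if __name__ == "__main__":
    run(test_order_total)
```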
On-the-Fly Checking
"On-the-fly" checking is the habit of adding detailed checks as part of the navigation of higher level tests, something like, "since we're here, let's do a check." For example, the implementation of a "login" action might include a check whether the main window is visible after the login. This sounds like a meaningful check, but in my view, it isn't. Whether the login works should be the target of a separate test, and should be executed separately. Additionally, checking if a login is successful as part of another test can have a negative impact by exposing a test to more impacts of application changes than is needed. Test automation is very much an exercise in the art of minimization; the "less is more" approach is key.
The essence, in my view, is that each check in a test should carry a burden of proof: does the check fit the scope and objectives of the test? If not, the check needs to be reconsidered. In other words, get over the over-checking.
Published at DZone with permission of Francis Adanza.