Automated Testing Considerations
The keys to using automated testing to improve speed, quality, and security vary widely among practitioners.
To understand the current and future state of automated testing, we spoke to 14 IT professionals intimately familiar with automated testing. We asked them, "What are the keys to using automated testing to improve speed, quality, and security?"
We received a wide variety of answers from our respondents, with little to no agreement. Having conducted a number of these interviews, we find that this lack of coalescence around a few items reflects a lack of best practices and standards, which, in turn, hinders the widespread adoption of automated testing for code and applications.
Here's what the respondents told us:
- Pursue a model-based approach to testing and use AI to auto-generate tests for the areas that really matter. Reduce the amount of time needed to create a script; most companies are still writing scripts manually. To do in-sprint testing, you need a model-based approach to auto-generate tests, get impact analysis and self-healing, and focus on what really matters. Use automation to pursue a risk-based testing strategy. Use neural networks to look at the tests you run, their key attributes, and what correlates with failure in order to identify risk vectors. It’s necessary to accelerate test development, deployment, and analysis. Identify the level of confidence you want to achieve, and use algorithms to auto-generate tests.
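The risk-based strategy this respondent describes can be sketched in a few lines. This is a minimal illustration under stated assumptions, not a production approach: the scoring heuristic, the field names, and the `touches_changed_code` impact flag are all invented for the example, standing in for the neural-network analysis mentioned above.

```python
# Hypothetical sketch of risk-based test selection from historical data.
from dataclasses import dataclass

@dataclass
class TestRecord:
    name: str
    runs: int
    failures: int
    touches_changed_code: bool  # invented stand-in for impact analysis

def risk_score(t: TestRecord) -> float:
    """Weight historical failure rate, boosted when the test covers
    code changed in the current sprint. Never-run tests score as risky."""
    failure_rate = t.failures / t.runs if t.runs else 1.0
    return failure_rate * (2.0 if t.touches_changed_code else 1.0)

def select_tests(history: list[TestRecord], budget: int) -> list[str]:
    """Pick the highest-risk tests that fit the execution budget."""
    ranked = sorted(history, key=risk_score, reverse=True)
    return [t.name for t in ranked[:budget]]

history = [
    TestRecord("test_login", 100, 2, False),
    TestRecord("test_checkout", 100, 15, True),
    TestRecord("test_search", 50, 1, False),
    TestRecord("test_new_feature", 0, 0, True),
]
print(select_tests(history, 2))
```

A real system would learn the scoring function from test attributes rather than hard-coding it; the point is only that prioritization turns "run everything" into "run what correlates with failure."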
- The financial sector has been working on security for a long time, and other sectors are catching up. Immature programs are building from the ground up, with a lot of reliance on third-party tooling from open-source and commercial providers, and are trained more around penetration and manual testing. We recommend building more of your own test suites and harnesses. Use a positive validation approach: understand the security validation policy and build tests against it. Write code, understand the application architecture, and work with development teams to write tests in the vein of DevOps, more collaboratively, with security invested in the overall product. Build a bridge with the development team; take more ownership and share responsibility with security.
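A minimal sketch of the "positive validation" idea: assert that only explicitly allowed input passes, rather than trying to enumerate known-bad input. The username policy and regular expression below are assumptions made up for illustration.

```python
# Hypothetical home-grown security test using positive validation:
# define what is allowed, and reject everything else by default.
import re

# Assumed policy: lowercase letter, then 2-15 lowercase letters,
# digits, or underscores.
USERNAME_RE = re.compile(r"^[a-z][a-z0-9_]{2,15}$")

def is_valid_username(name: str) -> bool:
    return bool(USERNAME_RE.fullmatch(name))

# Allowed inputs must pass...
for ok in ("alice", "bob_99"):
    assert is_valid_username(ok), ok
# ...and everything outside the policy is rejected, which covers
# injection attempts without needing a blocklist for each one.
for bad in ("", "a", "Robert'); DROP TABLE users;--", "<script>x</script>"):
    assert not is_valid_username(bad), bad
print("positive-validation checks passed")
```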
- The key to using automated testing to improve speed and quality lies in customizing solutions to industry sectors. When it comes to automated testing, there is no blanket fix. Companies must leave behind the “one size fits all” mentality and begin at the core of their industry needs to succeed. Know what needs to be tested, why, and how often before implementing automated testing – for example, a solution used for the travel industry should not be applied the same way to an insurance company. Quality results stem from automating the right processes in the right way.
- Move away from classic UI automation and toward API unit-level automation. UI is way too slow for execution and building. Agile needs to test what’s being built within the sprint, so testing needs to be automated at the API level. Test within the DevOps process – automated and fast. Automate at the right level. Use automation frameworks that are efficient and easy to maintain. Also, have the right data strategy with the right people and the right skills.
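As a rough illustration of testing at the API level instead of through the UI, the sketch below exercises an invented `create_order` handler directly. In a real suite this would be an HTTP call to the application's API; the handler, fields, and status codes here are all assumptions.

```python
# Hypothetical API-level test: exercise business logic directly,
# bypassing the slow UI layer entirely.

def create_order(items: list[dict]) -> dict:
    """Stand-in for a real API handler (e.g. POST /orders)."""
    if not items:
        return {"status": 400, "error": "empty order"}
    total = sum(i["price"] * i["qty"] for i in items)
    return {"status": 201, "total": round(total, 2)}

def test_create_order_totals_line_items():
    resp = create_order([{"price": 9.99, "qty": 2}, {"price": 5.00, "qty": 1}])
    assert resp["status"] == 201
    assert resp["total"] == 24.98

def test_create_order_rejects_empty_cart():
    assert create_order([])["status"] == 400

test_create_order_totals_line_items()
test_create_order_rejects_empty_cart()
print("API-level tests passed")
```

Tests like these run in milliseconds and can execute inside the sprint's CI pipeline, which is the speed argument the respondent is making.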
- It depends on the maturity of the company’s engineering and SDLC processes. Sometimes we consult with clients who have no repeatable manual tests; we begin by shoring up manual testing and its dependencies. Companies with consistent, repeatable practices start with good tests. Do things right in a way that will scale: review the manual test case suite, look at existing test cases and coverage, and refactor them in a way that can be automated. Decouple the elements of the test suite, model page objects properly, and try not to entangle things together. There are two parts to speed: 1) release velocity; and 2) how fast tests can run. There is also a maintenance component: think about how to update test cases and stay up to speed. Decouple components, and make tests small and atomic. On the quality side, you need to quantify quality build over build and release over release. Collect metrics on test cases: performance, speed, and failures. Measure tests by collecting analytics and storing the data so you can analyze it, build insights, and combine it with other data the business is collecting. Look at turning testing into a pseudo-revenue generator. Balance the breadth and depth of test coverage.
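The metrics-collection idea above can be sketched as follows. The run records and field names are invented for the example; a real pipeline would pull them from a test-results store rather than an in-memory list.

```python
# Hypothetical sketch: aggregate test-run data per build so quality
# can be quantified build over build.
from statistics import mean

runs = [
    {"build": "1.4.0", "test": "test_login",    "passed": True,  "secs": 2.1},
    {"build": "1.4.0", "test": "test_checkout", "passed": False, "secs": 8.4},
    {"build": "1.4.1", "test": "test_login",    "passed": True,  "secs": 2.0},
    {"build": "1.4.1", "test": "test_checkout", "passed": True,  "secs": 7.9},
]

def build_metrics(runs: list[dict]) -> dict:
    """Compute pass rate and mean duration for each build."""
    out: dict = {}
    for build in {r["build"] for r in runs}:
        rs = [r for r in runs if r["build"] == build]
        out[build] = {
            "pass_rate": sum(r["passed"] for r in rs) / len(rs),
            "mean_secs": round(mean(r["secs"] for r in rs), 2),
        }
    return out

print(build_metrics(runs))
```

Trending these numbers across releases is what lets a team say quality improved (or regressed) with data rather than anecdotes.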
- People are looking for improved QA in general to achieve a good balance between speed and quality. Identify the best strategy to deploy fast with quality. While automation makes sense, you have to be careful and question whether or not everything should be automated. More people are realizing automation is the way to go. However, things fall apart if people go into automation without a strategy behind it. You need to consider what to automate and what not to automate. A second problem is that there is no measurement of what automation provides and how it helps the business. Many leaders don’t have a way to measure and report the benefits. You need to quantify the amount of time a developer spends on automation to determine if the value is being realized. Evaluate what should be automated and what is suited for alternatives like third-party solutions.
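One way to make the respondent's measurement point concrete is a simple back-of-the-envelope value calculation. Everything below is an example input, not a benchmark; the formula is a deliberate simplification.

```python
# Hypothetical sketch: net hours saved by an automated suite, comparing
# build and maintenance cost against manual effort avoided.

def automation_roi(build_hours: float, maint_hours_per_month: float,
                   manual_hours_per_run: float, runs_per_month: int,
                   months: int) -> float:
    """Hours saved over `months`, net of build and maintenance cost."""
    saved = manual_hours_per_run * runs_per_month * months
    cost = build_hours + maint_hours_per_month * months
    return saved - cost

# Example inputs: a suite that took 80h to build, 5h/month to maintain,
# replacing a 4h manual pass run 20 times a month, over 6 months.
print(automation_roi(80.0, 5.0, 4.0, 20, 6))
```

Even a crude model like this answers the question leaders reportedly cannot: whether the time developers sink into automation is being paid back.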
- As software development teams are increasing the rate at which they deliver software, test automation assists them in delivering with speed and quality. New features are being built and are being tested by developers and testers, but that testing is typically focused on the new feature itself – not the newly changed product as a whole. Automated tests hold the team’s expectations of the desired behavior of their application. Being able to execute these tests enables the team to move faster with greater confidence.
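As a small illustration of tests that "hold the team's expectations": the pricing functions below are invented for the example, but the assertions show how existing expectations keep guarding the product as a whole when one piece is later changed for a new feature.

```python
# Hypothetical sketch: automated tests encode expected behavior, so a
# refactor of one function is checked against the whole product's
# expectations, not just the new feature.

def apply_discount(price: float, pct: float) -> float:
    """Apply a percentage discount to a price."""
    return round(price * (1 - pct / 100), 2)

def final_total(price: float, pct: float, tax: float = 0.08) -> float:
    """Discounted price plus tax; depends on apply_discount."""
    return round(apply_discount(price, pct) * (1 + tax), 2)

# These expectations keep passing (or fail loudly) even when
# apply_discount is rewritten for some future feature.
assert apply_discount(100.0, 10) == 90.0
assert final_total(100.0, 10) == 97.2
print("regression expectations hold")
```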
Here’s who shared their insights:
- Drew Horn, Senior Director of Automation, Applause
- Angie Jones, Senior Developer Advocate, Applitools
- Isa Vilacides, Director of Engineering, CloudBees
- Himanshu Dwivedi, CEO, Data Theorem
- Antony Edwards, COO, Eggplant
- Kevin Fealey, Senior Manager Application and Product Security, EY
- Hans Buwalda, CTO, LogiGear
- Malcolm Isaacs, Senior Solutions Manager, Micro Focus
- Madan Mohan, Global Head of Travel and Transportation, NIIT Technologies
- Jared Go, CEO, OhmniLabs
- Derek Choy, CIO, Rainforest QA
- Nancy Kastl, Executive Director of Testing Services, SPR
- Rishikesh Palve, Integration Product Manager, TIBCO
- Ray Wu, CEO, Wynd
Opinions expressed by DZone contributors are their own.