What Are The Most Common Issues Affecting Automated Testing?
Brittleness of the automation itself, people, and immature processes are some of the biggest issues for automated testing adoption and success.
To gather insights on the state of automated testing today, we spoke with twenty executives who are familiar with automated testing and asked, "What are the most common issues you see affecting automated testing?"
Three themes came through in the discussion: 1) the brittleness of automation; 2) people; and 3) processes.
Brittleness
- Brittleness of automation. CD release cycles are short. If there is a problem with the automation script, there will be a failure, and you will not know whether the failure is due to the site or the script. That puts the release cycle on hold and backs up future releases.
- Except for truly manual work, we could likely automate just about anything. Balancing the ROI of those decisions is always a challenge. We also need to continuously harden the automation – just like you would find defects in a product, you also find defects in the automation used to test that product. Being able to rapidly update automation after it has been added to the software lifecycle can be challenging since the defects in the automation are not known at the time of implementation. Putting it simply, we have script bugs to contend with that show up as flaky or intermittent failures, which can be time-consuming to narrow down.
- Keeping automated tests up to date. If the product is dynamic/Agile, keeping tests current takes sustained effort. The initial investment can be expensive, and you need to build on a platform that suits the needs of the company. If the framework is not good, automation becomes a nightmare to support and manage.
- Continuous change during development, and the maintenance burden it places on your automation, can limit your ability to fully automate tests.
- The real challenge with test automation is not setting up the automation; it is setting the tests up in a fashion that lets you cope with the maintenance challenges that come with automation. There’s a phenomenon called the Maintenance Trap, and it is the central challenge of test automation. It can only be addressed comprehensively by taking care of automation, test data management, and service virtualization to provide a stable infrastructure. The automation needs to minimize the maintenance effort required, which you achieve by avoiding redundancy in both the technical and business layers of automation. In short, recurring procedures should be defined only once in your test portfolio but linked easily to test cases (see the sketch after this list).
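To make the "define once, link to many test cases" idea concrete, here is a minimal sketch using pytest and Selenium; the URL, element IDs, and credentials are hypothetical, and your own framework may structure the shared procedure differently. The login procedure lives in one fixture, so a UI change is fixed in one place rather than in every test that needs a logged-in session.

```python
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def logged_in_driver():
    """Recurring procedure defined once: start a browser and log in."""
    driver = webdriver.Chrome()
    driver.get("https://example.test/login")  # hypothetical URL
    driver.find_element(By.ID, "username").send_keys("qa_user")  # hypothetical IDs
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "submit").click()
    yield driver
    driver.quit()


def test_dashboard_loads(logged_in_driver):
    assert "Dashboard" in logged_in_driver.title


def test_profile_page_loads(logged_in_driver):
    logged_in_driver.get("https://example.test/profile")  # hypothetical URL
    assert "Profile" in logged_in_driver.title
```

Both tests link to the same procedure, so the maintenance cost of a login change stays constant no matter how many test cases reuse it.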
People
- People not believing manual tests can be automated. Poorly written tests are always a challenge: they are timing-sensitive and environment-sensitive, and a poorly written test can be worse than no test at all (see the sketch after this list). Teams also fail to pay attention to test performance. Code bases with little unit testing and heavy manual testing are troublesome, and their automated tests are slow to run compared to unit tests.
- People are the number one roadblock; many are not willing to change. Get the team excited about the outcome and let them see how quickly their code gets to production.
- Clients without the vision, knowledge, or resources to realize what can be done to engage and retain customers. We’ll train customers or provide managed services.
- XAS or Angular frameworks don’t have a good product for test automation, so you test at the JavaScript level to get locators. Due to the lack of automated testing, a lot of companies rely on manual testing – the switch is a huge mindset change. The web is a completely different beast. Culture is the issue.
- Under-investment in automation infrastructure, and in the resources needed to work down automation backlogs while staying on top of incoming automation demand.
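As an illustration of the timing-sensitive tests mentioned above, here is a hedged sketch of replacing a fixed sleep with an explicit wait in Selenium; the URL and element ID are hypothetical, and the ten-second timeout is an arbitrary choice.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://example.test/orders")  # hypothetical URL

# Brittle version: time.sleep(2) assumes the page always renders within two
# seconds, which breaks on a slow build agent and wastes time on a fast one.

# More robust: wait up to 10 seconds for the element, polling until it appears.
results = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.ID, "results"))  # hypothetical element ID
)
print(results.text)
driver.quit()
```

The explicit wait makes the test fail for the right reason (the element never appeared) instead of failing on an accident of timing.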
Processes
- It depends on where a team is on the maturity spectrum. New companies just need to get started with testing, gain confidence in the results, and establish a repeatable process. More sophisticated teams are looking at how to make tests reusable so 10 automation engineers aren’t writing the same tests, and they are using open source solutions like Selenium to run at scale in parallel.
- Defining KPIs and pass/fail criteria for performance testing. If we do not have accurately defined KPIs to hit, then we will never write useful automated tests (see the sketch after this list). This extends into every level of automation, too.
- When we moved to Agile in 2012, we had a six-year backlog of things to test with automation. Greenfield work lets you think differently about problems and about how to test. You cannot get to Agile without repeatable processes. Cycle times are minutes for start-ups, versus weeks, months, or even years for monolithic systems.
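One way to make a performance KPI actionable is to encode it as an automated pass/fail check. The sketch below does this with pytest and the requests library; the endpoint, sample size, and 500 ms 95th-percentile threshold are illustrative assumptions, not values from the discussion.

```python
import statistics
import time

import requests


def p95_latency_ms(url: str, samples: int = 20) -> float:
    """Measure request latency over several samples and return the 95th percentile."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=5)
        timings.append((time.perf_counter() - start) * 1000)
    return statistics.quantiles(timings, n=100)[94]  # 95th-percentile cut point


def test_search_endpoint_meets_latency_kpi():
    # The build fails whenever the agreed KPI is exceeded.
    assert p95_latency_ms("https://example.test/api/search?q=shoes") < 500  # hypothetical URL
```

With the threshold written down as an assertion, "fast enough" stops being a matter of opinion at release time.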
Other
- Technical issues: managing test data, multiple issues with storage space, compliance, and exposure of data (GDPR) – how to manage, secure, and provision data at the right time and in the right location.
- External dependencies. An exact test can scale outside the virtual environment. Know what to test and what to exclude – identify and document the process so you avoid running into unknowns in the final test. There is always a way to accomplish something – you just have to account for it.
- Lack of proper infrastructure.
- Non-deterministic tests – data isn’t controlled well enough to avoid flakiness (see the sketch after this list). Improper planning and scripting – not testing the most effective areas. Barriers to entry – lots of different tools requiring varying degrees of proficiency to use effectively.
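To show what "controlling the data" can look like in practice, here is a minimal sketch using pytest; the fixture names, seed value, and record shape are hypothetical. The idea is to seed any randomness and build exactly the records a test needs instead of relying on whatever a shared environment happens to contain.

```python
import random

import pytest


@pytest.fixture(autouse=True)
def fixed_seed():
    # Every run sees the same pseudo-random sequence, so "random" inputs are repeatable.
    random.seed(1234)


@pytest.fixture
def customer():
    # Create the exact record the test needs rather than querying shared, drifting data.
    return {"name": "Test Customer", "balance_cents": random.randint(0, 10_000)}


def test_balance_is_never_negative(customer):
    assert customer["balance_cents"] >= 0
```

Because both the seed and the fixture are under the test's control, a failure points at the product rather than at whatever the data happened to be that day.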
What are the most common issues you see affecting automated testing?
Here’s who we talked to:
- Murali Palanisamy, EVP and Chief Product Officer, AppViewX
- Yann Guernion, Director of Product Marketing, Automic
- Eric Montagne, Technology PM, Barclaycard
- Greg Luciano, Director of Services and Amit Pal, QA Manager, Built.io
- Donovan Greeff, Head of QA, Currencycloud
- Shahin Pirooz, CTO, DataEndure
- Luke Gordon, Senior Solutions Engineer and Daniel Slatton, QA Manager, Dialexa
- Anders Wallgren, CTO, ElectricCloud
- Charles Kendrick, CTO, Isomorphic
- Bryan Walsh, Principal Engineer, NetApp
- Derek Choy, V.P. of Engineering, Rainforest QA
- Subu Baskaran, Senior Product Manager, Sencha
- Ryan Lloyd, V.P. Products, Testing and Development and Greg Lord, Director of Product Marketing, SmartBear
- Christopher Dean, CEO, Swrve
- Wolfgang Platz, Founder and Chief Product Officer, Tricentis
- Pete Chestna, Director of Developer Engagement, Veracode
- Harry Smith, Technology Evangelist, Zerto