The Future of Automated Testing
DZone's Tom Smith talked to 31 industry executives to learn about using AI/ML to improve the automated testing process on several fronts.
To gather insights for the current and future state of automated testing, we asked 31 executives from 27 companies, "What’s the future of automated testing from your point of view - where do the greatest opportunities lie?" Here's what they told us:
- 1) Distributed systems traceability as part of the testing cycle. How does the change in one component affect the behavior of other things? See this in cloud-native architecture. Quality of the code versus the quality of the network. Measurability of system behavior. 2) Systems that can leverage ML since there are more things to keep up with than a human can see. Compare to previous behavior and other known systems.
- Testing is well suited to AI/ML, which can build predictive models from patterns. An AI bot creates log-in tests and then works its way through the SDLC. Today it's CI/CD and streamlining; long-term, this will be very commonplace. Testing will be as natural as writing code, and it will be done by machines. Developers will work on architecture, user interface design and interactions, and writing code that works. The focus will be on creating experiences rather than outcomes. Software engineers today are focused on building a drill; in the future, they'll be focused on hanging the painting.
- Quality will become part of development. The only way to move faster is to build quality into how we develop, and the companies that have figured this out are already doing it. There is still a vacuum in how the process is managed, and tools and vendors are aging and exiting. We need analytical capability across everything that's happening, enabling teams to make quick, well-informed decisions. AI/ML are part of the solution as teams generate more data; companies are struggling to manage large amounts of it. IoT is another trajectory for automation: a new space with rudimentary DevOps processes and not enough standardization in technology to fully automate everything. This will transform over the next few years.
- The only areas we've automated so far are functional and unit testing. So much is still non-functional: performance testing, security testing, and accessibility testing, with more litigation driving compliance with accessibility regulations. Solutions are coming to market to address these non-functional areas. We're excited about conversational interfaces (NLP), but that is further out; testing there will be manual initially, then automated as they catch on.
- The greatest opportunity for automated security testing is machine learning. Security teams that are able to leverage their historical vulnerability data to train machine learning models will be able to automate the vulnerability verification process, thereby providing developers accurate vulnerability data in near real-time.
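The idea above can be sketched very roughly: score a new finding against historical triage decisions so confirmed-pattern findings bubble up for developers first. This is a minimal illustration, not a real vulnerability-verification model; the sample data, token scoring, and function names are all invented for the example.

```python
from collections import Counter

# Hypothetical historical triage data: (finding description, was it a true positive?)
HISTORY = [
    ("sql injection in login form", True),
    ("sql injection in search endpoint", True),
    ("xss reflected in debug page", False),
    ("hardcoded password in test fixture", False),
]

def train(history):
    """Count token frequencies separately for confirmed and dismissed findings."""
    true_counts, false_counts = Counter(), Counter()
    for text, verified in history:
        (true_counts if verified else false_counts).update(text.split())
    return true_counts, false_counts

def score(finding, model):
    """Naive smoothed likelihood ratio that a new finding is a true positive."""
    true_counts, false_counts = model
    tokens = finding.split()
    total = sum((true_counts[t] + 1) / (false_counts[t] + 1) for t in tokens)
    return total / max(len(tokens), 1)

model = train(HISTORY)
# A finding resembling past confirmed issues should outscore one resembling noise.
print(score("sql injection in admin form", model) > score("xss in debug page", model))
```

A production system would use real features (code paths, scanner rule IDs, data flow) and a proper classifier, but the shape — learn from past verification decisions, rank new findings — is the same.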
- Expand automation beyond test execution with AI/ML. Thousands of tests per day automatically generated and dynamic. Bringing users into testing with monitoring, GA, usability, Optimizely.
- Make testing no longer treated like a stepbrother once development is complete, but something embedded in every step of development and delivery so it's part of the overall process. Seamless interfaces in the form of SDKs, stacking from development to unit testing, to end-to-end testing, to deployment. This saves time and money while improving quality. Visual AI is key because it's a common language everyone understands and can relate to.
- Using AI to run every test imaginable in the least amount of time to ensure your code is always vulnerability-free.
- In the area of machine learning: going from a scenario where a data analyst updates the model to a model that updates itself, no coding skills required. Very specific, targeted machine learning instances; more focused ML functions to look at network scenarios. We're two years away. Rather than network validation, we'll be able to see how a system adapts to new patterns.
- Smart reporting, including smart analysis, has already begun by incorporating machine learning into the automation test results. So, instead of just conveying what went well and what did not, the report can analyze and then describe the hows and whys, based on previous occurrences. Maybe the bugs are only in the QA environment, for example, and not in production. This will continue to progress in the near future. Also, with machine learning, automated testing can determine which build is the best one to release in your production environment. We already see some companies using these methods by having their builds compared automatically using AI/machine learning - where there’s a scoring system set up, and the “winner,” i.e. the best build according to the automated tests, is the one deployed. We believe that this will become more mainstream moving into the future.
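The build-scoring idea described above can be sketched in a few lines: each candidate build gets a weighted score from its test results, and the highest-scoring build is the one promoted. The metrics, weights, and build names here are illustrative assumptions, not any vendor's actual scoring system.

```python
# Hypothetical per-build metrics gathered from automated test runs.
BUILDS = {
    "build-101": {"pass_rate": 0.98, "perf_regressions": 1, "new_failures": 0},
    "build-102": {"pass_rate": 0.95, "perf_regressions": 0, "new_failures": 2},
}

# Illustrative weights: reward passing tests, penalize regressions and new failures.
WEIGHTS = {"pass_rate": 100, "perf_regressions": -10, "new_failures": -25}

def score(metrics):
    """Weighted sum over the build's test metrics."""
    return sum(WEIGHTS[k] * v for k, v in metrics.items())

def best_build(builds):
    """The 'winner' — the build the pipeline would deploy."""
    return max(builds, key=lambda name: score(builds[name]))

print(best_build(BUILDS))
```

An ML-driven version would learn the weights from past releases (e.g., which builds later caused incidents) rather than hard-coding them.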
- Models get smarter with more data. The approach will be hybrid, with humans supplying the business context. As ML learns the nuances of businesses, applications will be able to anticipate and solve problems. We need to point the tools in the right direction.
- 1) Software testing matters more today than in the history of development. Software failure = business failure. 2) The notion of AI has invigorated the industry to be more creative in approach and using the data exhaust that is produced to make testing easier and more automated.
- Testing requires some members of the team to have competency with testing in their toolchain. Tests will become easier to use. Tools will use human heuristics to automatically test the application the way a person would. New design patterns for testing will emerge as ML evolves, yielding more efficient and effective tests at a lower cost. Low code reduces the barrier to entry.
- 1) Tools are consumable and integrate with the wider ecosystem. Avoid vendor lock-in. 2) Intelligent automation with creation and maintenance of tests. Self-healing tests. 3) Automation testing for all personas within the organization.
- 1) Continuous testing as a service. 2) With AI/ML more predominant, testing will be a fast mover in leveraging insights from test results, leading to self-healing and self-remediation.
- Self-healing. Automation can be fragile, breaking as changes are made to the code or the environment. Organizations do not fully realize the ROI from automated testing that their leaders expected because of the fragility of the tests. We will be able to leverage AI/ML to self-heal automated tests that used to work, without an engineer having to come in and make changes: look at tests that have run successfully and learn the pattern of success versus failure.
As companies go from monoliths to microservices, they are able to isolate parts of the system, so they can test each piece in the absence of everything else.
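One common self-healing pattern is locator fallback: when a UI change breaks a test's primary selector, the framework tries alternates learned from past successful runs and promotes the one that works. This is a toy sketch — the "DOM" is just a set of strings and the selectors are invented — but it shows the repair-and-remember loop.

```python
# Sketch of a self-healing element lookup. If the primary selector breaks
# after a UI change, fall back to alternates and promote the one that works.
def find_element(dom, selectors):
    """Try each known selector in order; 'heal' by promoting the working one."""
    for i, sel in enumerate(selectors):
        if sel in dom:                                  # stand-in for a real DOM query
            if i > 0:
                selectors.insert(0, selectors.pop(i))   # remember the healed locator
            return sel
    raise LookupError("all locators failed; human attention needed")

# The app was redesigned: the old "button#submit" id no longer exists.
dom = {"button#submit-v2", "input#email"}
locators = ["button#submit", "button#submit-v2", "//button[text()='Submit']"]

print(find_element(dom, locators))  # falls back to the second locator
print(locators[0])                  # and promotes it for future runs
```

Real implementations (in commercial tools) rank alternates using attributes, position, and visual similarity rather than a fixed list, but the failure-recovery structure is the same.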
- 1) More tools to test the entire stack. Tools encompass more than traditionally seen. 2) Starting to see new languages with great built-in testing frameworks. Expectation the language will have the tools you need.
- Get to the point where tooling can tell you what you need to run based on the change you just made. Architect for testability: a loosely coupled architecture helps with this, and being able to manipulate where output goes also helps. Have as few integration tests as possible, and cover the rest with unit tests.
- Cloud/grid-based testing solutions have a lot of potential. Their main weaknesses right now are: 1) Limited extensibility. Some organizations are doing a good job addressing obvious points of extensibility, like integration into CI. Right now, deeper extensibility, like capturing and storing custom diagnostics, is not really on the radar. Our own test suites have the ability to do things like make a request to the server to capture server logs, capture dumps of the last few network requests and responses, or capture the entire visual component tree. These are extremely important to us, so we can't yet move to cloud/grid solutions for most of our test suite. 2) Terrible tools for record/playback. Visual test recording makes a lot of sense, but tools like Selenium IDE are borderline worthless without adding a lot of extensions: the visual tests are extremely fragile, timing-dependent, and saved as horrifically bad code that no one would ever want to extend or modify. All of these problems can be solved, and we've solved them for our framework, but Selenium HQ and/or cloud/grid providers need to focus on making visual recording a feasible approach, because it would be extremely compelling to just go to a website and start defining an automated test suite by point-and-click navigation through your application.
- Automated testing combined with easier access to virtualization and cloud infrastructure allows IT organizations to gain much faster feedback about their system's quality than ever before. Execution will become much more important, and we are already seeing increased demand on test execution infrastructure. One of our biggest clients in the US, Symantec, is using hundreds of our runtime licenses for parallel test execution in the cloud. It may seem counter-intuitive, but the more convenient and consumer-friendly software products become, the more complex they are to test. Back in 2011, we were one of the first tools in the market supporting automated testing on mobile devices. Back then, and still today, one of the biggest challenges for automation tool providers is dealing with closed systems like Apple's. The next big thing is definitely how to test the world of IoT. We believe that automating integration and system-level tests in this field is only possible by mocking or simulating microservices in the backend.
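Mocking a backend microservice, as suggested above, can be as simple as standing up a throwaway HTTP server that returns canned responses so the system under test never needs the real device fleet. This is a minimal sketch using Python's standard library; the `/telemetry` endpoint and payload are invented for illustration.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Mock of a device-telemetry microservice, so an integration test can run
# without real IoT hardware. Endpoint and payload are hypothetical.
class MockTelemetry(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"device": "sensor-1", "temp_c": 21.5}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

# Port 0 lets the OS pick a free port; the test reads it back from the server.
server = HTTPServer(("127.0.0.1", 0), MockTelemetry)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/telemetry"
data = json.loads(urlopen(url).read())
print(data["device"])
server.shutdown()
```

Service-virtualization products add stateful behavior, latency injection, and protocol support beyond HTTP, but a canned-response mock like this already unblocks system-level tests.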
- It will become automated with input and output. No need to write and integrate code.
- I'd like to see more organizations practicing test-driven development. Invite the creativity of devs to automate anything that needs to be automated. See a lot of test coverage analysis. Business logic becomes self-contained. Test coverage 2.0. Immutable, software-defined infrastructure. What does test environment look like? What does deployment environment look like?
- Automated testing toolsets will see a lot of interaction with the developer world. Newer tools allow for automation beyond just writing code. There's an open door for testers and developers to interact more deeply, with APIs giving testers the ability to test code, including non-functional requirements, without writing code.
- A lot of open source tools are coming up, and more companies are adopting fully automated testing.
- I see a clear evolution of reducing the amount of end-to-end testing performed in these kinds of distributed systems in favor of monitoring.
- I think the greatest opportunity for automated testing comes down to helping people move to a world of continuous delivery. That is the SDLC, the software development technique, that companies need to be successful in building applications that meet changing user needs. The most interesting thing is the role feature management plays in the adoption of automated testing or continuous integration testing: continuous integration testing is great for starting to move faster, and any time you start to move faster, you need the right protections and infrastructure in place to make sure you're doing it safely. This is the safety net, or the seat belt, for your software development.
- Gil Sever, CEO and James Lamberti, Chief Marketing Officer, Applitools
- Shailesh Rao, COO, and Kalpesh Doshi, Senior Product Manager, BrowserStack
- Aruna Ravichandran, V.P. DevOps Products and Solutions Marketing, CA Technologies
- Pete Chestna, Director of Developer Engagement, CA Veracode
- Julian Dunn, Director of Product Marketing, Chef
- Isa Vilacides, Quality Engineering Manager, CloudBees
- Anders Wallgren, CTO, Electric Cloud
- Kevin Fealey, Senior Manager Application Security, EY Cybersecurity
- Hameetha Ahamed, Quality Assurance Manager, and Amar Kanagaraj, CMO, FileCloud
- Charles Kendrick, CTO, Isomorphic Software
- Adam Zimman, VP Product, LaunchDarkly
- Jon Dahl, CEO and Co-founder, and Matt Ward, Senior Engineer, Mux
- Tom Joyce, CEO, Pensa
- Roi Carmel, Chief Marketing & Corporate Strategy Officer, Perfecto Mobile
- Amit Bareket, CEO and Co-founder, Perimeter 81
- Jeff Keyes, Director of Product Marketing, and Bob Davis, Chief Marketing Officer, Plutora
- Christoph Preschern, Managing Director, Ranorex
- Derek Choy, CIO, Rainforest QA
- Lubos Parobek, Vice President of Product, Sauce Labs
- Walter O'Brien, CEO and Founder, Scorpion Computer Services
- Dr. Scott Clark, CEO and Co-founder, SigOpt
- Prashant Mohan, Product Manager, SmartBear
- Sarah Lahav, CEO, SysAid Technologies
- Antony Edwards, CTO, Eggplant
- Wayne Ariola, CMO, Tricentis
- Eric Sheridan, Chief Scientist, WhiteHat Security
- Roman Shaposhnik, Co-founder V.P. Product and Strategy, Zededa
Opinions expressed by DZone contributors are their own.