How Automated Testing Is Changing

We are seeing increased adoption of automated testing as Agile methodologies mature and DevOps culture takes hold. Read the interview highlights for more.


To gather insights for the current and future state of automated testing, we asked 31 executives from 27 companies, "What are the most significant changes to automated testing in the past year?" Here's what they told us:

Adoption

  • Evolution, plus a shift in the recognition that testing is strategic to the company’s mission: it is as important to the success of the software as the software itself. “Shift left” is a term everyone now understands. There are more opportunities to take advantage of automation, making the SDLC and the DevOps pipeline more effective. 
  • The adoption of automated web testing has been gradual, moving from a customer base largely in tech to more financial services, retail, and media. The bigger change is that automated mobile app testing has come of age, a function of the proliferation of platforms. Content viewed through mobile apps has grown considerably, so we need to provide a quality experience and refresh our apps to deliver the best UX. 
  • CI is the inflection point: the more I automate, the faster I can get feedback to developers. When bugs surface a week or a month later, the work is disrupted. Adoption is growing, and the tests that are written are actually being used by developers as more people come to rely on automated testing. 
  • We are trying to gradually adopt more automated testing. We have added to our unit and system tests, and flow tests now cover functionality with only slight modifications to the system tests. Greater code coverage ensures functionality isn’t missed, and we cover a larger number of APIs. 
  • Massive adoption of AI, applied in production at scale, creates a massive need to test algorithms and confirm they continue to perform over time. Machine learning is the high-interest credit card of technical debt. We see it in algorithmic and high-frequency trading, and even insurance, an industry that typically takes ten years to adopt new technology, is automating. 
  • We believe the most significant change in the past year is the understanding that automated testing is mandatory for quick and stable R&D processes. Automated testing can focus on regression testing (full coverage) so that QA’s manual testers can focus more on new features and progression testing. 
  • There is a recognized push to be more automated; you can’t scale if everything you’re doing is manual. More organizations are willing to ask development teams to automate tests tailored to a particular application, and more responsibility is being put on developers to ensure things are working. As companies move to DevOps, developers may be notified in the middle of the night that their code failed and be expected to fix it immediately. This improves code quality and justifies good testing. 
  • In the past year, I’ve noticed there are more automated testing platforms that can support containers. This helps companies looking to move toward a microservices-based architecture, because they can bring their automated test suites with them (a minimal container-based test sketch follows this list). Also, the number of people leveraging automated testing in the form of continuous integration has increased. They are testing in this modern application development style, as opposed to old-school automated testing like Silk Test or some other static test.
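
To make the container point concrete, here is a minimal sketch of what bringing a test suite along with its containerized dependency can look like, using the testcontainers and redis npm packages with Jest. The Redis image tag, key names, and test framework are illustrative assumptions, not something any particular respondent described.

```typescript
import { GenericContainer, StartedTestContainer } from "testcontainers";
import { createClient } from "redis";

// Start a throwaway Redis container for the duration of this test file,
// so the integration test runs the same way locally and in CI.
let container: StartedTestContainer;

beforeAll(async () => {
  container = await new GenericContainer("redis:7").withExposedPorts(6379).start();
}, 60_000);

afterAll(async () => {
  await container.stop();
});

test("stores and reads back a value", async () => {
  const client = createClient({
    url: `redis://${container.getHost()}:${container.getMappedPort(6379)}`,
  });
  await client.connect();
  await client.set("greeting", "hello");
  expect(await client.get("greeting")).toBe("hello");
  await client.quit();
});
```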

Agile/DevOps

  • There continues to be slow adoption, even by companies that “get it.” Competitors’ toolsets do not perform; they slow the pipeline and hinder adoption because they’re not friendly to DevOps. Tools must deliver speed, accuracy, and integration, or companies and teams will be slow to adopt. 
  • Perhaps the most significant change to automated security testing in the past year is the maturing of Agile methodologies and the adoption of a DevOps culture. The mean acceptable turnaround time for results from a security scan has dramatically decreased as a result. Imagine your organization deploys code to production ten or more times a day: waiting even 24 hours for the output of an in-depth security scan will only slow the developers’ progress. 
  • More and more, Agile-driven teams are no longer separating test automation from development. Especially in web application development, teams tend to create their own test automation frameworks based on open source tools. Although it’s easy for startup organizations to go in this direction, it requires a lot of discipline and more technically skilled people on the team to manage both development and test automation. Unfortunately, development team leads may prefer to hire people with strong coding skills over less technical people, and the risk is that these teams end up lacking testing expertise. There is a reason behind the old rule that developers should not test. 
  • There is momentum with respect to DevOps and a need to automate people, process, and technology. Rather than testing being a specific time box, make it everyone’s responsibility. Functional UI testing should be only about 5% of the effort; the remainder should occur across the SDLC. Create test cases while planning, and speed up developer efficiency by automatically creating the right test data at the right time (a small test-data factory sketch follows this list). Don’t compromise on security or allow any breaches; the emerging concept of security testing is about how to embrace DevSecOps.
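
As a concrete illustration of “the right test data at the right time,” here is a minimal, dependency-free test-data factory sketch in TypeScript. The Customer shape, field names, and buildCustomer helper are hypothetical examples, not part of any respondent’s product.

```typescript
// A minimal test-data factory: each test asks for "a valid customer" and
// overrides only the fields it actually cares about.
interface Customer {
  id: string;
  name: string;
  email: string;
  createdAt: Date;
}

let counter = 0;

export function buildCustomer(overrides: Partial<Customer> = {}): Customer {
  counter += 1;
  return {
    id: `cust-${counter}`,
    name: `Test Customer ${counter}`,
    email: `customer${counter}@example.test`,
    createdAt: new Date(),
    ...overrides,
  };
}

// Usage in a test: only the field under test is spelled out; everything
// else is generated, so the test stays focused and readable.
const suspendedCustomer = buildCustomer({ email: "suspended@example.test" });
```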

Complexity

  • The inflection point was two or three years ago: the move away from manual QA clicking through GUIs to software development engineers in test (a role popularized by Microsoft). The number of distributed components and microservices has exploded, so you must automate to handle the volume. Test automatically and test together. Engineering needs to be accountable for the quality of the product; manual handoffs create a bottleneck. 
  • Organizations are realizing that as complexity increases, automation is the only way to keep up with all the changes. 
  • Adoption has been gradual over the last few years. Adding people or bringing in consultants did not succeed, and over the last six years we sent testing to China and India; it did not scale. The complexity is beyond human capacity, so we are looking for automated, model-based approaches and are still in the mid-stages of getting them working for everything. AI/ML carries a lot of hype, but we are evolving toward actually having the technology to achieve the objective.

Toolsets

  • The continuous evolution of tools and frameworks makes automated testing easier to deal with; less boilerplate is necessary today. There are a number of SDKs for monitoring without a lot of work, the exception being tracing tools, where you can’t automate away much of what’s required. We provide specific testing and monitoring for video streaming, enabling clients to see how data changes with changes in the platform, and we enable developers to use flags in production software to compare before and after: A/B tests for developers. 
  • Two things: 1) The Selenium 3 release. They really haven’t added much, but the whole industry now has to deal with various backward-compatibility issues, so this will be a major theme this year. 2) Cloud/grid-based Selenium testing is becoming increasingly mature and popular; it now makes sense for many organizations with lighter automated-testing requirements (a minimal remote-grid sketch appears after this list). We are hoping to see significant improvements in the extensibility of such offerings over the coming years. 
  • Enablement tools such as Docker integrate automatically, communicate with each other, and let you build tests quickly, ensuring integration tests go smoothly. This used to be manual and difficult to scale; now the tools have the maturity to run tests easily while supporting every configuration. 
  • 1) Practices and theories are being adopted: teams are really making an effort to shift left, testers are getting involved early, and devs are writing tests beyond the unit and service level to include the UI. Tools that can work in their environment are being adopted faster. 2) CI/CD: people are getting it right with open source and building their own CI tools to fit their build processes. 3) New concepts and technologies are automating the automation, so teams can do more, faster. 
  • We’re seeing new trends. 1) Codeless automation with less code and scripting, including the revival of Selenium IDE’s record-and-playback ability and more powerful tools. 2) People are trying to find test automation tools that are easier for developers to use and that support shifting left, working within the developer’s framework, with JavaScript that fits that framework better. 3) Overcoming the slowness and bugs of older open source tooling with Cypress and Puppeteer: faster, more stable, and oriented around JavaScript (a short Cypress sketch also appears after this list).
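
To make the cloud/grid-based Selenium point concrete, here is a minimal sketch of a test that targets a remote grid through selenium-webdriver in TypeScript. The grid URL, target site, and assertions are placeholder assumptions; a cloud vendor’s hub URL and credentials would normally go where the local grid address is shown.

```typescript
import { Builder, By, until } from "selenium-webdriver";

async function checkHomepageTitle(): Promise<void> {
  // Point the Builder at a remote Selenium Grid (or a cloud vendor's hub)
  // instead of a locally installed browser driver.
  const driver = await new Builder()
    .usingServer("http://localhost:4444/wd/hub") // placeholder grid address
    .forBrowser("chrome")
    .build();

  try {
    await driver.get("https://example.com/");
    await driver.wait(until.titleContains("Example"), 5000);
    const heading = await driver.findElement(By.css("h1")).getText();
    console.log("Heading:", heading);
  } finally {
    await driver.quit();
  }
}

checkHomepageTitle().catch((err) => {
  console.error(err);
  process.exit(1);
});
```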
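And here is a short Cypress sketch showing the developer-friendly JavaScript style the last trend points toward. The route, data-testid selectors, and page text are invented for illustration.

```typescript
// cypress/e2e/login.cy.ts
describe("login flow", () => {
  it("shows the dashboard after a successful login", () => {
    cy.visit("/login");
    cy.get("[data-testid=username]").type("demo-user");
    cy.get("[data-testid=password]").type("demo-password");
    cy.get("[data-testid=submit]").click();

    // Cypress retries these assertions until they pass or time out,
    // which is a big part of why such tests tend to be less flaky.
    cy.url().should("include", "/dashboard");
    cy.contains("h1", "Dashboard").should("be.visible");
  });
});
```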

Other

  • The number-one change is that the developer is now the accountable persona for quality. That is driving a lot of other decisions and the key success factors for test automation. Devs are automating the entire process with CI/CD, which removes manual assumptions. There are four key factors of success: 1) the level of automation, which must be 90%+ for CI/CD to succeed; 2) maximizing coverage across all flows and datasets; the quality of the app and the UX have now merged, driven by digital transformation, so apps must work seamlessly on any platform or device and digital test coverage becomes an endless list of permutations; 3) an efficient feedback loop over mountains of data, analyzed and fed back to the right person; 4) an integrated approach and platform for testing across all devices, deciding where to execute and analyze each test. Push left so tests are executed earlier in the lifecycle.
  • No huge technical changes. Containers have helped by packaging the application together with the environment it runs in, so it’s easier to provide a properly configured environment for testing. Developers are still looking for what to test based on what they’ve changed; you can cut down on what you test by moving earlier in the SDLC. Static analysis of the code helps, but it’s difficult to know the impact at runtime. Loosely coupled architectures lend themselves to testing more easily.
  • The introduction of Docker and container technology into the testing psyche. We used to have big QA teams; now devs can express the required testing environment themselves. How has the division between devs and QA worked for others? Some companies don’t have QA at all. 
  • Monitoring has become the new testing in the microservices world. Testing in production-like environments, where you have to spin up several services, is expensive, and the tests themselves tend to be unreliable and slow. Establishing thresholds that indicate when your services have started to misbehave, and monitoring for those thresholds, lets you be the first to know something went wrong and recover faster. I see a clear evolution toward reducing the amount of end-to-end testing performed in these kinds of distributed systems in favor of monitoring. There is a big challenge with mocked API integration tests, and this is where consumer-driven contract testing plays a key role. Consumer-Driven Contracts is a pattern that drives the development of the provider API from its consumer’s point of view, guaranteeing the provider won’t make changes that are incompatible with its consumers at any time. It gives you the power of end-to-end tests with the speed and ease of writing unit tests (a minimal consumer-side sketch follows this list).
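
As an illustration of the consumer-driven contract idea, here is a minimal consumer-side sketch using the @pact-foundation/pact package with Jest and axios. The consumer and provider names, the /orders endpoint, and the response fields are hypothetical; the point is only that the consumer records its expectations as a pact file, which the real provider later verifies in its own build.

```typescript
import path from "path";
import axios from "axios";
import { Pact, Matchers } from "@pact-foundation/pact";

const { like } = Matchers;

// The mock provider records every expectation below into a pact file that
// the real provider service verifies in its own pipeline.
const provider = new Pact({
  consumer: "OrderWebApp",   // hypothetical consumer name
  provider: "OrderService",  // hypothetical provider name
  port: 8991,
  dir: path.resolve(process.cwd(), "pacts"),
});

describe("GET /orders/:id", () => {
  beforeAll(() => provider.setup());
  afterEach(() => provider.verify());
  afterAll(() => provider.finalize());

  it("returns the order the consumer asked for", async () => {
    await provider.addInteraction({
      state: "order 42 exists",
      uponReceiving: "a request for order 42",
      withRequest: { method: "GET", path: "/orders/42" },
      willRespondWith: {
        status: 200,
        headers: { "Content-Type": "application/json" },
        body: { id: like(42), status: like("shipped") },
      },
    });

    const response = await axios.get("http://127.0.0.1:8991/orders/42");
    expect(response.data.status).toBe("shipped");
  });
});
```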

Respondents


Topics:
automated testing, devops, agile, software development

Opinions expressed by DZone contributors are their own.
