
CI Test Automation Strategy

This conversation between Dr. Nicole Forsgren of GitHub and Angie Jones of Applitools reveals some of the considerations for a CI test strategy.


What happens when you put two top thinkers together to talk about DevOps, continuous integration, and CI test automation strategy?

That’s what happened in November 2019, when Dr. Nicole Forsgren, VP of Research & Strategy at GitHub, and co-author of Accelerate: The Science of Lean Software and DevOps, joined a webinar with Angie Jones, test automation consultant and automation architect at Applitools.

If you haven’t heard two thoughtful people discussing testing as an integral part of product delivery, you need to listen to this webinar.

Dr. Forsgren publishes an annual “State of DevOps” report, in which she uncovers trends that drive DevOps productivity. So, when she sat down with Angie, Dr. Forsgren came prepared to discuss trends and results.

When discussing her State of DevOps reports, Dr. Forsgren said that a common refrain she hears is, “‘This isn’t for me, right?’ ‘It’s only for executives,’ or, ‘It’s only for developers,’ or, ‘It’s only for (fill in the blank).’” And, yet, her conclusions are based on reaching out to a range of stakeholders, which is what makes her report, and her discussion with Angie, so interesting.

Knowledge of DevOps evolves constantly, and Dr. Forsgren points out that elite teams achieve levels of productivity that exceed those of peers still developing their DevOps expertise. So, let’s dive in.

CI Automated Testing As A DevOps Foundation

Angie led into a discussion of the State of DevOps Report. She said,

“So in this year’s State of DevOps report, you listed automated testing as a basic foundation for teams. Yet so many teams, they don’t get around to this until they feel they have matured. So there is a belief that they just need to focus on building the thing right now. And after that thing is built, then they can add the nice-to-haves like tests. So what does it mean to have automated tests and how can teams get started with this earlier?”

Software delivery and performance

Dr. Forsgren spoke about how most immature teams fail to prioritize testing, and how her research showed that elite teams focus on building fast-running unit tests into their build processes. Her research shows that effective unit test automation is as foundational to DevOps as version control is to release management.

One mistake she sees teams make is pushing tests off to the end. Elite teams add CI test automation early – and not full development database tests, but simple unit tests that can run in less than 10 minutes and validate code that has been written. As new features get added, the unit tests make it easy to expose regressions.
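As a sketch of the kind of fast, regression-catching unit test she describes — the function and all names below are invented for illustration, not taken from the webinar:

```python
import unittest

# Hypothetical pure function under test: no database or network involved,
# so the whole check runs in milliseconds inside the CI build.
def apply_discount(price, percent):
    """Return price reduced by the given percentage, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)

    def test_rejects_bad_percent(self):
        # A later change that loosens validation fails here immediately.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Run with `python -m unittest` on every build: if a new feature breaks the rounding or the validation, the failure surfaces within seconds rather than in an overnight run.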

Dr. Forsgren said,

“Sometimes I’ll talk to developers and they’ll say, oh, I want to skip tests because I only want to run them at the end of the day or once a week. And like my face does this contortion.”

“‘What are you talking about?’ I ask.”

“They say, ‘Oh, well, it takes like nine hours to run it back overnight.’ And I’m thinking, ‘Something is wrong. We want to run these small incremental tests that give us this good fast [pass/fail] signal.’”

Test-Driven Development

Discussing the developer mindset about tests led to a quick discussion about test-driven development (TDD).

TDD

Dr. Forsgren noted that a common developer refrain is, “I’m a developer, not a tester.” But, she noted, testability helps drive development flow. Code designed with testing in mind drives efficiency in downstream processes, like system and functional tests. She observed that testable code decomposes more easily and integrates more efficiently into team processes.
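A minimal sketch of what that kind of testable decomposition can look like — the scenario and names here are invented: a time-dependent rule takes its clock as a parameter, so a unit test can pin the time instead of waiting for it.

```python
from datetime import datetime, timezone

def is_happy_hour(now_fn=lambda: datetime.now(timezone.utc)):
    """True between 17:00 and 19:00 UTC. The clock is injected so the
    rule can be tested without depending on the real wall clock."""
    return 17 <= now_fn().hour < 19

# In a test, substitute a fixed clock instead of the real one:
at_six_pm = lambda: datetime(2019, 11, 1, 18, 0, tzinfo=timezone.utc)
assert is_happy_hour(now_fn=at_six_pm)
```

The same idea scales up: dependencies that are passed in rather than hard-wired are the seams that make code decompose into small, independently testable pieces.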

Angie agreed that developers rarely understand tests. She said that developers would tell her,

“Yeah, I know I should be doing it, but I don’t know how.”

Part of the problem, Angie noted, is that universities that teach software skills often focus on development and rarely on testing. Angie pointed out that she never learned anything about testing while studying for her degree.

“So, when it comes to TDD, most developers can find it painful,” Angie said. “Lots of developers have an idea about what to build, and thinking about testability just slows them down. Some even find it a waste of time. It takes them a while to become comfortable with the approach and understand what TDD brings to the product delivery process.”

“And,” Angie noted, “I’m not pro-TDD or anti-TDD. I think that TDD serves a valid purpose and should be used when appropriate.”

Dr. Forsgren noted, “The real value comes when you’re trying to deliver completed code during a sprint…” and testability is one of your acceptance criteria.

Value of Testers

When she spoke about TDD, Dr. Forsgren says her advocacy for getting everyone involved in testing leads people to ask her,

“So, do you think we can do away with testers?”

Actually, quite the contrary, she says. By having a test-focused mindset in the development process, the test team can focus on issues further down the road. Dr. Forsgren refers to test engineers as “test experts.”

Angie, who is one of those experts, echoed the tester’s perspective back.

“I just want people to test their stuff – test it early,” Angie said. “Because there’s nothing more annoying and a suck of time than testing stuff, and the most basic of scenarios don’t work. Do you know what I mean?”

They shared perspectives on the value of testers in the product delivery process.

Team vs. Centralized Tests

Angie talked about the problems in centralizing test approaches.

“I’ve worked in companies where they might try to spread a testing strategy across like an organization,” Angie said. “I’ll be honest – it just felt heavier.”

She talked about how trying to standardize testing approaches would devolve to discussions about which tool to use – and stop focusing on productivity.

In discussing DevOps teams, Angie asked the next relevant question:

“Last year’s report found that test automation had a significant impact on continuous delivery. But you didn’t consider the relationship between test automation and continuous integration. This year you did. And you found that automated testing positively impacted continuous integration.

“Now, I tend to hear teams say that integration tests slow down their C.I. The tests take too long to run. They’re brittle. And so on. So as opposed to running them on every build, the teams are doing things like running them maybe once a day, right? So if they fail, their thought process here is, ‘Well, we have a list of check-ins for that day, better than, like that week or that month.’ And they triage it this way.

“So, Nicole, do you see any drawbacks to this approach and how might people change what they’re currently doing to enable faster feedback?”

Dr. Forsgren gave an ‘it depends’ reply because she understood that each team has its own understanding of the kinds of tests they can and should run regularly.

“It can be tricky,” Dr. Forsgren said. “Because it kind of depends on what your test system looks like, what your build system looks like. If you have a whole bunch of tests and then your build system is tightly coupled – where if you’re just waiting for, like, a nine-hour build – that can be really, really difficult. But if you have a way to run smaller tests so you get fast feedback, that can be nice.”

“Those tests need to be designed for a C.I system,” she continued. “And, it can also help if you have things designed so that you don’t have branch merge conflicts and things like that happening.”

Angie’s View on Test in C.I.

Angie and Dr. Forsgren went back and forth on this topic – noting that the quality level of tests run in a C.I. system differs from the quality of tests that might be run elsewhere. In C.I., a failing test becomes a process gate – meaning that tests must be free of the spurious failures that might be tolerated outside of C.I.

For example, a test that depends on pre-set conditions must have those conditions set correctly before it runs. If that test fails because someone forgot to pre-set the conditions, that is a mere annoyance in a manual test run – but in C.I. it blocks the build.

Angie writes all about this in the DZone Guide to DevOps, and you can read more about her contribution in her blog.

Her core ideas are that there are four key requirements in automated CI tests:

  • Speed – CI tests must run quickly to give the team fast feedback when failures occur. Most should be unit tests. Some should be at the service level – to uncover failure modes that unit tests may not. Finally, only a few should be UI tests – they take the longest to write and the longest to run – and those should be ones that uncover unique failure modes.
  • Reliability – Angie already mentioned reliability; here she reminds people that the key to running automation in a CI environment is that a test failure must point directly to an application error – not to a problem with the test itself.
  • Quantity – Since every test gets run in CI automation, be aware that each test takes time to run. Don’t add tests for no reason. Make sure each test captures a relevant failure mode.
  • Maintenance – Your tests will need to be maintained as, or immediately after, each incremental coding task is completed.
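One way to picture the speed and quantity constraints together is as a time budget for the gating build. The sketch below — all names and numbers are invented for illustration — fills the CI gate with unit and service tests first, then admits UI tests only while the budget allows:

```python
from dataclasses import dataclass

@dataclass
class AutomatedTest:
    name: str
    level: str      # "unit", "service", or "ui"
    seconds: float  # typical runtime

SUITE = [
    AutomatedTest("test_parse_order", "unit", 0.01),
    AutomatedTest("test_price_rounding", "unit", 0.02),
    AutomatedTest("test_checkout_api", "service", 2.5),
    AutomatedTest("test_login_flow_ui", "ui", 45.0),
    AutomatedTest("test_report_export_ui", "ui", 60.0),
]

def ci_gate(suite, budget_seconds):
    """All unit and service tests run in the gate; UI tests are admitted
    only while they still fit inside the remaining time budget."""
    chosen = [t for t in suite if t.level in ("unit", "service")]
    spent = sum(t.seconds for t in chosen)
    for t in suite:
        if t.level == "ui" and spent + t.seconds <= budget_seconds:
            chosen.append(t)
            spent += t.seconds
    return chosen

# With a 60-second budget, the slowest UI test is left out of the gate.
selected = ci_gate(SUITE, budget_seconds=60)
```

The excluded test is not deleted – it can still run in a slower, non-gating build, which is the maintenance tactic discussed next.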

Approaching Test Maintenance

When it comes to maintenance, Angie made it clear that her perspective on test maintenance has changed in her career.

“I’ve been doing automation for a very long time,” Angie said. “But this is something I’ve just learned in the last, maybe, five years. That, it’s OK to delete these things. Like, maybe this test was very important at the time that it was written. Right? But now the data is showing us this is no longer that important. Maybe it’s covered by something else. Maybe we thought this was gonna be the feature that, you know, saved the company, but nobody’s using it. So, maybe we don’t need this.”

Maintenance

Angie talked about the key being knowing what to automate.

“People think they can simply automate their manual tests,” Angie said. “But in doing that, you’re slowing down your C.I. build. Even if you run these in parallel, still, you’re slowing them down. And, it’s a lot of noise.”

“Let’s say you just automate your manual tests,” Angie continued. “Now, you make a change, and suddenly 50 tests fail. Why are 50 tests failing? Well, a developer has to go through all 50 failures to see what broke. And guess what? All of them are failing for the same reason. Why do I need 50 things to tell me that I broke this one thing?” You have to be thoughtful when you add automation, Angie added.

Continuing, Angie said, “Here’s a way to do maintenance when you’re not sure about the importance of a test. You have a failure. That test is gating your integration and your deployment. You can delete the test because it seems like it’s not that valuable. Or, if you still want to know this information, move it out to another build. You can have more than one build. Have your important tests run as part of your main build. And then you can have, like, alternate tests run once a day or something like that. So you’ll still have that information. But it’s not part of your C.I. build.”

“That’s my favorite pro tip for today,” commented Dr. Forsgren. “I learn something new every time I talk to you or work with you.”

Value of Designing for Test

Angie returned to discussing Dr. Forsgren’s findings from the State of DevOps 2019 report.

“So,” Angie said to Dr. Forsgren, “your study showed designs that permit testing and deploying services independently help the teams achieve higher performance. It’s really interesting that you found that design considerations should include testing, because so many people don’t usually think of testing until after the features are developed, which makes automated testing much harder.”

Angie then spoke briefly about Rob Meaney’s 10 Ps of Testability, which concludes that for a product to be testable, it must be designed to facilitate testing and automation at every level of the product. And that process should help the team decompose work into small, testable chunks.

Dr. Forsgren replied, “A good architecture contributes to performance. The way we define performance is speed and stability. This loosely coupled architecture, along with communication and coordination among teams, lets those teams test.”

“And that takes fine-grained communication and coordination,” Dr. Forsgren continued, “so that can be provisioning storage and compute. It can be provisioning test resources and conducting tests. But if you somehow skip a piece of that process – let’s say you provision compute and storage, but you skip the test section – you’re going to have a constraint. You’re going to have something that surprisingly blocks you.”

“People often view testing as the bottleneck to actually get this feature out the door,” Angie said.

“And when you do it wrong, of course,” Dr. Forsgren replied.

“Exactly,” Angie said. “If you don’t build that testability in, then testing is going to be more difficult. Yes, of course it has become this bottleneck, but that’s not due to the fault of the testers, right? That’s because the application was not built to be tested.”

Making Time for Automation

Angie focused on the difference between the research results and the reality in many places.

“Your research concludes that automation is truly a sound investment,” Angie said. “It allows engineers to spend less time on manual work and instead focus on new work.”

“Yet,” Angie continued, “what I’m hearing from practitioners is this: ‘Time is only allocated for feature development. We don’t have the time or energy. We don’t have the budget, Angie, to automate because it’s not considered a feature.’”

Angie asked Dr. Forsgren how to get buy-in to allocate time for automation.

Dr. Forsgren acknowledged the problem.

“I hear this constantly,” Dr. Forsgren said. “Development is so much easier to justify. We can always justify features. We can always justify shipping something.”

The HP LaserJet Test Case

Dr. Forsgren said, “There’s this fantastic, fantastic case that I use all the time: the HP LaserJet firmware case. Gary Gruver and his team needed to find a way to help improve the quality of the HP LaserJet firmware. For people who say, ‘Automated testing is too hard,’ you’re probably working on software. Gary’s team had a huge problem with their firmware.”

“Gary’s team had so many constraints,” Dr. Forsgren continued. “Each printer could have multiple code versions. His teams spent just 5 percent of their development time writing new features. They were spending 15 to 20 percent of their time just integrating their code back into trunk. Because of the number of versions to check, for each build, it would take them a week to figure out if the build actually integrated successfully.”

“So they just started chipping away at the problem. Their goals were to free up the critical path and give themselves more time to innovate.”

“Their solution was to innovate in test automation. Today they spend something like 40 percent of their time just for automated testing because they had seen so much benefit in automated tests. Why? Because they had a huge quality problem. Before they made the automation investment, they were spending 25 percent of the time on product support. When you think about it, product support time comes from customer problems making it all the way to the field.”

As Dr. Forsgren pointed out, in the end, the HP LaserJet team used automation to improve quality throughout the process. Their builds ran much more quickly. They wiped out pretty much all the product support work that arose due to quality issues. And, they were able to spend time on innovation.

Tying Back to C.I.

Angie took the HP firmware story back to the C.I. case.

“I’m glad that you shared that story,” Angie said. “A lot of teams fail at this, and so you hear mumbles throughout the hallways of companies, or at conferences, where people who have been burned by attempting to do automation, they’re saying things like, ‘…this automation thing. I don’t see the return on investment.’”

“That’s why I really value the work that you’ve done, the stories that you share, what you’re seeing out there,” Angie continued. “The research is showing that there are teams who are able to do this properly. And when they are, they are definitely seeing a return on their investment. So I love that.”

“I love hearing the stories about what you see,” Dr. Forsgren said. “I love to hear where your experience mirrors the research, as well as where you see things a little differently. I especially enjoyed the tips that you shared on how to develop automated tests specifically for C.I., because so many more teams and organizations are doing continuous integration.”

“It’s clear to me that, if you don’t have your automated tests developed the right way, it’s hard to pull those into your C.I. flow,” Dr. Forsgren continued. “I’m going to reread your article because I’m super excited. I had somebody ask me for tips, so I can point them to the article.”

“Definitely send them my way,” Angie said. “Thank you so much.”

Topics:
ci, developer, devops, software development, test automation, testing

Published at DZone with permission of Michael Battat . See the original article here.

Opinions expressed by DZone contributors are their own.
