
Unit Testing the Hard Stuff


Approaching unit testing adoption from a leadership perspective, there are a few issues you should be aware of that your team will likely face.


Given the nature of my work, I discuss unit testing with a lot of different organizations. Sometimes they're in the market for powerful tooling to improve an already-thriving unit testing practice. But often, they're just getting started with unit testing. Or, at least, they're trying to.

Adopting a commitment to unit testing never happens completely smoothly. Teams go off to, say, a TDD workshop where they learn the rhythm of the practice with the bowling kata. Enthusiasm runs high. It runs high, anyway, until they come back and sit in front of their actual codebase instead of an exercise application. Then it gets hard for them.

As a leader who isn't looking at the code yourself, you're going to struggle to understand their struggle. Is it simply that they're not used to unit testing and feel it slows down their normal development work? Or is it something else, something about your codebase, that makes it so hard? And if you do have untestable code, how can your team work around it?

Many Initial Objections Are Simply a Matter of Learning Curve

First off, let's be clear. When you're new to unit testing, everything about it is going to seem hard. Struggling creates frustration and frustration creates some relatively predictable complaints. For instance:

  • "This whole unit testing thing has its benefits, but it just slows us down way too much."
  • "I really don't like this unit test framework at all. The API is unintuitive."
  • "Ugh, this makes no sense. That test should never fail!"

You get the idea. These are exclamations related to frustration in the moment. The practice of unit testing, the test framework, the test runner, and anything else in the immediate vicinity become targets.

This is a real struggle, and the frustration is completely understandable. It will naturally abate as the team fights through the initial difficulties and becomes proficient. Encourage them and reinforce their efforts, but don't confuse this with anything specifically "test-resistant" about the codebase. The team will have these difficulties on anything less trivial than the bowling game kata.

But There Are Legitimate Barriers That Make It Really Hard for Your Team

That said, not all codebases are created equal from a testability perspective. Your team will struggle out of the gate on any codebase, but some codebases make unit testing insanely difficult. It's important to look for the signs of an untestable codebase, because such a codebase can mean the difference between growing pains that give way to satisfying competence and an effort that stays futile.

If you throw your team's unit testing initiative at the wrong codebase, the entire effort might be doomed to fail.

So here are some scenarios where unit testing is hard for anyone, even experienced veterans. And here's how your team can recognize and mitigate them.

Untestable Frameworks

First up, let's consider untestable frameworks that you might use. I'll pick on ASP.NET's "Web Forms," which encourages you to write code that is notoriously hard to test. This isn't to say that you can't make unit tests coexist with such a framework. It's just that a team will struggle disproportionately with such a framework when compared with the easier time they'll have with a similar one designed to be more test-friendly.

This is perhaps the easiest situation to recognize, both for the developers and for leadership. The developers are smart people and will probably come to you with blog posts and articles about how hard it is to unit test around the framework. And you can simply look this up for yourself, reading about struggles with a framework (or else not finding any).

Mitigating this is also conceptually straightforward. Assuming you can't simply sunset the application and start over, the focus will be on minimizing your dependence on the framework over the course of time. In the case of Web Forms, this would mean moving just about all code out of its "code-behind" and into testable classes. In another framework, it might be something different. But the common element is a concerted effort to reduce coupling to that framework so more of the code becomes testable.
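To make the refactoring concrete, here's a minimal sketch. The article's context is C# and Web Forms, but the same idea is shown here in Java; all class and method names (OrderPricing, CheckoutPage, and the discount rule itself) are hypothetical illustrations, not any framework's actual API.

```java
// Plain, framework-free class: easy to construct and unit test directly.
class OrderPricing {
    // Applies a percentage discount; the rule itself is illustrative only.
    static double discountedTotal(double subtotal, double discountPercent) {
        if (discountPercent < 0 || discountPercent > 100) {
            throw new IllegalArgumentException("discount must be 0-100");
        }
        return subtotal * (1.0 - discountPercent / 100.0);
    }
}

// The "code-behind" analog shrinks to a thin adapter that only wires
// framework events (a button click, say) to the testable class.
class CheckoutPage {
    // In Web Forms this would be an event handler like Button_Click.
    String onCalculateClicked(double subtotal, double discountPercent) {
        return String.format(java.util.Locale.US, "Total: %.2f",
                OrderPricing.discountedTotal(subtotal, discountPercent));
    }
}
```

The unit tests then target OrderPricing directly; nothing in them needs to spin up the framework's page lifecycle.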

Excessive Coupling and the Law of Demeter

Speaking of coupling, let's talk about that and the testing difficulties it creates. In the world of software design, there's a guideline known as the Law of Demeter; violating it produces a specific kind of coupling that creates testing nightmares. If you're not familiar with it, consider a memorable analogy.

When you go to the store to purchase groceries, the clerk tells you the total cost. You then pull out your wallet, remove your credit card from it, and insert your credit card into the machine. Do you know what you don't do? You don't say to the clerk, "Reach into my pants, pull out my wallet, get out the second credit card, and insert that into the machine." Why not? Because it's inappropriate coupling. It's not the clerk's job to know where you keep your wallet and where in your wallet you keep your credit card.

Frightening as this would be in the physical world, it's actually pretty common in codebases. To see whether it's happening, there's a relatively simple check the team can use: look through the code for statements with multiple periods chained in them. This isn't a perfect measure, but it's a good quick barometer for whether you might have this problem.
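The wallet analogy translates directly into code. Here's a sketch in Java with entirely hypothetical names, showing both the "reach into my pants" version and the Demeter-friendly alternative:

```java
class CreditCard {
    // Stub: a real implementation would talk to a payment gateway.
    boolean charge(double amount) { return amount > 0; }
}

class Wallet {
    private final CreditCard card = new CreditCard();
    CreditCard getCard() { return card; }
}

class Customer {
    private final Wallet wallet = new Wallet();
    Wallet getWallet() { return wallet; }

    // Demeter-friendly: the customer handles payment internally, so
    // callers never need to know about the wallet or the card.
    boolean pay(double amount) { return wallet.getCard().charge(amount); }
}

class Clerk {
    // Violation: the "multiple periods" pattern. The clerk must know the
    // customer has a wallet and the wallet has a card, so a unit test for
    // Clerk has to construct and stub that whole chain of objects.
    boolean chargeBadly(Customer c, double total) {
        return c.getWallet().getCard().charge(total);
    }

    // Fix: ask the customer to pay; a test only needs to stub Customer.
    boolean chargeWell(Customer c, double total) {
        return c.pay(total);
    }
}
```

Both methods behave the same at runtime; the difference is in how much of the object graph a test must assemble to exercise Clerk.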

If you do, the team should focus on factoring away from implementations like this and then start implementing unit tests as they do so. Trying to unit test code like this will result in excessive setup, brittle tests, and frustrated developers.

Static State and Singletons

Here's another design-related problem in the code: the prominent use of static state and the singleton design pattern. These constructs are handy, but they're a nightmare from a testability perspective. Put most simply, they take the kind of coupling created by Law of Demeter violations and provide an easy mechanism for sprinkling it through your entire codebase.

Recognizing this as a testability struggle is a bit more subtle. For one thing, teams that have come to depend on static state and global variables are unlikely to perceive this style of programming as a problem, so they won't report it as such. Instead, you'll hear about how unit testing is hard.

But you'll hear more specific things too. You'll hear about tests that fail only intermittently or tests that take a long time to run. And you'll hear about unit tests that produce weird side effects, like writing files or messing up the staging database. In short, you'll hear about things that don't seem to make sense.
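A tiny, hypothetical Java sketch shows why these failures seem to make no sense: a test that touches static state passes or fails depending purely on what ran before it.

```java
// Global, shared state: every test in the suite sees the same counter.
class RequestCounter {
    static int count = 0;
    static int next() { return ++count; }
}

class CounterChecks {
    // Passes when run first, fails if anything else bumped the counter
    // earlier. The outcome depends on hidden global state and test
    // ordering, not on the code under test.
    static boolean firstCallReturnsOne() {
        return RequestCounter.next() == 1;
    }
}
```

Run in isolation the check succeeds; run after any other code that calls RequestCounter.next(), it fails. That order-dependence is exactly the "test that should never fail" experience developers will report.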

Luckily, the mitigation here is pretty straightforward. You can use a tool like Isolator to mock static constructs and isolate the things you actually want to test, saving yourself many headaches. Of course, you should also try to factor toward a more modular design as you do this. But at least you will have tests to check that you didn't add bugs while refactoring.
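Isolator is a .NET tool, so the sketch below shows the same idea by hand in Java: put a seam (an interface) in front of the static dependency and inject a fake in tests. Everything here (Clock, SessionValidator, the TTL rule) is an illustrative assumption, not Isolator's actual API.

```java
// Seam: an interface standing in for what would otherwise be a static call.
interface Clock {
    long now();
}

// Production implementation delegates to the real static clock.
class SystemClock implements Clock {
    public long now() { return System.currentTimeMillis(); }
}

// Test implementation returns a fixed, controllable time.
class FixedClock implements Clock {
    private final long t;
    FixedClock(long t) { this.t = t; }
    public long now() { return t; }
}

class SessionValidator {
    private final Clock clock;
    SessionValidator(Clock clock) { this.clock = clock; }  // injected seam

    // Goes through the seam instead of calling the static clock directly,
    // so tests are deterministic regardless of when they run.
    boolean isExpired(long issuedAt, long ttlMillis) {
        return clock.now() - issuedAt > ttlMillis;
    }
}
```

A mocking tool automates creating that seam; hand-rolling it, as above, is the fallback and also nudges the design toward the modularity mentioned earlier.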

Talking to Things Outside of Your Application

The last source of testing pain that I'll mention is the one that arises from talking to outside concerns. I'm talking here about things like databases, web services, files, etc. Really, it's anything that's external to your own codebase.

This makes unit testing really hard because it stops being purely a matter of your code behaving the way it should. Tests that trigger the writing of files or calls to the database can fail depending on environmental concerns beyond your codebase. So look for the same sorts of intermittent test failures and long-running tests that static state will trigger. The difference here is that the reason won't be mysterious. It'll be quite clear when people say, "This is really hard to unit test because the database isn't always available."

Mitigating the struggle to unit test code with external concerns is also similar to mitigating static state. The key difference here, though, is that this is absolutely what an isolation tool is designed for. So use it to isolate your external concerns, but without the emphasis on changing your design. You don't need or want a lot of global state, but you do need and want to read files, call web services, and write to your database.
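As a sketch of that isolation, here's a hypothetical Java example in which database access sits behind an interface, so the unit test runs against an in-memory fake and never needs the real database. All names here are assumptions for illustration.

```java
// The external concern, abstracted: in production this hits the database.
interface UserStore {
    String findEmail(int userId);
}

// In-memory fake for unit tests: fast, deterministic, always "available."
class InMemoryUserStore implements UserStore {
    private final java.util.Map<Integer, String> rows = new java.util.HashMap<>();
    void put(int id, String email) { rows.put(id, email); }
    public String findEmail(int userId) { return rows.get(userId); }
}

class WelcomeMailer {
    private final UserStore store;
    WelcomeMailer(UserStore store) { this.store = store; }

    // The logic under test; the database stays behind the seam.
    String buildGreeting(int userId) {
        String email = store.findEmail(userId);
        return email == null ? "unknown user" : "Welcome, " + email;
    }
}
```

Note that, per the point above, the design doesn't need to change much here: the production code still reads from a real database; only the tests swap in the fake.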

Always Ask Why It's So Hard

As I said at the outset, your team is going to struggle as they start to unit test. They'll initially struggle with the basics and the hard stuff alike. But they'll get past the basics on their own, whereas the hard stuff can defeat them.

So you need to quickly figure out whether they're struggling with the issues mentioned here. And doing that involves having conversations. Figure out what the team is struggling with and why by asking them questions and keeping an eye out for the signs mentioned here. It can mean the difference between a successful adoption and a frustrating, abandoned effort.



Published at DZone with permission of

