
Coupling and Cohesion: Confounding Yet Critical Concepts


Ideally, you want all your software to have low coupling and high cohesion. But only the Sith deal in absolutes.


A deep but often-overlooked problem that perennially plagues software is that of high coupling and/or low cohesion. The underlying concepts are foundational to good software engineering, as they impact how hard it will be to comprehend, extend, and maintain a particular piece of software. Yet they can be quite confusing, and attempts to deal with them can be very misguided. Because a shallow understanding can do more harm than no understanding at all, the subtleties of these related concepts need to be explored.

To review very briefly, coupling is the degree to which distinct classes, modules, etc., are tied together, such that a change to one requires changes to others. Cohesion is the degree to which a class or component deals with just one thing. But rather than thinking of cohesion as tight internal coupling, think of it as the conceptual purity of a unit of software. This avoids some misunderstandings that can arise otherwise. I won't dwell further on the definitions, as others have done a good job of that.
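To make the distinction concrete, here is a toy sketch (the class names are hypothetical, not drawn from any real codebase): the first class mixes data access, presentation, and delivery, so changes to any of those three concerns all land in one place; the cohesive versions each deal with just one thing.

```python
# Low cohesion: one class that mixes three unrelated concerns.
class ReportManager:
    def fetch_sales_data(self):        # concern 1: data access
        return [100, 250, 75]

    def format_as_html(self, rows):    # concern 2: presentation
        return "<ul>" + "".join(f"<li>{r}</li>" for r in rows) + "</ul>"

    def email_report(self, html):      # concern 3: delivery
        print(f"Sending: {html}")


# High cohesion: each class deals with just one thing, so a change to
# the HTML layout, say, cannot accidentally disturb the data access.
class SalesRepository:
    def fetch(self):
        return [100, 250, 75]


class HtmlFormatter:
    def format(self, rows):
        return "<ul>" + "".join(f"<li>{r}</li>" for r in rows) + "</ul>"
```

Note that both versions solve the same problem; cohesion is about how the responsibilities are partitioned, not about what the software does.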

Of course, the goal is to have low coupling and high cohesion, as this seems to give the most understandable and maintainable code structure. It can seem as if they are polar opposites, because an increase in software entropy is likely accompanied by both increasing coupling and decreasing cohesiveness. But they are not diametrically opposing characteristics, as it is possible (although less likely) to have high coupling and high cohesion at the same time.

Some coupling metrics only look at software units in a pairwise fashion. But coupling can be such that a change in one place requires changes in multiple places in the code—in order to maintain consistency of naming, or to pass new parameters across process boundaries, to name just two examples. (Although I disagree with this post's conclusion, the author's explanation of coupling's ripple-effect nature is worth reading.) If you could measure the strength of coupling, combined with the distance over which every instance of every possible change would propagate, you might have the beginnings of a useful coupling metric. Of course, this is not feasible, and coupling has to remain somewhat subjective rather than rigorously measurable.
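To see why pairwise measures fall short, here is a purely illustrative sketch of the ripple-distance idea (the module names and the dependency graph are invented for the example): a breadth-first walk counts how many modules a single change can reach, which any pairwise metric would undercount.

```python
from collections import deque

# Hypothetical graph: module -> modules that must change when it changes.
DEPENDS_ON_ME = {
    "db_schema": ["dao", "api"],
    "dao": ["service"],
    "api": ["client"],
    "service": [],
    "client": [],
}

def ripple_reach(start):
    """Count how many other modules a change to `start` can force to change."""
    seen, queue = {start}, deque([start])
    while queue:
        module = queue.popleft()
        for neighbor in DEPENDS_ON_ME.get(module, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return len(seen) - 1  # exclude the module that changed

# A schema change ripples to four modules; a change to a leaf ripples to none.
```

A real metric would also need per-edge coupling strengths and change probabilities, which is exactly why the paragraph above concedes the measurement is not feasible in practice.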

Since not every change to code is equally likely, another factor that should ideally be taken into account is the probability of potential changes getting made to an application. Tightly coupled code that is rarely touched doesn't cost that much extra. But as above, measuring this is not practical.

Metrics can be misused in many ways, one of the worst being that of comparing different developers' work. This is a misuse, because unless two developers write solutions to the exact same problem, their coupling/cohesion metrics are not comparable. Even if you had a way to normalize metrics for application size (and please tell me you would not just divide by lines-of-code), different problems have different amounts of essential complexity. This, in turn, has to affect the coupling in an application. Therefore, to judge someone harshly because their application is more highly coupled is short-sighted, to say the least.

Artificial attempts to reduce coupling can also be misguided. Dependency Injection (DI) is a popular technique that supposedly reduces coupling. But if you diagram an application before and after DI—being sure to include every real link—you will see that the amount of coupling is actually higher with dependency injection. (See one list of DI's disadvantages.) This is a case where a shallow but "cool" solution to a deep problem actually makes things worse.
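The link-counting exercise can be sketched in a few lines (a deliberately simplified example with invented names, not a claim about any particular DI framework): direct construction creates one link, while the injected version trades it for links to an abstraction plus the links the wiring code needs to every concrete class.

```python
# Before DI: one compile-time link, Service -> Repository.
class Repository:
    def load(self):
        return "data"

class Service:
    def __init__(self):
        self.repo = Repository()                   # link 1

# After DI: the direct link is gone, but count what replaced it:
#   InjectedService -> AbstractRepository          (link A)
#   Repository is expected to satisfy the abstraction (link B)
#   the wiring code -> InjectedService AND Repository (links C, D)
class AbstractRepository:
    def load(self): ...

class InjectedService:
    def __init__(self, repo: AbstractRepository):  # link A
        self.repo = repo

# The "composition root" (or container) must know every concrete class:
def wire():
    return InjectedService(Repository())           # links C and D
```

The dependency is not removed; it is relocated to the wiring layer, and the total number of edges in the diagram grows.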

Even refactoring a class into several smaller classes can be a bad idea if the total amount of coupling (or even the amount of code) in the system has to increase. (After all, it's still solving the same problem as before you refactored.) Research into coupling has shown that even well-structured applications seem to naturally contain a small number of modules that have to be highly coupled. This suggests that we should not expend great effort to blindly follow simple rules, while ignoring what those rules are forcing us to do to our code.
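As a small illustration of that trade-off (again with hypothetical names): when the steps of a computation live inside one class, the connections between them are internal and free; splitting them out makes each connection a new cross-class reference that now counts toward the system's total coupling.

```python
# One class: the steps are internal; no cross-class references exist.
class OrderProcessor:
    def process(self, items):
        total = sum(items)                         # pricing
        total *= 0.9 if total > 100 else 1.0       # discounting
        return round(total, 2)                     # rounding

# Split version: the same problem, solved with two brand-new
# cross-class references (OrderPipeline -> Pricer, -> Discounter).
class Pricer:
    def total(self, items):
        return sum(items)

class Discounter:
    def apply(self, total):
        return total * (0.9 if total > 100 else 1.0)

class OrderPipeline:
    def __init__(self):
        self.pricer = Pricer()                     # new link 1
        self.discounter = Discounter()             # new link 2

    def process(self, items):
        return round(self.discounter.apply(self.pricer.total(items)), 2)
```

The split may still be worthwhile for testability or reuse, but the point stands: the smaller classes must be wired back together, and that wiring is coupling.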

If software has timeless problems, we should expect the solutions to be equally timeless. Software engineering would be better served if we kept our minds on deeper principles like coupling and cohesion, instead of the latest trendy language, framework, or development fad. We should focus our energy on structuring and then growing software in ways that actually have a large and long-term positive impact on its cost to build and maintain.

