Stop Anticipating Change
Anticipate or accommodate: these are the two ways developers traditionally approach future change. Accommodating change is better than anticipating it, and here's why.
I generally find that developers engage in one of two different activities when it comes to making future changes: they either anticipate or accommodate.
Anticipating change comes from a good place, a place of caring, but it can be ineffective and stressful. I spent much of my career trying to anticipate code changes so that my software would be able to handle them if they ever came up. This was a losing battle: change always came from a direction I hadn't anticipated. Anticipating change is no fun. It's fraught with stress, and we know we're going to be wrong at least some of the time.
Rather than anticipate change, developers must rely on a series of standards and practices that support accommodating change when it needs to happen.
There are several advantages to this. First, we don't make changes sooner than needed. A great deal of a developer's time is lost to overbuilding and gold-plating, simply because they don't know how their code is going to be used. But if they get clear on that by defining a good set of acceptance tests, then they know they can move on when those tests pass, and if they later need additional functionality, they can address it then.
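As a minimal sketch of that workflow (the function and its acceptance criteria here are hypothetical, not from any particular project): write tests for the behavior actually required today, implement just enough to make them pass, and stop.

```python
# Hypothetical acceptance criteria: orders of $50 or more ship free;
# all other orders pay a flat $5. Nothing else is specified, so
# nothing else is built -- no tiers, no regions, no currencies.

def shipping_cost(order_total: float) -> float:
    """Just enough implementation to satisfy today's acceptance tests."""
    return 0.0 if order_total >= 50 else 5.0

# The acceptance tests define "done": when they pass, we move on.
assert shipping_cost(75.00) == 0.0   # qualifying order ships free
assert shipping_cost(50.00) == 0.0   # boundary case is covered
assert shipping_cost(10.00) == 5.0   # small order pays the flat rate
```

If a future requirement adds, say, regional rates, the tests make it safe to change the implementation then, rather than guessing at that design now.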
One of the main reasons developers don’t have time to do important things is that they’re wasting a lot of their time on unimportant things.
I believe that the software development industry must have a series of practices to help us accommodate change. Such practices do exist, but they are not well known. They are not taught in schools. They are not sung from the rafters.
But they should be.
I have devoted my life to discovering the core pieces of knowledge that great software developers must possess. What I found is that there aren't very many of them, and that they are easy to learn. What holds us back is the lack of a shared context for software development. In other words, different developers have different ideas about what software development should be, and because we all have different goals and objectives, it's very difficult to be aligned on principles and practices.
By shared context, I mean: What’s the purpose of software development, or its overarching goal?
Research shows that 80% of the cost of software is incurred after the initial release. Given the enormous cost of building software and the even greater cost of maintaining and extending it, I believe the context of software development must be that software has to provide value not only now but also into the future. In order to provide that longevity, we must build software in a very different way than it is mostly being built today. We must build software to be changeable.
And we can.
The practices I wrote about in Beyond Legacy Code: Nine Practices to Extend the Life (and Value) of Your Software, and teach in my classes, all support the notion of building maintainable software. It's not hard to do, but we have to pay attention to it.
When these ideas start making their way into college curriculums and become part of what it means to be a professional software developer, the industry will really take off and every other industry will benefit from this well-defined context.
Published at DZone with permission of David Bernstein, DZone MVB. See the original article here.