We'll make a little journey into the core XP values as defined in Extreme Programming Explained. After all, practices change all the time, while values are a more fundamental part of a methodology.
I'm not here to say I practice a particular methodology by the book, or that you should; in my opinion complex systems like a software project need a constant tailoring of solutions with respect to their context. That said, values give us a glimpse of the ideal that XP strives towards.
There is a well-known maxim in XP and Agile circles, when it comes to providing a solution to a software problem:
Do the simplest thing that can possibly work
We'll talk about the complexity of a system later, but first let's note what every [software] engineer can point out: simple solutions are not the same as easy solutions. Simple solutions are the ones that show more behavior with less code: it's the difference between a single dense procedure 100 lines long and a collaboration of a few objects, each 20 lines long (with length as a rough but effective measure of complexity).
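As a toy illustration (all names here are hypothetical, not from the project described below), compare a dense procedure with the same behavior split across a few small collaborating objects, each testable in isolation:

```python
# Dense version: one procedure doing parsing, filtering and formatting at once.
def report_dense(raw):
    parts = raw.split(",")
    total = 0
    for p in parts:
        p = p.strip()
        if p.isdigit():
            total += int(p)
    return f"total={total}"


# Simple version: each object has one responsibility and can be tested alone.
class Parser:
    def parse(self, raw):
        return [p.strip() for p in raw.split(",")]


class Summer:
    def total(self, tokens):
        return sum(int(t) for t in tokens if t.isdigit())


class Formatter:
    def render(self, total):
        return f"total={total}"


def report_simple(raw):
    return Formatter().render(Summer().total(Parser().parse(raw)))
```

The total line count is similar, but in the second version no single part is dense, and each collaborator can be exercised and replaced on its own.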
For example, I recently worked on a feature that lets the application request billings of a user and synchronize them later with the ones actually performed by an external entity (in our case, a mobile carrier). The key part of this feature was a little engine that received messages from both the application and the carrier.
The first iteration of this solution looked like 4 objects in collaboration, using 2 MongoDB collections. After refactoring and a bit of emergent design, we came up with 3 objects using just one collection. Not only was the second solution simpler, it also required more time and reasoning to arrive at; and it was more robust than the original, easy one, as the potential for race conditions was much lower with a single collection than with multiple ones.
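A minimal sketch of the single-collection idea (the names and message shapes are my assumptions, with a plain dict standing in for the MongoDB collection): both kinds of messages update the same record, keyed by billing id, so there is no window in which two separate stores can disagree.

```python
class BillingEngine:
    """Reconciles billing requests from the app with carrier confirmations."""

    def __init__(self):
        self.billings = {}  # stand-in for a single MongoDB collection

    def _record(self, billing_id):
        # One record per billing, created on first contact from either side.
        return self.billings.setdefault(
            billing_id, {"requested": False, "confirmed": False}
        )

    def on_app_request(self, billing_id):
        self._record(billing_id)["requested"] = True

    def on_carrier_confirmation(self, billing_id):
        self._record(billing_id)["confirmed"] = True

    def status(self, billing_id):
        record = self.billings.get(billing_id)
        if record is None:
            return "unknown"
        if record["requested"] and record["confirmed"]:
            return "synchronized"
        return "pending"
```

Because both message handlers touch the same record, the engine gives a consistent answer regardless of the order in which the two messages arrive.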
Solutions with fewer moving parts
During the implementation of the same user story, we also adopted what is becoming a standard solution in the project: using a little persistence mechanism during OAuth and other 3rd party authentication flows that require the user to be redirected out and then redirected back. The goal is to overcome limitations on the information that can be carried in the URL, since these proprietary APIs often let you pass just a single value.
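The mechanism can be sketched as follows (a hedged example: the store, the key format and the payload fields are my assumptions, with a dict standing in for the real persistence layer). Before redirecting the user out, the payload is saved under an opaque key, and only that key travels through the third-party redirect:

```python
import uuid


class RedirectStateStore:
    """Persists arbitrary state across an OAuth-style redirect round trip."""

    def __init__(self):
        self._store = {}  # stand-in for a real persistence mechanism

    def save(self, payload: dict) -> str:
        key = uuid.uuid4().hex  # opaque single value that fits in the URL
        self._store[key] = payload
        return key

    def load(self, key: str) -> dict:
        # pop() makes each key single-use, so a stale key cannot replay state
        return self._store.pop(key)


# Before redirecting out: persist the state, put only the key in the URL.
store = RedirectStateStore()
key = store.save({"kind": "subscription", "product_id": 42})
redirect_url = f"https://provider.example/auth?state={key}"

# When the user is redirected back: recover the full payload from the key.
payload = store.load(key)
```

However many fields the payload grows, the URL still carries a single opaque value, so no ad hoc encoding scheme is needed.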
This solution required a new object to be introduced, but it eliminated all the error-prone and inconsistent encoding schemes we were using before, such as sometimes passing values in the xx_yy form (for single purchases) and sometimes in the xxxx form without underscores (for subscriptions). We introduced additional classes, but the total amount of code needed for the feature decreased.
Here I agree with Alan Kay:
Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.
One of the values in the programmer's ethics should be not to maximize consultancy income by introducing additional components, machines and code. The goals of the development team and of the business should not be misaligned like this: one side trying to get more consultancy and support work to increase billable hours, the other trying to get value delivered frequently and reliably.
The 4 rules of simple design from the XP tradition speak about simplicity too. A design should, in this order:
- pass all tests
- contain no duplication
- express all the concepts needed
- minimize the number of classes and methods (and of everything composed from them)
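As a tiny, hypothetical illustration of the second and third rules, two near-identical functions collapse once the duplicated concept is given a name:

```python
# Before: duplicated discount logic; the concept "discounted price" is implicit
# and repeated in each function.
def price_for_member(base):
    return round(base - base * 0.10, 2)


def price_for_coupon(base):
    return round(base - base * 0.25, 2)


# After: the duplication is gone and the concept has a name; the public
# functions remain as thin, expressive entry points.
def discounted(base, rate):
    return round(base * (1 - rate), 2)


def price_for_member_v2(base):
    return discounted(base, 0.10)


def price_for_coupon_v2(base):
    return discounted(base, 0.25)
```

The tests that passed before the refactoring must keep passing after it; that is what makes the first rule come before the others.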
These rules keep functionality at the center, in contrast with Lean approaches, which call for evaluating the trade-off between features and their complexity costs (but those are Lean values, not XP values). In any case, the use of user stories prompts collaboration with the business users of the software to discover:
- which part of the scope is essential
- which part is not necessary, or just linear (see the Kano model)
- what the value behind the story is, and whether it can be reformulated to provide the same value at a much smaller cost, using less technology
Less mass in a design leads to more robustness (fewer gears in the machine that can break) and more maintainability (it's easier to change less code). The physical metaphor here is that the fast-growing mass of a project, and its inertia, make it difficult to steer around.
What about bugs?
Once you reach terminal velocity while falling like a rock (fixing bugs all the time), it's very difficult to brake and turn back up.
As C.A.R. Hoare put it:

There are two ways of constructing a software design: one way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies.
In simplicity-oriented software, bugs can be replicated with unit tests and fixed as they happen. The separation of parts means you can write unit tests instead of performing end-to-end tests, and it's easy to spot the bugs. The time needed to produce a fix stays very short.
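Replicating a bug with a unit test before fixing it can look like this (both the parser and the bug are hypothetical, invented for illustration):

```python
import unittest


def parse_amount(text):
    """Parses a string like '12.50 EUR' into cents.

    Hypothetical code under repair: imagine the original version crashed on
    input with leading whitespace. The bug is first pinned down by the test
    below, then fixed by the strip() call here.
    """
    value, _currency = text.strip().split(" ")
    return int(round(float(value) * 100))


class ParseAmountTest(unittest.TestCase):
    def test_replicates_the_reported_bug(self):
        # The bug report: input copied from an email had leading spaces.
        self.assertEqual(parse_amount("  12.50 EUR"), 1250)

    def test_plain_input_still_works(self):
        self.assertEqual(parse_amount("12.50 EUR"), 1250)
```

Because the parser is a small isolated part, the failing case can be reproduced in a unit test in seconds, with no end-to-end setup.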
It's often said that legacy code is bug-free but difficult to change, because without tests new bugs will be introduced by any modification. I'll break it to you: legacy code already contains bugs. However, there is a bias at work: the code is in a form that keeps you from detecting those bugs, since it's so difficult to understand. Refactoring the code into simpler parts exposes bugs that were once impossible even to think of. Clean code is difficult to write in the sense that bugs cannot hide in it, while legacy code can sweep them under the carpet for months.
What about extensibility?
One of the assumptions of Agile methods is that while creating a solution we know very little about its possible future evolution. So we wait until the evolution is actually needed before developing the seams for it. Meanwhile, we keep the design as small and clean as possible in order to 1) not waste work on directions that may never be taken and 2) be able to steer in the future direction when it becomes necessary, instead of having 100K lines of code to drag around every time a requirement changes.