Stand back, I'm going to try science!
(the title of course references xkcd)
These days in Italy, the creators of the Stamina method for the treatment of neurological diseases are under fire from the media, which has finally discovered they're killing people. Here's a brief recap and an explanation of what this has to do with our fashion-driven software industry.
A psychology professor, Davide Vannoni, supposedly cured his partial facial paralysis by attempting an unproven stem cell treatment in Ukraine. He then brought the "cure" to Italy where, due to unclear pressure from managers in Lombardia's healthcare system, it was adopted in a dozen cases.
The ingredients of the Stamina case:
- A method kept secret and unproven by any scientific research.
- A "scientific" publication that copy-and-pasted images from existing papers.
- Vannoni using his credentials as a psychology professor to pass himself off as a medical doctor.
- The cure introduced in a public hospital, probably under the pretext of expanded access, without any clinical trial in progress.
- National television and related media pressuring for permission to use the cure in public hospitals.
- Parliament allocating 3 million Euros for trials due to public pressure, trials then blocked (for now) by a scientific commission of the Health Ministry citing "no scientific consistency" and "danger for patients".
Here are several statements made by Vannoni in a recent interview to make his case. With an extreme comparison (we're talking about the lives of profoundly ill patients here), allow me to port them into our environment, as the reasoning is reminiscent of methodology adoption arguments.
What are the risks for the patients?
Only the ones related to the sudden interruption of the cure, which does not depend on us. No one has ever suffered from side effects.
Pattern: the risks are only there if you stop doing X, because someone else decides it. No one has ever been harmed by X.
In software: the risks are only there if you stop following Waterfall/RUP/Scrum/XP/Devops/Kanban, because the CTO decides it. No one has ever been harmed by following Waterfall/RUP/Scrum/XP/Devops/Kanban.
Many patients improve after the first treatment, no one is worse off and many remain alive for long.
Pattern: X improves things after a few weeks, no one is worse off, and X remains in place for a long time.
In software: many teams improve after adopting Waterfall/RUP/Scrum/XP/Devops/Kanban for a few weeks. No one ever suffers, and they continue doing Waterfall/RUP/Scrum/XP/Devops/Kanban for a long time.
That patient died of pneumonia after being forced to stop the treatment, while his relatives begged us to continue.
Pattern: that adverse effect was due to having stopped doing X.
In software: that failure was due to the team having stopped doing Waterfall/RUP/Scrum/XP/Devops/Kanban (changing its rules / not adapting it to their environment).
We like to compare ourselves to doctors, but our industry often has neither the rigor nor the methodology to perform trials and studies on new ideas to test their effectiveness experimentally, as science requires. Subfields vary in their rigor:
- Computer science is sometimes considered a branch of mathematics. Proofs establish the validity of ideas: quicksort's average case is provably better than bubblesort's (their worst cases are both quadratic).
- Systems engineering is also strong, supported by performance measurements. In papers from Facebook and Amazon you read about reproducible experiments that confirm their results by serving millions of test requests on a given architecture.
- Machine learning is similar: there are validation and test sets to decide on the effectiveness of new algorithms, even when the questions are not black and white, as in computer vision.
- Organizational theories sometimes have models on which simulations can be run. For example, Theory of Constraints lets you move several hundred work items through a system to study what happens when you change things around (such as moving bottlenecks). Whether the model and its parameters apply to reality is another matter: drugs can be tested in vitro, but they must still pass human trials before being approved.
- Methodologies... are mostly based on fashion. But don't tell anyone that The Three Ways Of Devops are an incomplete subset of Lean.
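The Theory of Constraints point above can be sketched in a few lines. This is a minimal, hypothetical simulation (the stage names and capacities are invented for illustration): work items flow through a serial pipeline, each stage processes a bounded number per tick, and the queues show where work piles up.

```python
def simulate(stage_capacities, n_items=500, ticks=200):
    """Push n_items through serial stages; each stage processes at most
    its per-tick capacity. Returns (finished items, queue left in front
    of each stage) -- the stage where work piles up is the bottleneck."""
    # queues[i] holds items waiting for stage i; the last slot holds finished items
    queues = [n_items] + [0] * len(stage_capacities)
    for _ in range(ticks):
        # process stages back to front so an item advances one stage per tick
        for i in reversed(range(len(stage_capacities))):
            moved = min(queues[i], stage_capacities[i])
            queues[i] -= moved
            queues[i + 1] += moved
    return queues[-1], queues[:-1]

# Hypothetical 3-stage pipeline: analysis (5 items/tick), dev (2), test (5)
done, leftover = simulate([5, 2, 5])
# dev is the bottleneck: throughput can never exceed 2 items/tick,
# and unfinished work accumulates in front of it (leftover[1])
```

Raising the capacity of any non-bottleneck stage leaves throughput unchanged; only raising the dev stage's capacity moves the constraint elsewhere, which is exactly the kind of what-if the model lets you test before touching a real team.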
I don't know if clinical-like trials testing the effectiveness of a methodology are even possible, given the rate of change in our industry and the fact that we are dealing with highly non-linear systems: people and their teams. Consider, however, that XP has been around for almost 15 years and there is still no full agreement on whether it works. Surely 15 years is enough for conducting a trial?
The evidence for many of the concepts we preach every day is anecdotal, especially as they move from the realm of machines and programs to that of people and their interactions. If we want our industry to reach the same level of importance as the medical field, we need to substantially improve some of our methods to include experimentation and clinical-like research.
This is already true in certain settings where it's easy to do, such as performance measurement, and on a small scale, such as trying out a single new practice within a team. Repeatability of results, however, remains rare.
We say it's only a matter of time before quality of life, and life itself, critically depend on software systems. Yet our methods for producing them sometimes resemble wizardry and superstition: in some fields of the software industry, everyone is a Vannoni.