How Is Software Like Cooking?
You might find yourself pondering a nice cut of meat and thinking about code.
Time for a light-hearted post. After my move to the UK — and having had my share of fish and chips — I have become, in reaction, more interested in Italian culinary history and practice. So I started diving into the science and the tradition of cooking, reading books such as "The Science of Meat" that combine chemistry and good taste, and I have now cooked enough lasagna to build a statistically significant sample.
Disclaimer: this post is full of meat references. That's culturally significant as a metaphor to transmit the concepts I have in mind. You may find this distasteful if you have chosen to follow a different path.
So here are five ways in which software development and cooking are alike.
You Need a Feedback Loop
There's a joke in a Futurama episode about Bender (being a robot) not having a sense of taste and hence playfully disgusting the humans in the team with too much salt. The joke works because in cooking you need a continuous feedback loop to conform to your taste — for example, adding salt and pepper at the end of a preparation until it tastes right.
We are no strangers to this process in software development: most of the practices I preach about lead to getting working software in front of someone who will use it as soon as possible, to better steer future development with their feedback.
In the kitchen, there are even shorter feedback loops than tasting: the pace at which meat is browning will make you adjust your source of heat in that phase of preparation to avoid charring the surface pitch black. Not too different from your unit tests failing, informing you of an issue well before it reaches an actual customer who will send the steak back to the kitchen.
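To make the analogy concrete, here is a minimal sketch of that early safety net in Python; the `doneness` function and its temperature thresholds are invented for illustration, not taken from any real codebase:

```python
# A toy "doneness" check: map a steak's core temperature (Celsius) to a label.
def doneness(core_temp_c: float) -> str:
    """Classify a steak by its core temperature. Thresholds are illustrative."""
    if core_temp_c < 50:
        return "rare"
    if core_temp_c < 57:
        return "medium-rare"
    if core_temp_c < 65:
        return "medium"
    return "well-done"

# A unit test fails long before a customer sends the steak back to the kitchen.
def test_doneness():
    assert doneness(45) == "rare"
    assert doneness(54) == "medium-rare"
    assert doneness(70) == "well-done"

test_doneness()
print("all tests passed")
```

If someone later nudges a threshold the wrong way, the test flags it in seconds, which is the whole point of the short feedback loop.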
Quality Is in the Eye of the Beholder
Taste has lots of different components, involving not just what your tongue perceives, but also how something smells, its presentation, and your expectations. For most of these aspects, quality is in the eye of the beholder. A chef can't avoid coming to grips with the variety of people and cultures in the world.
Despite how good you think your burgers are, some people just don't like fatty minced meat. I appreciate Indian curries, but I have some physical limits on the spiciness levels that make me almost always choose mild korma. And imagine the cultural shock of discovering I have been wrongly putting lemon in tea all my life, instead of milk as the only acceptable choice. (I know, tea doesn't even grow in Europe, but I grew up with British tea as the standard.)
We all have good intentions in thinking hard about what a user will enjoy or be productive with, but we have to recognize there is a vast variety of users and we have to design (or cook) for each of them.
Control the Process, Rather than Micromanaging the Material
Convection ovens are a great example of a controlled process for cooking uniformly. In the context of large pieces of meat or fish, this mainly means getting them to a uniform, high-but-not-too-high temperature to avoid overcooking. The oven fan pushes hot air all around them, heating the surface evenly. Air transfers less heat than, for example, water. There is time for the temperature to rise uniformly across your roasting chicken — rather than overcooking the outside and leaving other parts dangerously raw.
For large cuts, this process is pretty much impossible to achieve in a pan, unless you cut everything into slices thin enough to cook quickly. The oven-based process is much more convenient. You literally abandon your tray in there, checking from time to time if it's ready with a thermometer.
Generally speaking of enterprise applications and websites, I favor a process in which we catch bugs with multiple safety nets (up to user experimentation, if possible), rather than overdesigning for every possible problem. While you can endlessly imagine scenarios to test, bugs are always going to happen; what matters is having a process in place to fix them and, with new automated tests, never regress. That makes your software converge to a steady, stable state, like a perfectly cooked chicken.
You Need to Measure
If you want to consistently cook meat to your liking, you can't escape using a thermometer to understand when it's ready. The center must reach a set temperature that corresponds to medium-rare (for a steak), well-cooked (for poultry), or something else. Looking at the external color? No relation to the inside. Checking how hard it has become? Too subjective. Roast for a certain amount of time? Ignores the variability of both the ingredients and the heat source.
When we perceive part of an application as slow, we need to use a profiler to find out what functions or methods are taking the most time to execute. Making as few assumptions as possible, we collect data to point us in the right direction. Opinions don't count: your browser timings and other metrics do (if collected correctly).
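As a hedged sketch, Python's built-in cProfile module produces exactly this kind of data; `slow_sum` here is a made-up stand-in for whatever code path feels slow:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately naive: repeated string concatenation makes this quadratic.
    s = ""
    for i in range(n):
        s += str(i)
    return len(s)

# Profile the suspect code path instead of guessing.
profiler = cProfile.Profile()
profiler.enable()
slow_sum(10000)
profiler.disable()

# Report the functions that consumed the most cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The report ranks functions by measured time, so the decision about what to optimize rests on data rather than opinion.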
You Can Substitute Ingredients, to a Certain Extent
Corn starch and flour are used in small quantities in many recipes, with the goal of thickening a liquid. This is due to their starch content, as the carbohydrate granules swell up with water, creating enough friction to transform a liquid with the viscosity of water into something that feels like cream.
If you try to use corn starch to make bread, however, you won't be able to get an elastic product. Corn starch lacks the proteins that would build gluten. Even if you use the wrong flour (cake flour as opposed to bread flour, to keep it simple), this will greatly affect the result due to the smaller percentage of proteins that it contains. Baking, both for sweet and savory goods, requires much more precision.
In software development, we have grown up with Lego bricks as a metaphor. We continuously try to swap out pieces, hiding details behind a useful abstraction that sometimes leaks. Nowadays relational databases can be queried interchangeably if you stick to standard SQL queries. But the data types for columns can be pretty different in the range they support, especially if they are somewhat more exotic — like JSON and XML fields rather than integers and strings. A wise decision is still required to understand when substituting components is possible, or when some combinations will never work.
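A small sketch of the Lego-brick ideal, using Python's standard sqlite3 module: the plain SQL below would run unchanged on most relational engines, while the table and data are invented for the example; it's the dialect-specific column types (JSON, XML, arrays) where the abstraction starts to leak.

```python
import sqlite3

# Standard SQL (CREATE TABLE, INSERT, SELECT) is largely portable across
# engines; exotic column types are where substitution breaks down.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dishes (name TEXT, spiciness INTEGER)")
conn.executemany(
    "INSERT INTO dishes (name, spiciness) VALUES (?, ?)",
    [("korma", 1), ("vindaloo", 5)],
)
rows = conn.execute(
    "SELECT name FROM dishes WHERE spiciness <= ? ORDER BY name", (1,)
).fetchall()
print(rows)  # [('korma',)]
conn.close()
```

Swap the connection for another driver and this query still works; add a JSON column and each engine suddenly has its own opinion.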
And here are five ways in which software development and cooking are very different.
Cooking Is a Repeatable Process
Recipes (at least the good ones) are the codification of a process that resists external variations — the goal is to produce a consistent result. It's the mark of a good cook to be able to deal with variations in ingredients or tools, but unless you're up on a mountain, water boils at pretty much the same temperature, and the physical transformation that your carrots undergo when they are heated is well-established.
In software, every new feature is a new design to make, rather than the execution of a plan. Even porting software or reimplementing it bears surprises, as the platform it's running on is now different. And no one understands how long it took to produce the original version, much less how to give an estimation for the new one being created. We have processes for understanding what a feature should do, and safely implementing it, and rolling it out — but there are always land mines waiting on the path.
Cooking Has Some Precise Physical Quantities You Can Rely On
Understood: measurement is needed in both fields. But as much as your oven oscillates around its target temperature, it is still much more precise than a developer's effort. Even without meetings and other time variables, our speed and precision vary on any given day. Humans aren't robots. Our level of knowledge about a technology greatly influences the designing and testing phases. The Mythical Man-Month remains, well, mythical.
In the food industry, the right tools can even measure the strength of a flour to check whether it's right for the bread you want to produce. If you look at a technology team, measuring how many tasks per week we have completed is probably as good as it gets. There are humans involved, and applying social science to a very small group probably doesn't get you very far in terms of collecting data and drawing inferences.
You can still measure other times objectively, like time to deploy: how long it takes for a commit on master to reach the production environment. We do this partly because it's important, but also because it's one thing that is feasible to measure. Most project managers would care more about the time from idea to complete implementation instead. But that requires estimating the length of a queue that changes all the time, and is just the first step of a creative development process with its own variations.
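As an illustration, measuring time to deploy is just timestamp arithmetic once the data exists; the commit hashes and timestamps below are made up, and in practice they would come from your CI/CD system:

```python
from datetime import datetime

# Hypothetical event log: when each commit landed on master,
# and when it reached production.
events = [
    ("a1b2c3", "2021-03-01T10:00:00", "2021-03-01T10:42:00"),
    ("d4e5f6", "2021-03-02T09:15:00", "2021-03-02T11:05:00"),
]

def lead_time_minutes(committed: str, deployed: str) -> float:
    """Minutes elapsed between a commit landing and it reaching production."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    delta = datetime.strptime(deployed, fmt) - datetime.strptime(committed, fmt)
    return delta.total_seconds() / 60

times = [lead_time_minutes(c, d) for _, c, d in events]
print(f"average time to deploy: {sum(times) / len(times):.0f} minutes")
```

It is objective precisely because both endpoints are machine events; the idea-to-implementation lead time that managers really want has no such clean timestamps.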
Determinism of the Digital World
It's pretty difficult to get the same tomatoes, courgettes, or grapes as last week, and pretty much impossible to get the same ones in-season and off-season. You can ship them in from South Africa or Australia, but travel time and refrigeration can modify their contents, and thus their taste.
If you look at a physical server, it's much more similar to laboratory equipment than to a living product: you can run programs and see them always taking a similar amount of time to complete, controlling the randomness of the operating system around it. This gets eroded a bit in the cloud, where performance may be affected by your neighbors due to co-tenancy.
Timing in the Kitchen
Whether it is simply bringing a meat joint up to temperature, or a more complex transformation like baking a cake, timing has to be one of your concerns if you want to obtain a good result. I formalize the difference this way: in the kitchen, it's not possible to stop time.
Cooks know tricks like cooking eggs or rice to a certain degree, then cooling them down and finishing the process later when the food has to be served — or simply reheating the food if it's fully cooked. This works, but it's an ad-hoc process.
Consider the power we have in the digital world: firing up a debugger literally stops execution at some point in the life of the program, allowing us to look at exactly what we want in the right context. Since the state of the program is essentially the Matrix, we can slow it down, speed it up, and change things, causing déjà vu for your objects.
If you want to reproduce some computation, you can build a Docker image containing all sorts of dependencies and store it for future use. If you want to reproduce your perfect croissants, the only tools you have are a recipe and your own memories. Add the variations of ingredients — and even the temperature and humidity in your kitchen — and you can understand why scientific exploration needs a laboratory with controlled conditions to be able to make progress.
Cooking Equipment Makes a Difference
Basic tools like appropriately shaped knives can clearly change the outcome of your cooking process. A pressure cooker lets you reach results that would take a long time in an ordinary pot of boiling water. A temperature bath (I don't own one of these) can help cook meat evenly, before finishing the process with a two-minute sear. Even a scale is simply necessary for baking, as measuring an ingredient like flour by volume carries a 50 percent margin of error due to its compressibility.
On the other hand, consider how you can write code on your old laptop from the beach. You target an open source interpreter, and the end product will run on the same server that could accept strictly regulated banking software. As long as you can literally string bytes together, you can produce running software: everything else is just "icing on the cake." The ephemeralization of software tools due to virtualization and the large availability of open source platforms make digital startups a reality, whereas opening a restaurant remains a capital-intensive operation.
But there's more.
The Power of Metaphors
Metaphors can foster understanding of a new system, or lead us astray. They powerfully transmit a mental model, but that model has its limitations, and may even be less precise than a more formal one such as a mathematical analogy. Especially in complicated fields like cryptography, though, terms such as key and signature have popularized concepts to generations of students who would otherwise have found them very hard to think about.
I wrote this post for fun, but I stand behind most of the comparisons. That's all for now. You'll find me using Helm to ship my containers...
Published at DZone with permission of Giorgio Sironi, DZone MVB. See the original article here.