Granted, the first successful mainstream language to have exception handling did a crap job with them, but the fact that two of the newest languages completely eschew exceptions seriously makes you question whether evolution really has it right, or if Jimi Hendrix's fable in 1983 is closer to reality: we evolve to the point of spoilage, then we retreat. Whenever I ran into someone who thought Kernighan and Ritchie was a classic, I would argue that it was an ugly lie. Because it sold you the dream of terse elegance, typified in the single-line file copy, the crescendo of their masterpiece: `while (*dest++ = *source++);`. But of course that is fake code, because it doesn't have all the error handling you will need to survive in production. Once that is put in, it will start looking like the Windows source code that made a generation of programmers who got a look at it think about working at a big box retailer instead.
Of course, you could release without the error stuff in there. That's what Microsoft did. Hey, we can put that in later, after the users tell us where it has to go!
Or you could go buy a copy of Steve McConnell's book Code Complete and learn how to program defensively. It basically boils down to 'put in the checks that occur to you, but don't wear yourself out; you can't think of everything, so when crap fails, you can just add new defenses.' Yeah, think about that. Granted, the other extreme, the Maginot Line, is equally stupid, but sheesh.
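To make the criticism concrete, here is defensive programming in that style, sketched in Go. The guards below are illustrative guesses about what might go wrong, not a specification of valid input, which is exactly the point being argued against:

```go
package main

import (
	"errors"
	"fmt"
)

// average guards against the bad inputs its author happened to think
// of. In the McConnell style, more guards get bolted on as failures
// surface in production; the list is never known to be complete.
func average(xs []float64) (float64, error) {
	if xs == nil {
		return 0, errors.New("average: nil slice") // defense #1
	}
	if len(xs) == 0 {
		return 0, errors.New("average: empty slice") // defense #2
	}
	sum := 0.0
	for _, x := range xs {
		sum += x
	}
	return sum / float64(len(xs)), nil
}

func main() {
	avg, err := average([]float64{1, 2, 3})
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(avg) // prints 2
}
```

Nothing here says the function reliably works; it only says it survives the two failure modes someone thought of so far.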
To me, one of the great chapters in the best book on Object Oriented programming, Bertrand Meyer's Object Oriented Software Construction, is the one on exceptions, which does two things:
- completely disassembles the whole idea of defensive programming and how it basically posits code that can never be said to reliably do anything, as it's by definition awaiting its next malfunction at the hands of unanticipated usage
- disabuses you of ever considering the idea that exceptions should be handled in the core of the application (specifically the database)
Meyer basically says: look, you are not going to traverse layers properly, and even if you did, what is the point of taking garbage all the way down to the core only to have to send the obvious 'message' back to the top?
How did exceptions just get dropped? There are a couple of projects that bolt them onto the side of Swift. But the published books on Swift illuminate their preferred path, and Jesus, it is ugly. It's the once-glowing city slurped down into a fiery pit of retro goofiness. Go is not much better. The Go manual makes you think you are being treated to error handling caviar, but its major innovation is that errors are ordinary return values you check by hand after every call. Uh, OK, thank you. That's kind of like being told 'we aren't going to kill you, but we are going to bind your feet and feed you a diet of bugs and dirt.'
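What the Go convention looks like in practice: every call site repeats the same three-line check, and nothing propagates unless you write the plumbing yourself. The step functions here are hypothetical stand-ins for any sequence of fallible operations:

```go
package main

import (
	"errors"
	"fmt"
	"os"
)

// step1, step2, step3 are hypothetical stand-ins for fallible operations.
func step1() (int, error) { return 1, nil }

func step2(n int) (int, error) {
	if n < 0 {
		return 0, errors.New("step2: negative input")
	}
	return n + 1, nil
}

func step3(n int) error { return nil }

// run shows the Go idiom: errors are ordinary return values, and the
// caller checks each one by hand. Every step costs three more lines.
func run() error {
	a, err := step1()
	if err != nil {
		return fmt.Errorf("step1: %w", err)
	}
	b, err := step2(a)
	if err != nil {
		return fmt.Errorf("step2: %w", err)
	}
	if err := step3(b); err != nil {
		return fmt.Errorf("step3: %w", err)
	}
	return nil
}

func main() {
	if err := run(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

Compare that to a single try/catch wrapping the three calls: the happy path and the failure path are braided together at every step.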
But then maybe this is for the better: the actor guys, specifically the Akka team, make a pretty compelling argument that error propagation in source code is never really going to work very well.
There is syntactic taint, ugly little things you can hold your nose through, and then there are structural abominations so pervasively unsettling that they make you wonder how you would go on. Having written a lot of Swift in 2014, at first I thought I didn't miss exceptions much. But now that I am adapting some of that code, I am changing my mind.
I was going to write a post about objects in general, prompted by a short exchange on Twitter with the great Jim Coplien this week, advancing the argument that most theories of change, from Hegel to Kuhn, are too simplistic, but that surely objects showed that in technology the new way, once it has finally been assimilated, becomes the old way with curtains. That's pretty much what the Go passage shows in spades: 'OK, remember this horrid vestige from your forgotten criminal past? Yeah, you are back here, eating it again, but we doused it with some powdered sugar for ya.'