George Dinwiddie (http://blog.gdinwiddie.com) has written a bit about what he believes. I respect and agree with most of his conclusions, but I do disagree with his attitude on measuring developers' capabilities. I don't want to pick on George, but his are the most recent articles I've read on this topic (http://blog.gdinwiddie.com/2011/01/15/software-craftsmanship/, http://blog.gdinwiddie.com/2011/01/17/trades-crafts-and-certification/), and they make the same mistakes I see everywhere when discussing performance tests. To wit:
G---- is a certified tradesman. G---- did horrible work. G----'s work was certified acceptable by a third party, also certified. Therefore certification doesn't guarantee excellent (or "craftsmanlike") work.

Now here are the fallacies in that argument:
- Certification, by itself, only guarantees that, at least at one point in time, the practitioner knew how to do excellent work (or at least, work up to the level of the certification standard). It doesn't mean (s)he will always do it.
- One (or two, or ten) examples where certification doesn't do what it was intended to do don't mean that certification never works, or even that it seldom works.
- When this sort of argument is made, the certification in question seldom, if ever, covers the issues of craftsmanship under discussion.
- A certified individual, like most people, may not do the same level of work every day. Even master craftsmen have bad days or weeks or even years. In trade work, unfortunately, you can't always throw it out and fix it; there are schedules to meet.
I'd be the last to contend that certification is the only ingredient of a healthy craftsmanship movement or of developing excellent programmers. It's simply one tool--a way to measure oneself, both for one's own edification and to help others gauge one's breadth of knowledge. I think it's a particularly important tool--a pillar, if you will, along with other pillars, such as the community of like-minded individuals against whom you measure yourself, and who can comment on your work.
I think it's important to review what certification means. It usually means "at some point in time, this individual knew and could put into practice certain tools, rules, and practices which, by this certification, we have verified". Usually the certification standard is generated by people who really know the subject, and are qualified to determine what parts of the subject are key to being effective. Certification doesn't say anything about a practitioner's willingness to apply what (s)he knows--that's a different problem, solved in different ways (e.g. establishing and maintaining a reputation, publishing your work for review, etc.).
What I'd really like to see is some kind of actual data comparing the performance of programmers (or plumbers, or electricians) who are certified in some way to those who aren't certified, on tasks related to the certification. I see a lot of opinions (mine included) bandied about without actual, real, verifiable data. One of the things that makes DeMarco and Lister (Peopleware) compelling reading is that they provide actual data. Real studies with results that aren't anecdotal. They're not just voicing "opinions developed over years of practice"--they (and we) should be conducting actual experiments to verify our theories.
Craftsmanship, in my mind, requires:
- an attitude that it's worthwhile to create excellent work, to provide the necessary stimulus,
- access to people who can teach and/or materials from which a practitioner can learn what that means and how to accomplish it, to provide the necessary knowledge and experience,
- some standard ways to measure a practitioner's progress from neophyte to mastery, to provide a mechanism for the practitioner to understand where (s)he is and where (s)he needs to go, and
- some mechanism for displaying the practitioner's achievements, so that mastery carries real benefits in the working world, thereby incenting mastery in others, and helping people outside the craft or trade to select workers appropriate for their tasks.