Defects are common, but they are not inevitable. They find their way into code for many reasons.
Defects are only corrected by understanding code pathways, and debuggers are not the best way to do this.
Debuggers are commonly used by developers to understand a problem, but just because they are commonly used does not make them the best way to find defects. I'm not advocating a return to "the good old days," but there was a time when we did not have debuggers and we still managed to debug programs.
The absolute best way to remove defects is simply not to create them in the first place. You can be skeptical, but practices like the Personal Software Process (PSP) have been used in practice to prevent 1 of every 2 defects from getting into your code. Over thousands of projects:
The Personal Software Process increases productivity by 21% and code quality by 31% (1)
A study conducted by NIST in 2002 reports that software bugs cost the U.S. economy $59.5 billion annually. This huge waste could be cut in half if all developers focused on not creating defects in the first place.
Not only does the PSP focus on code planning, it also makes developers aware of how many defects they actually create. Here are two graphs that show the same group of developers and their defect injection rates before and after PSP training.
[Graphs: defect injection rates before PSP training | after PSP training]
One of the most enduring ratios in software development over the last 50 years is 1:10:100, which represents the relative cost of finding defects:
- In development (pre-test)
- In QA
- In deployment (i.e. by the customer)
If you are primarily using debuggers to find defects, then you are finding them in QA or deployment, which makes your costs 10 to 100 times those of any pre-test defect removal strategy. Pre-test defect removal strategies include planning, TDD, code inspections, and design by contract.
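As a quick sketch of one of those strategies, here is what design by contract can look like in Python using plain assertions. The `withdraw` function and its contract are hypothetical, for illustration only:

```python
def withdraw(balance, amount):
    """Withdraw `amount` from `balance`, with contract checks.

    The assertions document the contract and catch a defect at the
    moment it is introduced, long before QA or deployment.
    """
    # Preconditions: reject bad inputs where they are injected
    assert amount > 0, "amount must be positive"
    assert balance >= amount, "insufficient funds"

    new_balance = balance - amount

    # Postcondition: the invariant the caller relies on
    assert 0 <= new_balance < balance, "balance must shrink, never go negative"
    return new_balance
```

A violated assertion fails loudly at the call site during development, which is exactly the cheap end of the defect-cost curve.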
It costs 10 to 100 times as much to find defects once they get to QA.
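The 1:10:100 ratio is easy to turn into numbers. A minimal sketch, assuming a hypothetical $50 cost to fix a defect found pre-test (only the ratio between phases comes from the article):

```python
# Relative cost multipliers from the 1:10:100 ratio.
RELATIVE_COST = {"pre-test": 1, "qa": 10, "deployment": 100}

def fix_cost(phase, base_cost=50):
    """Cost to fix one defect found in `phase`.

    base_cost is a hypothetical pre-test fix cost, chosen for
    illustration; the multipliers are the 1:10:100 ratio.
    """
    return base_cost * RELATIVE_COST[phase]
```

Under these assumptions, a defect that would cost $50 to remove during development costs $500 in QA and $5,000 once a customer finds it.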
Using a debugger to understand the source of a defect is definitely one way. But if it is the best way, then why do poor developers spend so much more time in the debugger than good developers? (see No Experience Required!)
That means that for every 2 hours a good developer spends in the debugger, a poor developer spends far longer. No one is saying that debuggers do not have their uses.
However, a debugger is a tool, and it is only as good as the person using it. A focus on tools obscures a lack of skill (see Agile Tools do NOT make you Agile).
If you are only using a debugger to understand defects, then you will be able to remove a maximum of about 85% of all defects, i.e. 1 in 7 defects will always be present in your code.
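The arithmetic behind that claim can be sketched as follows; the ~85% figure follows from "about 1 in 7 defects remaining," and 97% is the inspection removal rate cited in this article:

```python
def defects_remaining(injected, removal_rate):
    """Defects left in the code after applying a removal strategy
    with the given efficiency (0.0 to 1.0)."""
    return injected * (1.0 - removal_rate)

# Out of 1000 injected defects:
#   debugger-driven removal (~85% efficient) leaves ~150 behind,
#   software inspections (~97% efficient) leave ~30.
```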
Would it surprise you to learn that there are organizations that achieve 97% defect removal?
Software inspections take the approach of looking for all defects in code and getting rid of them. Learn more about software inspections and why they work.
Software inspections increase productivity by 21% and code quality by 31% (1)
Even better, people trained in software inspections tend to inject fewer defects into code. When you become adept at parsing code for defects, you become much more aware of how defects get into code in the first place. Interestingly enough, not only will developers inject fewer defects into code and achieve defect removal rates of up to 97%; in addition:
Every hour spent in code inspections reduces formal QA by 4 hours
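Taken at face value, that claim gives inspections a straightforward return. A minimal sketch, assuming the 4:1 figure above:

```python
def net_hours_saved(inspection_hours, qa_hours_per_inspection_hour=4):
    """Net schedule hours saved by inspections, per the claim that
    1 inspection hour removes about 4 hours of formal QA."""
    qa_saved = inspection_hours * qa_hours_per_inspection_hour
    return qa_saved - inspection_hours
```

For example, a 10-hour inspection effort would, under this assumption, net 30 hours of schedule savings.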
As stated above, there are times when a developer will use a debugger correctly. However, if you are truly interested in being a software professional, then:
- You will learn how to plan and think through code before using the keyboard
- You will learn and execute software inspections
- You will learn techniques like the PSP that lead you to inject fewer defects into the code
You are using a debugger as a crutch if it is your primary tool to reduce and remove defects.
Want to see more sacred cows get tipped?
Make no mistake, I am the biggest "Loser" of them all. I believe that I have made every mistake in the book at least once :-)