If you consider yourself a good programmer, or at least think your level is above average, I do not recommend reading this article. It is meant for the managers of software projects. I would like to discuss here some issues that are quite important, but very boring (for programmers), related to the methodology of static code analysis.
I hope this introduction has stopped some programmers from reading. Some may still continue, but that is their free choice. We are going to talk about things that aren't very pleasant for programmers.
Our company develops PVS-Studio, a static code analyzer that looks for errors in programs written in C, C++, and C#. The idea is simple: we run the analyzer and examine the code fragments it found suspicious. In essence, it is a kind of automated code review.
To any manager, and to the programmers themselves, it is clear that high-quality code is better than low-quality code. It is also clear to everybody that the fewer glitches a program has, the better.
But here comes the most interesting part. Although everybody understands the importance of high-quality code, programmers often get furious when they are asked to use static analysis on their code. They act as though their professionalism has been questioned, and they respond with their whole arsenal of negative criticism. Over the years we have collected plenty of brickbats like these:
- "This is a tool for students; in our team we have only experts. That's why if we have errors, they are only related to thread synchronization."
- "We do not use static code analysis as we hire only professionals who are above average."
I think that if a project manager were present during such discussions, quite a few programmers would get an old-fashioned look for their arrogance. Every manager can recall hunting for a bug that turned out to be a stupid typo, or a moment of deep skepticism: if everything is so fine, why does the application keep crashing from time to time? Managers don't see the development process and its problems through the same rose-tinted glasses as the developers do.
Figure 1. Programmers often see things through rose-tinted glasses, convinced that everything is fine.
So I would like to unravel a mystery which, of course, is not a mystery at all. Programmers have trouble estimating their own level of proficiency. The article "The Problem With 'Above Average' Programmers" gives a nice explanation of this effect. Here is an excerpt:
"How would you rate your programming skills? (Below Average, Average, or Above Average)?"
Based on psychological studies across many different groups, about 90% of all programmers will answer "Above Average."
Of course, that can't possibly be true. In a group of 100 people, 50 are above average and 50 are below. This effect is known as illusory superiority. It has been described in many fields, but even if you have never heard of it, you will most likely answer "above average."
This is a problem that prevents programmers from learning new technologies and methodologies. In our case, it hinders a positive attitude toward static code analysis. It is very unpleasant for a programmer to accept that some program will teach them how to code. It is even more unpleasant when the analyzer detects stupid errors in their "perfect" code and makes them publicly visible. It takes some will and wisdom to notice and accept faults and defects in one's own code. It is much easier to write negative feedback about the tool, or about the technology in general, and continue living in a comfortable, closed world.
PVS-Studio finds bugs in the code of projects such as Qt, MSBuild, LLVM, GCC, GDB, and Chromium. The developers of these projects can hardly be rated as below average. Yet this doesn't stop programmers from claiming that their code is of high quality and that static analysis is not relevant for them at all.
In such cases I like to ask: who made those 11,000 errors, if there are only "above average" professionals everywhere? The question is rhetorical, of course, and I'm not waiting for an answer.
I think managers can see what I am getting at. Static code analysis is essential during the development of medium-sized and large projects. It helps to fight a great number of errors and to keep the development process under control in general. Regular checks help detect an abnormal growth in the number of bugs, caused, say, by a programmer who is about to quit and writes sloppy code because he no longer cares but has to pretend he is working. That situation is made up, of course, but a tool that can assess the quality of a project really is necessary, and analyzer warnings are one of the best metrics.
By the way, here is an interesting story on the topic. Quite recently, my colleague checked the CryEngine V project and found a lot of bugs. There were so many high-level warnings that he didn't have the patience to look through all of them. Then, all of a sudden, we learned that Crytek had been having problems and that several programmers had not been paid for more than three months. The unhealthy situation in the company had affected the quality of the code. It was striking to see such a clear relationship.
In general, you should not hesitate to make programmers use static code analysis. It doesn't have to be PVS-Studio; even Cppcheck, which performs simpler checks but is free, would already be great. Programmers may start objecting, so the manager has to remain firm. They often point out that working with the analyzer takes time, because it forces them to review false positives.
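Trying this out costs almost nothing. Below is a minimal sketch, assuming Cppcheck is installed and using a hypothetical file `demo.c` with a deliberately planted bug (both the file and the bug are invented for illustration):

```shell
# Write a tiny file containing a deliberate off-by-one bug.
cat > demo.c << 'EOF'
#include <stdlib.h>
int main(void)
{
    int *p = malloc(10 * sizeof(int));
    p[10] = 0;  /* off-by-one: valid indices are 0..9; p also leaks */
    return 0;
}
EOF

# --enable=all switches on the optional checks (style, performance,
# and so on) in addition to the default error checks.
if command -v cppcheck >/dev/null 2>&1; then
    cppcheck --enable=all demo.c
fi
```

Running such a check from a build script or CI job takes minutes to set up, which puts the "it takes too much time" objection in perspective.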
I cannot help being sarcastic here. Oh, well... Spending a day setting up the analyzer and suppressing the false positives is too much, but sitting and looking for a single bug for a week is totally fine.
Proof: a bug that had consumed about 50 hours of unsuccessful searching was detected by the analyzer in less than an hour.
By the way, it is very simple to integrate the analyzer into the development process when there is no need to go through the errors in the old code first. PVS-Studio can show only the warnings that relate to new or edited code (see Mass Suppression of Analyzer Messages).
Managers should be very skeptical about programmers' objections that they do not need a static analyzer. The programmers themselves may really not need it; the project does. It is one of the methodologies, like unit tests, that increase the reliability of the code and reduce the cost of maintaining it. The sooner errors are found, the better. Static analyzers help detect errors at the earliest possible stage: as soon as they appear in the code.
Figure 2. Please note: some programmers use static analysis in the wrong way.
Another important point: some programmers may claim that few errors were found during testing, so introducing an analyzer is not worthwhile. Don't let them confuse you. Of course there won't be many bugs in working code; otherwise, the program wouldn't work. But finding and fixing bugs can be very expensive. Often, a stream of user complaints and hours of programmers' meditation in the debugger could be avoided by a single run of the code analyzer. The whole point of static analysis is in its regular use, not in occasional runs.
I have heard several times that some programmers run static analyzers only before a release. If anyone does this and claims it is normal, they are in the wrong profession. This is the most incorrect way of using a static analyzer. It is the same as enabling compiler warnings right before the release and keeping them disabled the rest of the time.
Thank you for your attention. I invite everybody who is interested in the methodology of static analysis in general, and in the PVS-Studio analyzer in particular, to contact us. We can help you check your project, set up the analyzer, and show you how to deal with the warnings it issues.