Code Quality Tooling: Pure Player vs All-in-One?

· DevOps Zone ·

When we talk about equipping developers to control code quality, we often step into a passionate debate between operational staff and managers: should we deploy a single tool able to manage a wide variety of technologies and quality domains, or do we need a "pure player" tool dedicated to each technology's specificities? In my humble opinion, I would say "both!" In this article, I am going to discuss the benefits of a single tool that integrates the specificities and coding rules of each technology and quality aspect.

On one side, there are people against the idea, frightened by the aberration of having only one tool to control and manage the code quality of a whole project portfolio. In this group, probably composed mostly of developers, we usually hear sentences like these:
  • My software development project is unique. It deserves a specific tool to control each aspect of the app's quality. For example, FindBugs will be more relevant for controlling the reliability and failure risks of my binaries, while PMD will help me follow common Java best practices.
  • I am addicted to JSLint. I'm at the top of my game with this tool; I just can't use another one. I would be less efficient. So, if you want me to deliver the project on time and with quality code...
  • With an "All-in-One" tool, we will never get the appropriate accuracy of analysis for our “in-house framework” written in Ceylon (ouch!). We will never get the coding rules that highlight real code quality issues.
Of course, on the other side, there are supporters of a single tool that centralizes features and coding rules to control code quality. In this case, the arguments in favor of such an approach are more economic and managerial than purely technical. Nevertheless, they also make sense:
  • A single tool makes it possible to build homogeneous quality indicators that can be compared across projects.
  • A single tool is cheaper to maintain.
So, what is the best approach? In fact (and this is the main point of this article) everybody is right. The key to reconciling both approaches is the tool's integration capability, and that's exactly what Scertify does! Below, we'll walk through five key benefits of deploying a unified code analysis tool.


Leverage the best rules from each tool through a single interface

Today, open-source static analysis tools like PMD, Checkstyle, FindBugs, PHP_CodeSniffer, and JSLint have become references. I believe it would be a mistake to use (or impose) a single commercial tool that did not integrate them: one would deprive oneself of the benefits of several man-years of research and development in this domain. That's why, since version 1.10.1, Scertify integrates five of those tools:
  • PMD, Checkstyle and FindBugs for Java
  • Closure Compiler and JSLint for JavaScript.
Even if some of the tools' rules overlap (e.g. CloseResource from PMD and ODR_OPEN_DATABASE_RESOURCE from FindBugs), integrating those rules makes it possible to leverage a catalog of more than 1,600 rules. Of course, not all of them should be used in a quality profile, or one would go mad trying to correct all of them. Nevertheless, this catalog makes it possible to build a starting quality profile that covers a significant share of the issues related to the various domains of quality. For example, FindBugs' rules are more oriented toward robustness (e.g. BC_IMPOSSIBLE_CAST, J2EE_STORE_OF_NON_SERIALIZABLE_OBJECT_INTO_SESSION, ...), while Checkstyle leans toward maintainability and readability (e.g. AbstractClassNameCheck, ConstantNameCheck, ...). Scertify lets you check all those rules without having to run each tool in succession.
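To make the overlap concrete, here is the kind of defect those resource-related rules target: a resource that is not closed on every execution path. This is an illustrative sketch (the class and method names are invented for the example, and it uses an in-memory reader rather than a real database connection so it can run standalone), not output from any of the tools mentioned.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ResourceLeakDemo {

    // Bad: if readLine() throws, close() is never reached and the
    // resource leaks. Resource-leak rules such as PMD's CloseResource
    // (and FindBugs' open-resource detectors for JDBC objects) flag
    // exactly this pattern.
    static String firstLineLeaky(String content) throws IOException {
        BufferedReader reader = new BufferedReader(new StringReader(content));
        String line = reader.readLine();
        reader.close(); // skipped on exception
        return line;
    }

    // Good: try-with-resources closes the reader on every path,
    // including the exceptional ones.
    static String firstLineSafe(String content) throws IOException {
        try (BufferedReader reader = new BufferedReader(new StringReader(content))) {
            return reader.readLine();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(firstLineSafe("first\nsecond"));
    }
}
```

Both methods return the same result; only the safe variant survives an analysis profile that enables these overlapping rules without warnings.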

Ensure a minimal quality level for all projects

A single tool makes it easy to deploy a minimal yet mandatory rule repository for all projects in the same language. Those rules are easier to apply since their definition and verification are performed by a single tool. The minimal repository, which might contain no more than 20-30 rules at the beginning, can then evolve to progressively integrate more quality controls:
  • Good practices related to specific frameworks (Hibernate, for example);
  • Exception handling;
  • Maximum complexity allowed for a method;
  • Naming conventions used across the company.
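As a sketch of what such a minimal repository could look like, here is a hypothetical Checkstyle configuration fragment covering two of the controls listed above. The module names are standard Checkstyle checks, but the complexity threshold is an illustrative value, not a recommendation from the article.

```xml
<?xml version="1.0"?>
<!DOCTYPE module PUBLIC
    "-//Puppy Crawl//DTD Check Configuration 1.3//EN"
    "http://www.puppycrawl.com/dtds/configuration_1_3.dtd">
<module name="Checker">
  <module name="TreeWalker">
    <!-- Maximum complexity allowed for a method (threshold is illustrative) -->
    <module name="CyclomaticComplexity">
      <property name="max" value="10"/>
    </module>
    <!-- Company-wide naming conventions -->
    <module name="ConstantName"/>
    <module name="MethodName"/>
    <module name="TypeName"/>
  </module>
</module>
```

Starting from a small profile like this, new modules can be added sprint after sprint as the team's quality practice matures.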

A single tool for multi-technology projects

In the J2EE world, and more generally in the software industry, development is not done with a single language. In the same project, Java components work together with JavaScript components, SQL components, and many more. So, having only one tool to control the quality of code in all those languages is a real advantage: there is no need to go from one tool to another to identify problems; everything can be done from the same interface. Just imagine the comfort of using a single tool to control the code of a web application with a JavaScript front-end and a Java back-end that accesses a database through Hibernate.


Coherence between IDE actions and continuous integration reports

A homogeneous quality control between the developer's IDE and the reporting generated by continuous integration guarantees the efficiency of action plans. In many cases, action plans that aim at improving code quality are set up for each sprint. Those action plans are often based on reports generated by tools used in continuous integration (Sonar, Squoring, Scertify...). If the rules used are synchronized with the analysis performed in the developer's IDE (for example, with Scertify's plug-in for Eclipse), it is far easier for developers to correct defects efficiently. The developer does not have to go back and forth between the report and the code to find a violation: everything is integrated directly into the IDE. Tell me, would you prefer an old road map to find your route when a GPS is available?


Reduce training, deployment, and maintenance costs with a single tool

Obviously, using a single tool significantly reduces the training costs required to leverage a solution efficiently. Furthermore, all feature evolutions of the tool are synchronized for all users. Finally, integrating the tool with other solutions commonly used by developers, like Eclipse and Maven, greatly speeds up the learning and acceptance curve. In my experience, these training costs, even if they can be small for a single project, are far from negligible for a whole application portfolio. They also play a key role in the adoption of the tool and the success of the quality approach. It would be a mistake not to take them into account, and this will be the topic of an upcoming series of articles on this blog.

Opinions expressed by DZone contributors are their own.
