Some time ago, I organized a talk on “code quality and why developers should care about it” for my company. In that presentation, I demonstrated some tools that were unfamiliar to the audience of young developers and battle-hardened IT veterans alike. One of these tools drew a lot of questions: Codacy. I will present its features and compare Codacy with SonarQube from SonarSource.
How does the choice of an application quality measurement solution influence the way software teams work?
Disclaimer: This article is not endorsed by Codacy. It reflects my personal opinion and experience.
According to a study commissioned by Codacy, Codacy’s product may improve the perception of code quality by up to 20% and optimize the code review process by up to 30%. We can assume this reflects the disruption brought by new code quality solutions in contrast with traditional ones. The power of the code analysis itself also seems to be a minor criterion in the choice of a solution: interviewed users justify the ROI of such a solution by its ability to ease and optimize the software development process.
Codacy is a company that started out as Qamine with funding from Seedcamp in 2012. Its clients include Adobe, Deliveroo, Intel, and Paypal. More than 30,000 developers use their products daily, more than 1,000 companies use their services, and more than a billion lines of code are analyzed per day.
Here are Codacy’s main features:
- Code review automation.
- Code quality analytics.
- Security code analysis.
- Cluster installation/multiple instances.
The first time I tried Codacy, I signed in using my GitHub account and launched some analysis on my public repositories.
At first glance, we can observe that most of my repositories were analyzed without a single line of configuration. Codacy detects most of what it needs automatically, which is handy.
One of my projects was not analyzed. This project was developed in Groovy using a Gradle build script.
Officially, Codacy supports a long list of languages. According to Codacy, language support also includes several community plugins that work out of the box, such as Shell Script.
For each one of my projects, Codacy provided metrics on the quality of my project.
One of Codacy’s best features is the “per-commit strategy.” Codacy triggered an analysis for every commit of my repository. It even analyzes past commits to chart trends over time. For each commit, we can see the evolution of the quality and the number of created/deleted issues…
The dashboard is clean and really easy to use. The widgets provide the information necessary to assess the current project quality. Both metric and violation statistics are integrated to build a simple qualimetry model (Complexity, Style, Compatibility, Error Prone, Performance, Security, Unused Code). This qualimetry model is specific to Codacy, even though it is close enough to the ISO 25000/9126 specifications. Codacy does not seem to offer any SQALE/ISO or security standard models.
A specific dashboard in Codacy allows developers to evaluate the security characteristics of their projects.
Codacy is strongly mapped to your repository and files. Most of the metrics and violations are associated with your files, as in this screenshot. Abstract representations like packages, modules, and namespaces are not represented, and the analysis tends to become complex for large projects.
Each piece of information is clearly reported, and each file receives a letter grade from A to F.
The new dashboard better represents your code quality data and basically answers two important questions:
- How does my code quality look right now?
- How does my code quality evolve over time?
To answer the first question, you have the project grade (A to F) and the four main metrics (static analysis issues, complexity, duplication, and coverage). Each issue is in one of six categories (Security, Error Prone, Code Style, Compatibility, Unused Code, and Performance). The overall technical debt is also estimated.
A nice feature in Codacy is that duplicate code is easy to track and to follow, like in this screenshot:
An important feature of Codacy is its ability to natively handle branches, tags, and the other important constructs provided by your SCM. There is no need to create several projects to handle branches or to resort to tricky project naming.
Codacy can be used as an efficient quality gate to accept or deny your team's commits or other external contributions. This becomes very handy with its trigger mechanism, which checks the code quality once a push has been submitted. Whether you are using GitHub, GitLab, Bitbucket, or your own repository, Codacy provides integrations to perform a priori checks on your commits and pull requests.
Another interesting feature is that your pull requests can be analyzed and a quality report inserted into the PR discussion.
Codacy also provides an interface to manipulate and configure the rules used to analyze your source code. However, the functionality lacks flexibility (parameters seem to be missing from the rules) and the rule documentation has not been imported from the third-party add-ons.
SonarQube has mostly removed the possibility to evaluate the trend of your code quality (remember the little arrows) and the default charts for reading metric evolution from its default dashboard. While you may customize your dashboard in SonarQube, Codacy provides an interface to efficiently track the evolution of your projects' quality and locate the source of your defects per commit, tag, and so on.
The line graph shows the evolution of every metric over the past 31 days, and the commits that have been made. In addition to that, you can define a standard value for each metric, which represents your expected code quality (e.g. not more than 20% issues, not less than 70% coverage, etc). You can think of it as the quality gate/threshold for your project.
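To make the idea concrete, a standard-value check like this can be thought of as a small pass/fail function over the project's metrics. The sketch below is only an illustration of the concept; the metric names and thresholds are assumptions, not Codacy's actual implementation:

```python
# Minimal sketch of a quality-gate check: compare a commit's metrics
# against per-metric goals, as Codacy's "standard values" do conceptually.
# Metric names and thresholds here are illustrative assumptions.

GOALS = {
    "issues_percent": ("max", 20.0),    # no more than 20% issues
    "coverage_percent": ("min", 70.0),  # at least 70% test coverage
    "duplication_percent": ("max", 5.0),
}

def gate_passes(metrics: dict) -> bool:
    """Return True only if every metric respects its goal."""
    for name, (kind, limit) in GOALS.items():
        value = metrics.get(name)
        if value is None:
            return False  # a missing metric fails the gate
        if kind == "max" and value > limit:
            return False
        if kind == "min" and value < limit:
            return False
    return True

print(gate_passes({"issues_percent": 12.0,
                   "coverage_percent": 81.0,
                   "duplication_percent": 2.5}))  # True
print(gate_passes({"issues_percent": 25.0,
                   "coverage_percent": 81.0,
                   "duplication_percent": 2.5}))  # False
```

A commit that crosses any one of the configured limits fails the gate, which is exactly the behavior you want when rejecting a push or a pull request.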
After digging deeper into Codacy, I think their developers recommend that you include your configuration files directly in your Git repository. The configuration can then be tracked easily through version control (and, as a bonus, shared with your IDE).
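A versioned configuration file might look like the sketch below. The exact keys depend on Codacy's current documentation; treat the engine names and structure here as an approximation of the idea rather than a reference:

```yaml
# Hypothetical .codacy.yml kept at the repository root, so the analysis
# configuration is versioned alongside the code it applies to.
engines:
  pylint:
    enabled: true
  duplication:
    enabled: true
exclude_paths:
  - "tests/**"
  - "docs/**"
```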
SonarQube vs Codacy: An Alternative?
I am a frequent user of SonarQube, and for that reason I wish to share my personal comparison of the two solutions to help anyone hesitating between them.
For a long time, SonarQube has been the easiest way to perform code quality analyses. Some years ago - an eternity in this domain - tools like Squale required a long and difficult process to tune, configure, and finally obtain the code metrics. SonarQube brought a new model, “code analysis as a service,” where all details from the project creation could be either imported from build scripts like Maven or modified a posteriori. This feature has been the great force of SonarQube.
However, this approach also has a major drawback: SonarQube lacks the flexibility to be used as a quality gate. An example? It is possible to use SonarQube to perform analyses per commit, via a clever trick using your CI build server, such as Jenkins. However, SonarQube won’t hold any metrics per commit, tag, or branch. Branches are a rather new feature in SonarQube, and their usage is quite cumbersome, even more so when comparing branches (link1, link2). In the end, technologies like Git or Bitbucket have created the need for tools that produce metrics for each branch, commit, and pull request to help the product owner.
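The per-commit trick mentioned above usually boils down to a CI step that invokes the scanner on every build. The following is a sketch assuming Jenkins and the sonar-scanner CLI; note that the `sonar.branch.name` parameter requires a commercial SonarQube edition, and the host URL and token variable are placeholders:

```shell
# CI step: run a SonarQube analysis for the branch being built.
# sonar.branch.name needs Developer Edition or above.
sonar-scanner \
  -Dsonar.projectKey=my-project \
  -Dsonar.sources=src \
  -Dsonar.host.url=https://sonarqube.example.com \
  -Dsonar.login="$SONAR_TOKEN" \
  -Dsonar.branch.name="$(git rev-parse --abbrev-ref HEAD)"
```

Even with this in place, the server keeps one evolving state per project/branch rather than a browsable history of metrics per commit, which is the gap Codacy fills.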
Security Code Analysis
Security and vulnerability detection is a tough problem to solve with static analysis tools. This niche market is already populated with well-established leaders (Coverity, Fortify, Klocwork). These kinds of tools are really difficult to implement: false positives, lack of information, and the complexity of understanding and fixing the issues are common problems.
Codacy is therefore not a new security code analyzer. Codacy integrates many existing tools and aggregates all their analysis results in one interface. The diagnosis is built from the data produced by the plugins and other tools integrated into Codacy.
Language Coverage and the Power of Static Analysis
At first glance, Codacy's language coverage is quite impressive. Codacy is well integrated with many open-source tools.
SonarQube beats Codacy by offering companies support for proprietary/specialized languages like Cobol and ABAP.
That is a crucial difference between the two solutions. If you are developing legacy software or working in vendor-locked languages, Codacy is probably not your solution. If you want an application portfolio analysis that gives you a score no matter which languages are present in your code, SonarQube will be the best at this task.
However, if you are developing new software with trendy languages (TypeScript, Scala), Codacy is a better fit. SonarQube develops its own code analyzers, usually more powerful than the open-source alternatives, but the consequence is that it is less responsive to new languages and frameworks. Several open-source plugins may extend SonarQube's capabilities, but installing them requires your administrators' approval. That is the main restriction preventing some SonarQube users from analyzing their new projects.
Over successive SonarQube releases, the UX has greatly evolved from a simple but efficient GUI to a highly customizable product allowing software quality managers to build their own dashboards.
However, using SonarQube has become more complex in recent releases (6.x): obtaining the full list of violations and browsing the list of violated rules are use cases that are difficult to reproduce without resorting to the REST API. SonarQube is in the middle of a transition that slightly deteriorates the user experience.
In contrast, Codacy has been designed with a simple but efficient UI that presents your data and indicators plainly. It is easy to use for a development team in an agile context that does not want to spend several hours aligning two widgets.
SonarQube or Codacy? I think they answer two different usages and sets of expectations.
SonarQube (at least until its cloud-based offering matures) has been an on-premise solution, giving companies the power to analyze their projects whatever their organization may be. SonarSource has developed proprietary code analyzers to extend the capabilities of its platform and better fulfill customer needs: security, vendor languages, qualimetry models…
The strengths of SonarQube are proprietary language support, market leadership, and an open-source/proprietary licensing model. Its weaknesses are an expensive/slightly complex licensing model, a complex UI, an on-premise solution, and possibly a lack of responsiveness to the market needs.
Codacy, in comparison, has been designed as a cloud-based solution meant to fit the needs of remote software developers who mainly use GitHub or Bitbucket SaaS services and who work, build, communicate, and deploy in the cloud. Their solution is deeply mapped to the organization of your code repositories.
Codacy’s strengths are its ease of use, its friendly UI, its ability to analyze every contribution instantly, and its lower price. The solution also provides good integration of the available open-source code analyzers.
Codacy’s weaknesses are the lack of integration with other SaaS services (Blackduck, Sonatype, UI/E2E testing SaaS services, or API QoS metrics from AWS API Gateways), the difficulty of contributing to the ecosystem, the relatively small community, and the impossibility of encrypting project information or limiting access to the source code in the UI.
I hope this article has been useful to you. If so, please leave a short comment or share it with your colleagues.