Refactoring: How Do Agile and DevOps Processes Affect Software?
See what research shows about DevOps and Agile's impact on software quality through continuous refactoring.
Agile and DevOps adoption continues to accelerate and scale across organizations, yet the question many executives and researchers are asking is: Are Agile and DevOps practices improving the software itself?
To offer deeper insight, Carmine Vassallo, Fabio Palomba, and Harald C. Gall of the Department of Informatics at the University of Zurich, have released several studies that look at DevOps and Agile outcomes by examining the impact of continuous refactoring on software quality.
Their paper, An Exploratory Study on the Relationship between Changes and Refactoring, analyzes refactoring during the evolution of a software system to help understand which code components are likely to be refactored. In their follow-on paper, Continuous Refactoring in CI: A Preliminary Study on the Perceived Advantages and Barriers, the authors examine how developers perform refactoring and the pros and cons of adopting Continuous Refactoring. Specifically, they dig into the common perception that developers understand the value of refactoring but are reluctant to do it.
The authors further enhanced their findings by adding a qualitative research component that analyzed source code extracted from an open source repository, identifying projects that employed both a Continuous Integration and a Continuous Code Quality platform, specifically Travis CI and SonarQube.
Why Developers Do and Do Not Refactor
The Zurich team identified that removing duplicated code, improving readability, and addressing technical debt were the most popular reasons why developers refactored.
- In 74% of cases, the main reason developers reorganize code is the presence of duplicated code.
- In 96% of cases, the overall readability of the refactored classes is improved, by 48%.
- 46% of the refactored classes had been affected by self-admitted technical debt in their previous versions. In most cases, then, refactoring is a form of compensation for pre-existing debt.
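The top motivation above, duplicated code, is worth a concrete illustration. The following is a minimal Python sketch (all function names are hypothetical, not drawn from the studies): two reporting functions share the same validation logic, and the refactoring extracts it into one helper without changing behavior.

```python
# Before: two report functions duplicate the same validation logic.
def monthly_report(rows):
    clean = [r for r in rows if r.get("amount") is not None and r["amount"] >= 0]
    return sum(r["amount"] for r in clean)

def yearly_report(rows):
    clean = [r for r in rows if r.get("amount") is not None and r["amount"] >= 0]
    return sum(r["amount"] for r in clean) / 12

# After: the duplicated filter is extracted into a single helper,
# so a future change to the validation rule happens in one place.
def _valid_rows(rows):
    return [r for r in rows if r.get("amount") is not None and r["amount"] >= 0]

def monthly_report_refactored(rows):
    return sum(r["amount"] for r in _valid_rows(rows))

def yearly_report_refactored(rows):
    return sum(r["amount"] for r in _valid_rows(rows)) / 12
```

The before and after versions return identical results for any input, which is what makes this a refactoring rather than a functional change.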
They classify refactoring motivations into three areas: code quality improvement, better code comprehension, and avoidance of quality gate failures.
Developer Perceptions on Refactoring
- 29% of the participants were less convinced about the need to continuously refactor the code.
- 16.1% said that Continuous Refactoring is perhaps needed, and 12.9% said that it is not a crucial activity at all.
Their results aligned with previous findings on developers’ “refactor to understand” attitude. Essentially, developers tend to focus on documenting and reorganizing code to improve readability and comprehension rather than on addressing software quality issues or concerns.
The team identified two significant motivations behind developers’ attitudes toward refactoring: (1) risks associated with the restructuring of a portion of source code and (2) effort required to apply the transformation. There’s the belief that “continuous program transformations can decrease the understandability of the overall architecture of the system.”
Refactoring, Automation, and Understandability
While refactoring techniques vary with the languages and contexts developers work in, code refactoring is defined as the process of restructuring existing computer code without changing its external behavior. Refactoring is intended to improve non-functional attributes of the software: it makes code more readable, reduces complexity, and creates a more expressive internal architecture or object model, all leading to more maintainable and extensible software. As Martin Fowler explains, “Refactoring isn’t another word for cleaning up code — it specifically defines one technique for improving the health of a code-base.”
While many techniques qualify as refactoring, such as introducing more abstraction or breaking code apart into more logical pieces, the research shows that the majority of refactoring focuses on improving the names and locations of code elements. The authors propose that one reason is that developers lack proper automated refactoring tools: tools that would help them overcome their fear of breaking the code or introducing bugs while refactoring.
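The name-focused refactoring the studies describe can be sketched in a few lines of Python (identifiers here are illustrative, not from the studies): only the names change, and the defining property of a refactoring, unchanged external behavior, can be checked directly.

```python
# Before: terse names obscure intent.
def calc(d, r):
    return d * (1 + r)

# After a rename refactoring: identical behavior, clearer names.
def apply_interest(principal, rate):
    return principal * (1 + rate)

# External behavior is unchanged, which is what makes a
# transformation a refactoring rather than a modification.
assert calc(100, 0.05) == apply_interest(100, 0.05)
```

Automated rename support in an IDE performs exactly this transformation across every call site, which is why tooling reduces the fear of breakage the authors observed.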
One of the more promising DevOps trends is the increasing adoption of Continuous Integration (CI), a development practice aimed at continuously building new software, which can make identifying bugs and improving code quality easier. “CI has the potential to change the way software code quality assessment and refactoring can be applied in practice.” CI is promising because, through tooling, developers can define and automate software quality gates as part of their delivery pipeline. A software quality gate is a set of constraints, determined by the organization, on the quality of the software, expressed through thresholds on specific metrics. Software quality gates are a well-known way to control software degradation: if a newly committed change fails a quality gate, the developer acts to resolve the failure.
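The threshold logic behind a quality gate can be sketched as follows. This is a hedged, language-agnostic illustration in Python; the metric names and thresholds are assumptions for the example, not SonarQube's actual configuration or API.

```python
# Hypothetical quality gate: each metric gets a kind ("min" or "max")
# and a threshold chosen by the organization.
GATE = {
    "coverage_pct": ("min", 80.0),         # fail if coverage drops below 80%
    "duplicated_lines_pct": ("max", 3.0),  # fail if duplication exceeds 3%
    "new_blocker_issues": ("max", 0),      # fail on any new blocker issue
}

def evaluate_gate(metrics, gate=GATE):
    """Return (passed, failures) for a build's measured metrics."""
    failures = []
    for name, (kind, threshold) in gate.items():
        value = metrics[name]
        ok = value >= threshold if kind == "min" else value <= threshold
        if not ok:
            failures.append(f"{name}={value} violates {kind} threshold {threshold}")
    return (not failures, failures)

# In a CI pipeline, a failed gate would stop the build. Here,
# duplication exceeds its 3.0% threshold, so the gate fails.
passed, failures = evaluate_gate(
    {"coverage_pct": 85.0, "duplicated_lines_pct": 4.2, "new_blocker_issues": 0}
)
```

In practice a platform such as SonarQube evaluates these conditions server-side and reports the gate status back to the CI build, but the decision rule is the same: every metric must stay on the right side of its threshold.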
Should Bad Code Stop the Delivery Pipeline?
While the Zurich team suggests that a lack of effective refactoring tools prevents the broader adoption of Continuous Refactoring, our experience is that there is also organizational resistance to ‘breaking’ the DevOps pipeline or slowing down release cycles. This mindset is driven by current thinking about key DevOps metrics, specifically Deployment Frequency and Lead Time for Changes. As Mr. Fowler points out, these are IT delivery-centric measures. While they have value to an organization, including product-specific measures would create the opportunity to decide when to ‘break’ the pipeline, especially if the software being built does not meet the organization’s stated security, reliability, and overall quality goals.
High-performing organizations are quickly embracing product-based measures and software quality gates within their DevOps pipelines. In IEEE Software Magazine, Fannie Mae documents how automated structural quality analysis within its Agile-DevOps methodology resulted in 21 times (+19,000) more builds per month with half the previous staffing, while delivering a 30-48% improvement in overall application quality and a 28% improvement in team productivity.
While Agile and DevOps have indeed revolutionized how software is developed and delivered, there is much room for improvement in how organizations evaluate these practices and answer the ultimate question of any process improvement: Are we getting better?
As we near the next phase of maturity across the industry, more organizations and business leaders will look for this answer and, of course, ask the next question: How can we get even better?
The Zurich team has started the investigation into how to answer this question, but more work remains to be done. In my next post, I will share how the Zurich team evolved their investigation into Agile and DevOps processes and their impact on software in a study addressing Continuous Code Quality.
In the meantime, I look forward to hearing about your experience with refactoring:
Are your teams doing it?
What is working best?
What obstacles have you overcome? And what remains?
Published at DZone with permission of Pete Pizzutillo. See the original article here.
Opinions expressed by DZone contributors are their own.