Martin Fowler distinguishes prudent from reckless technical debt. Prudent debt results from thoughtful trade-offs, such as omitting input validation and exception handling to get customer feedback sooner. Reckless debt results from ignorance.
We distinguish between prudent and reckless Agile development. Reckless Agile creates a great deal of unintended and unnecessary technical debt by failing to deal effectively with crosscutting goals. Crosscutting goals are those that affect the application architecture or multiple domain functions. Most quality goals and system functions, e.g., logging, are crosscutting. Crosscutting goals are incompatible with iterative development because they cannot be satisfied within a single iteration.
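To see why a crosscutting concern such as logging resists per-iteration treatment, consider a minimal Python sketch (all names hypothetical): the logging policy lives in one shared decorator rather than being re-invented, iteration by iteration, inside each domain function.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("app")

def logged(func):
    """Apply the team's logging policy uniformly: a crosscutting support."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        log.info("enter %s", func.__name__)
        result = func(*args, **kwargs)
        log.info("exit %s", func.__name__)
        return result
    return wrapper

@logged
def checkout(cart):
    # Domain function; its logging behavior comes from the shared decorator.
    return sum(cart)
```

Because every domain function depends on the shared policy, changing it late touches the whole codebase, which is exactly the kind of unintended debt reckless Agile accumulates.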
We define “Prudent Agile” as an Agile hybrid containing two phases. The first phase identifies and plans for crosscutting goals, while the second is “pure” Agile development focusing on domain functions.
The first phase deals with the crosscutting goals that matter to your application, such as reliability, understandability, or response time. It consists of the following steps:
1. Identify “Quality champions” for the team
2. Identify relevant crosscutting goals and their acceptable quality levels during a workshop
Some crosscutting goals are universal, i.e., relevant to most domain functions. These include reliability, response time, modularity, ease of use and learning, and all basic qualities (compliance, sufficiency, understandability, and verifiability).
The remaining (nonuniversal) crosscutting goals are reviewed to identify those that matter to your application.
Traversing a comprehensive quality model will speed this step.
3. Identify supports for the selected quality levels
Quality supports may include: (1) other qualities, e.g., security supports privacy; (2) system functions, e.g., exception handling and logging; (3) rule sets, e.g., coding standards and their associated compliance-assessment software; and (4) warning labels, e.g., "Are you sure you want to delete?"
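As a concrete illustration of the fourth category, here is a minimal sketch of a warning label as a quality support (function and parameter names are hypothetical): the destructive action proceeds only after explicit confirmation.

```python
def confirm_delete(item, ask=input):
    """Warning label as a quality support: require explicit confirmation
    before a destructive action. `ask` is injectable for testing."""
    answer = ask(f"Are you sure you want to delete {item}? (y/n) ")
    return answer.strip().lower() == "y"

def delete_item(item, store, ask=input):
    # The warning label guards the domain function, not the other way around.
    if confirm_delete(item, ask=ask):
        store.discard(item)
        return True
    return False
```

Injecting the prompt function keeps the support verifiable in automated tests, which matters when its adequacy is evaluated in a technical review.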
Several of these qualities are subfields of software quality engineering, each with a significant body of specialized knowledge. If you aren't working in a prearchitected environment and don't get help from individuals who have such knowledge, you can't learn what you need to know during a project.
Supports for each quality level should be evaluated for adequacy in a technical review.
Conflicts between quality levels or supports should be resolved by avoidance or prioritization.
Estimate the effort required to implement and verify system functions and to verify rule compliance and warning labels.
4. Create master lists of supports for the crosscutting goals
Start with master lists of supports for the universal crosscutting goals and develop the supports for the added (nonuniversal) crosscutting goals that matter to your application.
These lists will guide developers as they program and test their domain functions and reviewers as they evaluate the results.
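One simple way to make such a master list operational is as a checklist that a reviewer can query at acceptance time. The sketch below assumes a flat mapping from crosscutting goal to supports; the goal and support names are illustrative, not prescribed by the method.

```python
# Hypothetical master list of supports for one crosscutting goal,
# consulted when a completed domain function is reviewed.
MASTER_LIST = {
    "reliability": ["input validation", "exception handling", "logging"],
}

def missing_supports(goal, implemented):
    """Return supports from the master list not yet evidenced in the work."""
    return [s for s in MASTER_LIST.get(goal, []) if s not in implemented]
```

A nonempty result flags technical debt before the iteration closes, rather than after it has compounded.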
5. If necessary, create a backlog of system functions needing development (or major changes).
The backlog of system functions, e.g., exception handlers, can be reduced during the iterations of phase two. The longer it takes to empty this backlog, the greater the technical debt.
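To make a backlogged system function concrete, here is a sketch of a centralized exception-handling support that domain functions can adopt incrementally as the backlog item is completed (the decorator and its names are hypothetical):

```python
import functools

def with_fallback(default):
    """Centralized exception handling: developed once as a system function,
    then adopted by domain functions across later iterations."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception:
                return default
        return wrapper
    return decorator

@with_fallback(default=0)
def parse_quantity(text):
    # Domain function; its error policy comes from the shared support.
    return int(text)
```

Every domain function shipped before this support exists carries debt that must be repaid when the function is retrofitted to use it.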
6. Select the application architecture
7. Assess project risk during a workshop
a. Hazard analysis
b. Mitigation identification
c. Residual risk prioritization and assessment
8. Continue or postpone project based on residual project risk, list of project mitigations, and backlog of system functions
During the Agile phase, consult the support master lists at the beginning of each iteration and as a part of the acceptance review at the end.
Prudent Agile is based on the following assumptions:
1. Identifying relevant crosscutting goals requires only an understanding of the type of application being acquired (e.g., flight control or e-commerce), not the details of its behavior
2. Cost-effective iterative development should be aware of restrictions from crosscutting goals early in each iteration
3. With experience, master list development takes no more than a week. At first, it takes two to three weeks.
Questions and comments are most welcome.