Agile Methods - Delivering Software Faster
A core tenet of agile methodologies is helping teams deliver software more quickly. But with the plethora of agile practices available, teams new to agile struggle to find the ones that actually speed delivery. In coordination with the just-published Agile refcard, DZone had the opportunity to chat with Amr Elssamadisy, Partner at Gemba Systems and author of "Agile Adoption Patterns: A Roadmap to Organizational Success", about agile and the promise of speedier delivery.

DZone: Iterations may be anywhere from 1 to 4 weeks. Is there value in shorter vs. longer iterations, or is this largely contextual?
Amr: Iteration length is largely contextual. There are smells that indicate your iteration is too long or too short. For example, if the setup and teardown time for an iteration (kickoff meeting and demo) is taking a significant percentage of the iteration, you might want to consider lengthening it. If, however, you find that your estimates are consistently off, or that you are not getting enough feedback, you may want to make the iteration much shorter.
DZone: Is it really necessary to release often? Or is it most important to have deployable software at the end of an iteration, even if it isn't actually deployed? There may be some confusion surrounding what is meant by "release your software to your end customers as often..."

Amr: None of these practices are *really necessary*. They are very helpful in shortening time to market, if that is your goal. And if you want to get to market fast, then, as the Nike commercial says, Just Do It! No matter how close you think you are to deployment, there is no substitute for really deploying and getting your customers to use your new software (and give feedback) faster. You'll notice, for example, that Amazon and Google do this all the time.
DZone: Isn't there a strong relationship between technical expertise, the technically difficult practices, and greater IT/business alignment?
Amr: Technical expertise makes you and your team "well-oiled", and that definitely goes a long way toward enabling business/IT alignment. The research cited in the refcard supports that and strongly suggests starting with technical expertise to enable alignment. Now, if you are asking whether the technical practices, such as automated developer tests, are more important than the non-technical practices, I'd have to say it depends on how short-term your needs are. Technical practices typically take longer to adopt, and they have more long-term value but less immediate value (all of this with respect to time to market only) than the non-technical practices.
DZone: What techniques can I use to help understand which 45% of features are never used? Can you elaborate further?
Amr: An onsite customer is a first step. Having a cross-functional team that works from a prioritized backlog within an iteration and releases early and often (not just gets to Done state) is another. These two things will give you feedback. You then need, as a team, to inspect and adapt. To be blunt, however, this is not something the Agile community has a lot of rigor and experience around, and I expect to see much more on this subject in the future.
DZone: What impact on organizational culture may adoption of these practices have? Are there techniques to help alleviate any challenges?
Amr: That question is out of scope for the card and really needs a book or two of its own. I'll refer the reader to the first section of Agile Adoption Patterns, especially the chapters on Personal Agility and Learning is a Bottleneck. If anyone is interested in more information on this topic, send me an email at firstname.lastname@example.org and I'll be glad to forward several articles on the subject.
DZone: When talking about Done state, what's the role of testing in determining Done?
Amr: Done state, as defined here, is an agreement among the members of a self-organizing team. It should be as close to deployable software as the team can get. That varies from team to team depending on their starting point. At a minimum, all requirements should pass acceptance tests, whether manual or automated.
DZone: What role does the customer play in determining the Done state?
Amr: If there is a customer who is part of the team, absolutely. They are responsible for defining the acceptance tests that must be passed for something to be considered done.
DZone: Is it possible to adopt these practices without an on-site customer?
Amr: If you look at the dependency graph for the practices that affect time to market, the ones that are not directly or indirectly dependent on an onsite customer can be adopted without one. These include Simple Design, Refactoring, Automated Developer Tests, Done State, and Cross-Functional Team. The rest depend on an onsite customer, which means they need one to be most effective, not that they cannot be done at all without one.
DZone: If the software works, but has poor design, isn't there a cost to refactoring that can actually slow the team down?
Amr: Absolutely, refactoring can slow things down in the short run; over the long run, things get much faster. The same is true of automated developer tests and automated acceptance tests. The technical practices, as a rule of thumb, slow things down before they start to speed things up. On the other hand, every day you don't start addressing your technical debt (what you called poor design) is a day you are digging your hole a little deeper and making things a little slower.
DZone: Thank you, Amr, for taking the time to chat today.
Amr: Thanks for giving me the chance to answer these questions.