Sprint Review Is Not a Phase Gate
You don't have to wait for the sprint review to promote to production; the review exists to determine the most valuable thing to do next.
Throughout the Agile Alliance 2016 conference, I was struck by a recurring feeling that many people don't understand what the Sprint Review is. That feeling has only grown as more articles and conference proceedings have emerged. Everyone seems to think that you cannot release software continuously while doing Scrum. For example, in his keynote 'Modern Agile', Joshua Kerievsky described Scrum as old-fashioned because of its lack of support for continuously delivering software. He implied that modern approaches remove the idea of a Sprint and move to a continuous flow model, and that Scrum does not support continuous flow or frequent delivery.
I want to dispel that myth and explain why we have a Sprint Review, why you can deliver software to production multiple times (continuously) during a Sprint, and how, in fact, the best Sprint Reviews happen with software already in production, being used by real users.
Firstly, let's look at how the Sprint Review is described in the Scrum Guide.
A Sprint Review is held at the end of the Sprint to inspect the Increment and adapt the Product Backlog if needed.
Nothing there says that the increment cannot already be in production for the Sprint Review. The Scrum Guide deliberately avoids defining what state the increment is in; that state is defined by the team in its definition of Done. In the training materials used by Scrum.org Professional Scrum Trainers, the definition of Done is described as ideally being software running in production. In fact, our Professional Scrum Developer class is all about removing the roadblocks that keep software out of production. This focus on really Done software surfaces in the Scrum Guide's list of Sprint Review elements as:
Review of how the marketplace or potential use of the product might have changed what is the most valuable thing to do next.
Let’s zero in on why we have a Sprint Review.
At the heart of Agile methods and Scrum is an empirical approach. Empiricism replaces the idea that you can plan your tasks out in great detail, instead focusing the team on building small increments that drive learning and understanding. From that learning and understanding you can adapt, which will often change the work you do next, or how you do it, based on what you have now learned. Those changes are reflected in the Product Backlog. The bottom line is that the Sprint Review provides a structure for inspection and adaptation at the boundary of the Sprint, while the Daily Scrum provides a daily inspect-and-adapt loop focused on the team. Together they provide a framework for an empirical approach to work.
That means that Continuous Delivery makes perfect sense for Scrum and would be reflected in the Definition of Done (DoD). As each Product Backlog Item (PBI) is Done, it would enter the release process documented in the DoD. With luck, that process would be automated, allowing the final stages of the push to Done to be a series of automated scripts that move the software through the staging areas to production. In the case of complex products with multiple teams (dare I say a Nexus), the definition would not only push the PBI to Done, but also integrate it with other Done PBIs. This can occur many times within a Sprint or just once; Scrum doesn't care. What is important is that we learn from what we have done, adapt based on that learning, and deliver value to our users.
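The release process described above could be sketched roughly as follows. This is a minimal illustration, not a real pipeline: the stage names, `BacklogItem` type, and `release` function are all assumptions made for the example, standing in for whatever automated scripts a team's DoD actually documents.

```python
# Illustrative sketch of an automated release process attached to a
# Definition of Done: each Done PBI flows through every stage on its
# way to production. All names here are hypothetical.
from dataclasses import dataclass, field

# The stages a team's DoD might document, in order.
PIPELINE = ["build", "automated tests", "staging deploy", "production deploy"]

@dataclass
class BacklogItem:
    title: str
    stages_passed: list = field(default_factory=list)

def release(item: BacklogItem) -> BacklogItem:
    """Push a Done item through each pipeline stage in turn."""
    for stage in PIPELINE:
        # A real pipeline would run scripts here and gate on the results;
        # this sketch just records that the stage completed.
        item.stages_passed.append(stage)
    return item

pbi = release(BacklogItem("Checkout supports gift cards"))
print(pbi.stages_passed[-1])  # → production deploy
```

The point of the sketch is that "Done" is operational: an item is Done when it has passed every stage, and nothing in that process waits for a Sprint Review.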
Is Continuous Delivery Mandatory for Scrum?
The short answer is 'it depends'. As a software delivery professional, you have a number of levers you can push and pull to ensure that you are delivering software of high customer value and low organizational risk. One lever is how frequently you release the software to production or into customers' hands (they are not the same thing: feature toggles let you release software that is not actually switched on for the customer). How often is determined by understanding value (both material and learning) and risk (how sure do we have to be that this works, and what is the impact if it doesn't).
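The feature-toggle idea mentioned above can be shown in a few lines. This is a deliberately minimal sketch: the flag store, flag name, and `checkout` function are hypothetical, standing in for whatever toggle framework a team actually uses.

```python
# Minimal feature-toggle sketch: the code ships to production, but the
# new behavior stays off for customers until the flag is flipped.
# The flag store and names are illustrative assumptions.
FLAGS = {"new_checkout": False}  # deployed, but switched off by default

def checkout(cart, flags=FLAGS):
    """Route to the new flow only when its toggle is enabled."""
    if flags.get("new_checkout", False):
        return "new checkout flow"
    return "existing checkout flow"

print(checkout(cart=[]))                                 # customers: old flow
print(checkout(cart=[], flags={"new_checkout": True}))   # testers: new flow
```

This is what decouples "released to production" from "released to customers": deployment becomes a technical event, while exposing the feature remains a product decision.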
The other lever is how frequently you run the team's inspection-and-adaptation loop (also known as the Sprint). If your Sprints are short, you benefit from frequently reviewing your knowledge of the situation, which allows you to adapt. But there is an overhead to that review, and it must be balanced against the value you get out of it. It is also possible that some features still will not have been used by customers, or that stakeholders need longer cycle times to understand what you have delivered, which argues for a longer Sprint. In an ideal world, then, it is a choice: deliver continuously or less frequently. This, however, is not the case for many Scrum Teams, where 'DONE DONE' is a dream, not a reality, and getting software over the line into production still requires massive amounts of energy from multiple teams, departments, and even external organizations. In those situations, the Sprint Review is a review of software running in some pre-production environment. Not ideal, but still better than not reviewing it at all. And with luck, the organization is closing the gap between what is in production and what is reviewed in the Sprint Review.
Ultimately, Scrum is about providing a framework that allows explicit decisions to be made with real information. The Sprint Review, like the Daily Scrum, encourages the team to review what is happening, which allows the team to adapt. Continuous Delivery increases transparency and thus improves the team's ability to adapt based on real information.
Published at DZone with permission of Dave West . See the original article here.