At Amsterdam Airport Schiphol, we’re using an Agile approach to realize a large digital program. This program includes five value streams with multiple teams. Due to the increasing scale of the program, some challenges arise, such as how to organize a Sprint review with a growing number of teams and stakeholders while preserving a valuable outcome.
In this blog post, I’ll describe the challenges and the experiments we might try in order to deal with them. Some of these experiments are ideas proposed by Preeti Gholap, who answered a question I posted on LinkedIn some time ago. Thanks, Preeti!
The purpose of the Sprint review is to gather feedback on the delivered product and evaluate the collaboration. The Sprint review should be used for a demonstration and inspection of the developed increment. It’s the best opportunity to adapt the product backlog if needed and release the increment to production if the Product Owner finds it useful enough. It’s the ideal moment for a joint reflection and to decide how to proceed to optimize value.
The Bright Side
Earlier on, I wrote an optimistic article about the Sprint Review at Schiphol. The article, 5 Characteristics of a Great Sprint Review, offers five examples of what is going well:
1. Stakeholders Are Difficult to Recognize
They are certainly present, but they blend in with the Scrum teams, making it one big collaborating group of people.
2. Every Developer Participates
Yes! Every developer was there, most of them holding sticky notes on which they wrote down the feedback they received on their product.
3. Feedback. Feedback. Feedback
In short, the Sprint review was one big feedback party. Every Scrum team provided demonstrations on several devices. Everyone could actually use the product and share experiences and lessons learned.
4. A Tailor-Made Sprint Review
A great Sprint review has a tailor-made format. Sometimes, using different market stalls for every team is the best format, sometimes a central demonstration and discussion works best. A great Scrum team continuously searches for the ideal format to gather feedback.
5. Beer and Bitterballen
The follow-up to the Sprint review is, of course, the Retrospective. It’s the ideal opportunity to process the Sprint review and discuss possible improvements, combined with beer and bitterballen.
The Dark Side
The increasing scale of the program, however, has an impact on the Sprint review. Due to the growing number of participants, some pitfalls start to appear. There are three main problems.
1. The Sprint Review Becomes a Demo
It’s not a feedback party anymore, but a demo festival. To be honest, it was already called a “demo,” but it had all the characteristics of a Sprint review: a focus on gathering feedback and on collaboration. Nowadays, however, the large number of one-sided demos is starting to crowd out the opportunity to gather in-depth feedback.
2. Stakeholder Abundance
Is it possible to have too many stakeholders at the Sprint review? Ideally, no. However, it can become an issue when the number of irregular visitors starts to shape the session. These “one-time” visitors expect an explanation of the project/product as a whole, not necessarily an update about the previous Sprint. This is not only time-consuming, but it also doesn’t add any value for the Scrum team, which wants detailed feedback on its previous Sprint.
3. Developers Are Not Participating Anymore
As a consequence of this lack of feedback, not every member of the Scrum team attends the Sprint review anymore. The Scrum Master and Product Owner start acting as ambassadors of the Scrum team, becoming the hub between the stakeholders and the development team. The valuable dialogue between the stakeholders and the development team is diminishing.
Ideas for Experiments
1. Organize a Monthly Demo Besides the Bi-Weekly Sprint Review
The demo is certainly valuable, but it has a different goal than the Sprint review. The primary goal of the Sprint review is gathering feedback; the goal of the demo is creating alignment between everyone involved or interested. Organizing a monthly demo with a focus on alignment, alongside a smaller bi-weekly Sprint review (maybe per value stream) with a focus on gathering detailed feedback, might be a good solution.
2. Organize Small-Circle and Large-Circle Two-Part Sprint Reviews
This is a solution suggested by Preeti Gholap.
I started this when I was coaching six teams that needed to do a joint demo, in a situation when we only had limited face-to-face access to stakeholders (who had to fly in from several countries).
Each team uses the first part of the Sprint review (say, 40 minutes for a two-week Sprint’s worth of stories) to get detailed feedback from their own small circle of mandated users and accepters. This is a small group of four to six people with whom the team and Product Owner work closely for both backlog- and user-story-level refinement and acceptance (and thus, full-cycle engagement). The feedback we’re looking for here is: “Is this usable?” and “Does it meet all the acceptance criteria, functional and nonfunctional, for the users and for operations?”
Then the teams proceed to the joint demo, or the large circle, where each team presents a summary of the increment and its value (thus not the details per story) to the other teams, management, and wider stakeholders. The feedback asked for is at the level of, "Is this valuable and contributing to the whole?" and "Are we going in the right direction? Are there upcoming dependencies?" The small circle has intense contact with its own Scrum team throughout the Sprint via phone, WebEx, Jira, etc., and also face-to-face at the Sprint review. The large circle has face-to-face contact with all the Scrum teams in the product program every six weeks. This has worked quite well for us.
3. Continuous-Flow Acceptance Decoupled From Sprint Reviews
This is another solution suggested by Preeti Gholap!
This is a way I learned from a team I currently coach, which is working on a complex COTS app. They were doing this even before officially becoming Agile. This team ensures they get detailed feedback and acceptance on a per-story basis as soon as each story is done.
Since entering the Scrum and Agile coaching program at the company we currently work at, they have been learning to make stories smaller, more vertical, etc., and consequently, they are getting good at getting valuable stories to done every few days. This means they are in the lovely position of having acceptance on a per-story basis several times during the Sprint, even without any kind of automation.
They are able to do this in part because their end-users are close at hand, and their Product Owner trusts that team and end-users can collaborate nicely without her (the Product Owner) needing to preside over a “demo.” (The main reason they can do this is that they have always embraced getting things done, as well as face-to-face collaboration).
I actually have a hard time convincing this team to do a Sprint review (“What’s the point? The work is accepted already.”). However, neither they nor the organization is quite ready for a Kanban/continuous-flow way of working. Slowly, they are seeing that a short Sprint review is still handy for reasons beyond feedback on the Sprint result — for example, engaging with wider stakeholders for visibility, gaining appreciation for the team members themselves, and the “feed-forward”: reconfirming the roadmap with stakeholders holding diverse agendas. This technique of decoupling detailed feedback from the Sprint review could also help your scaled teams.
In this post, I’ve shared the challenges we currently face with the Sprint review at Amsterdam Airport Schiphol: how to organize a Sprint review with a growing number of teams and stakeholders while preserving a valuable outcome. Thanks to the ideas suggested by Preeti Gholap, I’ve also described some experiments we might try to deal with them.
If you’ve got any other suggestions that might be useful to experiment with, please share them! It’s highly appreciated!