Sprint Reviews in Practice
First published in The Agile Zone, 23 May 2013
But to my mind, though I am native here / And to the manner born, it is a custom / More honor'd in the breach than the observance. - Hamlet, Act 1, scene 4, 7–16
Let me tell you about a seat-of-the-pants agile maturity model you can apply to organizational transformations. It's simple: all you have to do is assess how many of the requisite practices are actually being observed. Scrum teams, for instance, can be assessed in this rough-and-ready way by looking at which of the Scrum events are genuinely happening. It's certainly a primitive measure, but it relies on the fact that new ways of working don't always bed in evenly, and it's quite common for some techniques to be short-changed or elided altogether. In Scrum, you're likely to find that certain events will gain traction while others are a real struggle for teams to come to terms with. For example, most teams are able to organize a daily stand-up with at least a degree of consistency. The time at which the stand-up occurs might be vague in the early days, and attendance might well be shaky, but in my experience it usually happens. More often than not, there'll be enough for me to work with and to help the team improve.
Sprint Planning is another comparatively easy one. I don't mean that planning itself is easy...it isn't, it's bloody hard. What I mean is that it is fairly easy to make sure that a Sprint Planning event happens. At the beginning of each Sprint, teams can usually be made to assemble without too much of a struggle, and they'll make some sort of attempt to decide what to do in the next time-box. The resulting Sprint Goal might be pixie dust, and the Sprint Backlog nothing more than a cynical commentary on a lost cause, but the point is that it usually happens. Again, I find that something is generally there for me to build on so I can help the team improve.
Sprint Reviews are rather more problematic. The teams will certainly know they are meant to do them, because I make it quite clear from the beginning that ditching reviews is not an option. Yet I also know that if my back is turned, this event is prone to being tipped over the side in the heat of battle, while stand-ups and planning sessions are a bit less likely to suffer this treatment. So what is it about reviews that leads to their devaluation by teams that are new to agile practice? First of all, let's consider what a Sprint Review is meant to be in the first place, and then we can try to figure out why it is more honored in the breach than in the observance.
What is actually being reviewed?
At the beginning of a Sprint - during planning - a team will undertake to deliver something of value to the Product Owner. A Sprint Goal will be agreed between them, and they will negotiate to deliver a portion of the product requirements which is most relevant to the goal by the end of the Sprint. The entire set of product requirements is represented as an ordered list called the Product Backlog. The Sprint Backlog is the portion that the team will have agreed to action.
Each requirement must have clear acceptance criteria, and in addition the team must have a clear understanding of what it means for work to be done. Taken together, the Sprint Backlog and the Sprint Goal represent a forecast of the work that will be done in the Sprint. Taken together, the clear definition of done and the acceptance criteria will keep the risk of having to do any rework as small as possible. The team will plan to deliver the work so it meets their definition of done and also the specific acceptance criteria that have been set for each requirement.
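The relationships described above can be sketched in code. The following is a minimal illustration only, not part of Scrum itself; the class, field, and item names are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    """A product requirement with its own acceptance criteria."""
    title: str
    acceptance_criteria: list  # pairs of (criterion, satisfied?)
    meets_definition_of_done: bool = False

    def is_done(self):
        # "Done" requires BOTH the team-wide definition of done AND
        # every item-specific acceptance criterion to be satisfied.
        return self.meets_definition_of_done and all(
            ok for _, ok in self.acceptance_criteria)

# The Product Backlog is an ordered list of requirements; the Sprint
# Backlog is the portion the team agreed to action for the Sprint Goal.
product_backlog = [
    BacklogItem("User login", [("locks after 3 failures", True)], True),
    BacklogItem("Password reset", [("email link expires", False)], True),
    BacklogItem("Audit log", [("records every login", False)]),
]
sprint_backlog = product_backlog[:2]  # the top of the ordered list

for item in sprint_backlog:
    print(item.title, "- done" if item.is_done() else "- not done")
```

The point of the two-part `is_done` check is the one made above: the shared definition of done and the per-item acceptance criteria together keep rework risk as small as possible.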
At the end of each Sprint the team will meet with the Product Owner and look back upon what has actually been delivered in accordance with those terms. A demonstration of the deliverables is often the highlight of a review. This is an opportunity to show that the work has indeed been done, that the acceptance criteria have all been satisfied, and that the Sprint Goal has been met. It is an opportunity to satisfy the Product Owner that value is being provided, or to adapt the Product Backlog - or reconsider the project - if it is not.
Confusing Reviews with Retrospectives
The review is meant to look at what has been produced and what still remains to be done. It isn't meant to address how the work is being done. The how is the topic of the Sprint Retrospective - an "inspect and adapt" opportunity for the team's implementation of Scrum itself.
It's very common for new teams to blur reviews and retrospectives together, and quite a few long-standing teams also make this error. It's another indicator of how far along a team genuinely is in its agile transition. While it's certainly possible to run a review and a retrospective back-to-back, the two events should nonetheless be kept quite separate and distinct from each other. I generally recommend doing the review before the retrospective, because the former can produce ideas for consideration in the latter.
A mature team will be able to think about the work they do in isolation from their implementation of the Scrum process, and vice-versa. For example, in a retrospective they'll be able to reflect on how agile methods were used on other projects and how those lessons can be brought to bear on the current one. They'll keep the review focused on the work at hand. The Sprint Review will focus on what was done and remains to be done, but won't spill over into how work is done.
How is a Sprint Review conducted?
Here's how a typical Sprint Review is conducted. Reviews have the potential to be a bit dry, so the imperative is to keep them brisk and as enjoyable as possible. The Scrum Master is the guardian of the Scrum process and it is his or her responsibility to make sure that the review is carried out properly. A review should be time-boxed to no more than 4 hours for a one-month Sprint, or 2 hours if the Sprint lasts 2 weeks.
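In other words, the time-box scales with the length of the Sprint. Pro-rating it from the four-hour maximum for a one-month Sprint is my rule of thumb rather than a formal Scrum rule, but it can be written down in one line:

```python
def review_timebox_hours(sprint_weeks):
    """Pro-rate the Sprint Review time-box from the 4-hour maximum
    for a one-month (roughly 4-week) Sprint. A rule of thumb only."""
    return 4.0 * sprint_weeks / 4

print(review_timebox_hours(4))  # one-month Sprint: 4.0 hours
print(review_timebox_hours(2))  # two-week Sprint: 2.0 hours
```

Remember this is a maximum, not a target; as noted later, a review need not use up its whole time-box.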
A demonstration of work done will be conducted, ideally by the Product Owner or by whoever wanted the work done. For example, if an end user, rather than a developer, can demonstrate the deliverable in use, this will provide good evidence that collaboration has happened and that the increment is fit for purpose and is potentially releasable.
The Product Owner will identify the work on the Sprint Backlog that has been done satisfactorily in accordance with the Sprint Goal, and what work (if any) has not been done satisfactorily.
The team will discuss what went well during the Sprint and the impediments they encountered. This is the point where immature teams often start to blur a review with a retrospective. Remember, it's important to keep the review focused on what work was done, not how it was done. For example, a mature team will pin down the reasons for any work not being done in terms of specific occurrences of waste, or of unplanned emergency work. The lessons to be learned from that will be considered in the Sprint Retrospective.
The Product Owner will explain the work that remains in the Product Backlog. If any work wasn't done satisfactorily during the Sprint it should be returned to the Product Backlog where it can be reprioritized.
The team will groom the Product Backlog so that the next Sprint Planning session can occur without impediment. The requirements that are likely to be introduced into the upcoming Sprint Backlog will be checked for completeness, including their acceptance criteria.
Backlog grooming as a separate event
A team should never be surprised by the work the Product Owner wants them to do. It's important that the team keeps sight of the Product Backlog as requirements change, so they don't go into a planning session and find themselves dumbfounded by the work they are being asked to deliver. This review of the backlog can certainly be done in the Sprint Review, but since the review happens at the end of a Sprint, and planning happens at the start of the next, there's not much opportunity to resolve any issues. A strong case can be made for having at least one separate backlog grooming activity during the Sprint.
Backlog grooming is an opportunity for the team to see what they are likely to be asked to do in the next Sprint, to highlight any concerns, and to make recommendations. Ideally it will happen before the Review so enough time is allowed for problems to be ironed out. A team may organise multiple backlog grooming sessions if it helps planning to go smoothly. These sessions are also an opportunity for the Product Owner to decide on a goal for the next Sprint, to determine the priority of each item in the Product Backlog, and to discuss any concerns about feasibility or implementation with the team in advance of planning. The team should also be prepared to give tentative estimates following discussion of this work.
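One way to keep grooming honest is a simple "ready" checklist applied to each candidate item before planning. The sketch below is purely illustrative; the field names and checks are my assumptions, not anything mandated by Scrum:

```python
def grooming_problems(item):
    """Flag anything about a backlog item that would impede the next
    Sprint Planning session. Field names are hypothetical."""
    problems = []
    if not item.get("acceptance_criteria"):
        problems.append("no acceptance criteria")
    if item.get("estimate") is None:
        problems.append("no tentative estimate")
    if item.get("priority") is None:
        problems.append("Product Owner has not prioritized it")
    return problems

item = {"title": "Password reset", "acceptance_criteria": [], "priority": 2}
print(grooming_problems(item))
# -> ['no acceptance criteria', 'no tentative estimate']
```

Anything flagged here during grooming can still be resolved before planning, which is exactly the breathing room that a review held at the very end of the Sprint cannot provide.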
Sprint Review Antipatterns: what shouldn't happen
We're beginning to see why a Sprint Review is easily knocked off the agenda. There are many things that can go wrong or which are subject to misunderstanding, and which will make the perceived value of reviews unclear, or even undesirable. Let's summarize these problems so we can understand why reviews are so often missed, and then see what we can do to increase the chances of their happening.
Common problems with reviews:
A Product Owner or other stakeholder expects work that wasn't planned in
A Product Owner or other stakeholder misreads estimates as commitments
A team is held to an unrevised Sprint Goal when the Sprint Backlog was extensively revised
A team is held to an unrevised Sprint Goal when they had to do unplanned emergency work
Seeing the review as an opportunity to blame instead of to learn
Confusing the review with a retrospective
Why Reviews are often missed
Well, it's clear that there's no great mystery about this. Inexperienced teams will often try to short-change reviews for three basic reasons:
Firstly, reviews happen at the end of a Sprint. This is when new teams will often be in a mad rush to deliver on their commitments. Remember, a nice even burn-down throughout a Sprint is a sign of agile maturity. Newbie teams typically won't be there yet, so you can expect an intense burst of activity over the last day or two. The rush to deliver is given priority over reflective practices that would assist delivery in the longer term.
Secondly, a review of the Sprint deliverables will show what hasn't been done. A new team might not have the confidence to deal with this in an agile way, and to retrospectively inspect their process and adapt it accordingly. They often don't want a review to expose an apparent "failure" on their part, and to incur the criticism that it might bring. They'd rather just keep quiet about their issues and roll unfinished work into the next Sprint on the sly, or sweep it under the carpet of technical debt.
Thirdly, a newbie team might not have even prepared for the review. It takes discipline to work on backlog items in a carefully paced and measured way, and to allow enough time to put together a professional review that inspires stakeholder confidence. Hence the review becomes one of the first things to be sacrificed as an immature team approaches the Sprint end-date, and undone work screams for their attention from the task board.
However, there's another reason why Sprint Reviews are often missed: the Product Owner may not care enough to attend. It sounds like a strange thing to say, but the fact is that many Product Owners are assigned to the role without even being genuine stakeholders in the first place. They get the job because of related domain knowledge or because they have occupied a management role in the past. This is an antipattern of product ownership and it raises questions about whether or not a project should go ahead if its interests cannot be represented properly.
How to make Sprint Reviews happen
Here's what you can do to increase the chances of a Sprint Review taking place and of being conducted in a professional manner.
Have a Product Owner who cares about the product. Without effective business representation, the value of any deliverables brought to the table will never come into focus.
Have a good facilitator, such as a Scrum Master, who can ensure that the review process is followed and that it does not head off into the weeds.
Make sure that enough time is allowed to prepare for the review during the Sprint, including any demonstrations that are best done by end users.
Get approval of completed work on an ongoing basis throughout the Sprint. Don't wait until the Sprint Review before you get sign-off. There is no sense in deferring all of the risk to the end.
Keep the review as short as possible. Just because it is time-boxed to a certain number of hours doesn't mean that you have to take up all of them. As well as getting work approved in advance, you can conduct at least some backlog grooming earlier, when it might even be more effective.
Have fun! Bring cookies, drinks, and other refreshments. Consider interspersing the review with short activities or breaks for games. Pub-quiz type questionnaires are popular and I've even known small prizes to be given to the winners.