This approach to retrospectives is a five-part framework that I use for 90% of the retrospectives I lead. But that doesn’t mean that every retrospective I lead is exactly the same. Each is unique, tailored to the issues the team is facing, the data they need to understand those issues, and the character of the group.
Set the Stage
Lay the groundwork for the session by reviewing the focus and agenda. The focus depends on what’s happening for the team. If the team is struggling with engineering practices, focus on that. If they’ve got the technical practices down pat, but they’re struggling with customer interactions, make that the focus. If there isn’t a particular issue that stands out, focus on continuous improvement or building on strengths. In any case, choose one area to explore during the retrospective. You don’t have all day, and you can’t cover everything. Picking one area means that you can delve beneath the surface on that one topic, rather than skimming the surface on several. Having a focus prevents Brownian motion.
No one enjoys swirling conversations, and having an agenda shows that you have a plan to manage the meeting and reach a conclusion. Create an environment for participation by checking in and establishing working agreements (if you don’t already have them). Some people feel this preparation isn’t real work, but believe me: skipping it may save a little time up front, but you’ll pay for it later in the retrospective, when people are less trusting and less likely to participate.
Activities to consider: Check-in, working agreements, Focus On/Focus Off
Gather Data
Before the team can identify possible improvements, they need to understand the problem. To do that, you need data. Exactly what data depends on the focus of the particular retrospective. For example, if the focus is meeting sprint goals, it would probably be helpful to have data about interruptions, additional work, stories that ballooned in size, technical spikes, system downtime, and so forth. In addition to hard facts, look at subjective information about how the team experienced the iteration.
The goal of gathering data is to ground the team’s analysis in facts and to create a shared picture of the iteration. When the group members see the iteration from many points of view, they’ll have greater insight and will be more likely to move beyond their personal views to see the big picture of how the team works.
Activities to consider: Timelines, radar charts, Frequency/Impact grid
Generate Insights
When people think together from shared data, they are more likely to arrive at ideas that the whole group supports. Step back and look at the data and the shared picture the team has gathered. Then analyze it, looking for patterns, root causes, and insights about the issues. I choose activities and analysis techniques that are appropriate for the data; these tools help the team get beyond habitual thinking.
Activities to consider: Cause-and-effect (fishbone) diagrams, Learning Matrix
Decide What to Do
Prioritize the team’s insights and choose one or two improvements or experiments that will make a difference for the team. Just as when the team commits to stories during sprint planning, they need to know something about the value of an option and the size of the effort. It doesn’t have to be a precise estimate, just good enough for the team to gauge whether it will be valuable and whether they can incorporate it into the iteration. For one- or two-week iterations, limit the number of actions to one. Choosing more is a setup for disappointment. And don’t ask the team which action is most important; actions that are important are often also big, overwhelming, and difficult. Not so motivating, considering that the team has feature stories to work on, too. Instead, ask what they have energy to work on. If there’s no energy, it won’t get done.
Be sure to identify concrete, small steps that the team can take in the next iteration—one colleague calls them “now actions.” Make sure that the action steps are folded into the next sprint backlog, not kept to the side in an “improvement plan.” When improvement is part of the regular plan, it’s seen as a normal part of work. When improvement work is in a separate plan, it doesn’t get done.
Activities to consider: Impact/Effort/Energy grid, dot voting, fist of five
Close the Retrospective
Summarize how the team will follow up on plans and commitments. Thank people for their hard work, and do a little retrospective on the retrospective so you can improve your retrospective process, too.
Activities to consider: Return on Time Invested (ROTI), Helped/Hindered/Hypothesis, Appreciations
Find the Time
This may look like a lot to cover in one meeting, but it needn’t take a long time. For a two-week iteration, you can cover these steps in an hour or so. For longer iterations (say 4 weeks), dedicate a half-day to deciding what to do better next time.
It’s tempting to leave out a part to save time. Don’t do it. Leaving out one part to save time almost always has the opposite effect. Leave out Set the Stage, and the discussion will be all over the map and difficult to focus. Leave out Gathering Data, and each team member is working from his own observations (which may be limited) and his own opinions (which may be incorrect). Leave out Generating Insights, and you end up with the same tired explanations. Leave out Decide What to Do, and really, what’s the point? Closing ties it all up with a bow, brings closure, and sends the team off to do what comes next. Shortchanging your retrospective means shortchanging your chance to do better next time. Improvement doesn’t happen by hoping; teams need to dedicate time to thinking and learning.
Too many teams start their retrospective by asking the team to list their insights: “What did we do well, and what should we do differently?” That’s step three. Other teams simply list items to Start, Stop, and Continue. That’s asking the team to jump straight to action items. It’s no surprise to me that these teams complain that the same actions come up over and over, and that they don’t see benefits in retrospectives.
I know of organizations that have “standardized” their retrospectives on “mad sad glad” or “stop start continue.” Those teams don’t get much out of their retrospectives, either. While I do use this framework most of the time, I vary the focus, data, and activities. If the team always does the same retrospective, they fall into a rut. And when the team is in a rut, they aren’t likely to have new ideas, fresh insights, or energy to do a dang thing. There’s no sense in doing a retrospective if it’s just going through the motions.
© 2010 Esther Derby