Using metrics in your Retrospectives



Agile IQ® Level: Stage 4






Many teams leap to make changes without understanding whether their improvement actions have any real impact.


But, we collaborated well!

It's not enough to say "we collaborated well" in the Retrospective. Using metrics can uncover what led to great collaboration and when it happened, so you can repeat it next Sprint.

Metrics in retrospectives

Gathering data is an important part of advanced Retrospectives. Important data to examine include:

  • Quality, defects and re-work
  • Velocity trends
  • Accuracy of estimations – was the “S” really an “S”, or did it take as long as an “L”?
  • Lead time and cycle time
  • Agile IQ behaviours – self organisation, agile values, Sprinting and culture of continuous learning


If the team’s quality is low, or there is a lot of rework, tighten the Definition of Done. If the team strictly adheres to the Definition of Done, quality should improve and defects should reduce.


Velocity is the average amount of Product Backlog a team turns into an Increment of Done each Sprint. Velocity trends should improve as a team gets better at delivering the same type of work. Velocity can go down due to:

  • New team members being introduced to the team. It takes the team a while to learn how to work together with its new members.
  • Planned vacation time when there are fewer team members.
  • School holidays.
  • Team members on unplanned (sick) leave.
  • Public holidays.
  • New work that the team is capable of doing, but might not have delivered together before.

Use the Retrospective to talk about velocity trends:

  • How much does our velocity go down when people are on unplanned leave?
  • What causes our velocity to remain high?
  • What causes our velocity to dip? Is it new work?
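One way to ground these questions is to compare each Sprint's velocity to the team's average. The sketch below uses plain Python with made-up velocity numbers, and flags Sprints that dipped well below the average so the team can ask what happened in them:

```python
from statistics import mean

# Hypothetical per-Sprint velocities (points of Done work), for illustration only.
velocities = [21, 24, 23, 14, 22, 25, 12, 24]

average = mean(velocities)

# Flag Sprints more than 20% below the average as prompts for the Retrospective:
# was it unplanned leave? A public holiday? New work the team hadn't done before?
dips = [i + 1 for i, v in enumerate(velocities) if v < average * 0.8]

print(f"Average velocity: {average:.1f}")
print(f"Sprints to discuss: {dips}")
```

The 20% threshold is an arbitrary starting point; the useful part is the conversation about why those Sprints dipped, not the number itself.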
Above: Velocity goes up and down

How accurate are your estimations?

How long do your “S” items take (in days) to go from in-progress to Done? What about “M” and “L” sized work?

Cluster Backlog Items by the number of days it took from in-progress to Done. There may be no difference in delivery time between certain sizes like “S” and “M”.
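A quick way to do this clustering is to group finished items by their size estimate and compare typical cycle times. The sketch below uses hypothetical item data; if the “S” and “M” medians turn out to be close, the distinction between those two sizes may not be telling you much:

```python
from collections import defaultdict
from statistics import median

# Hypothetical (size, days from in-progress to Done) pairs for finished items.
items = [("S", 2), ("S", 3), ("M", 3), ("M", 4), ("M", 2), ("L", 8), ("L", 7), ("S", 2)]

# Cluster cycle times by estimated size.
by_size = defaultdict(list)
for size, days in items:
    by_size[size].append(days)

# Compare the median days-to-Done for each size bucket.
for size in ("S", "M", "L"):
    print(size, median(by_size[size]))
```

Here “S” and “M” medians sit within a day of each other, while “L” is clearly slower, which is exactly the kind of pattern worth raising in the Retrospective.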



As a team, estimate the size of the work needed to take the Product Backlog item to Done. Don't estimate tasks, and don't use hours or days for estimation.

Lead Time and Cycle Time

Lead Time is the total time it takes for a request to be actioned and delivered to Done.

Cycle Time is the time an item takes to move from in-progress to Done.
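Both measures fall out of three timestamps per item: when the request was raised, when work started, and when the item met the Definition of Done. A minimal sketch, with made-up dates:

```python
from datetime import date

# Hypothetical timestamps for one Product Backlog item.
requested = date(2023, 3, 1)   # request raised
started = date(2023, 3, 8)     # moved to in-progress
done = date(2023, 3, 11)       # met the Definition of Done

lead_time = (done - requested).days   # total time from request to Done
cycle_time = (done - started).days    # time from in-progress to Done

print(f"Lead time: {lead_time} days, cycle time: {cycle_time} days")
```

A large gap between lead time and cycle time usually means items sit in the backlog for a long time before anyone starts them, which is a flow problem, not a delivery-speed problem.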


Above: Are your estimations accurate?

Measuring Agile Behaviours

As agile behaviours strengthen, you should see a corresponding increase in the following Agile IQ measures:

  • Self-organisation
  • Agile Values
  • Sprinting
  • Continuous Learning Culture

Do a new team assessment every couple of months and decide as a team how you will improve these measures:

  • Can the team be more self-organising if they facilitate events themselves?
  • Is the team working at a sustainable pace? What would it take to help the team work at a sustainable pace instead of feeling frantic and overburdened all the time?
  • How can you improve feedback in Sprint Review?
  • How can you improve inspection and adaptation of Scrum artefacts like the Sprint Goal and Sprint Backlog?
  • Is there a new skill the team can learn that will help solve complex problems?


Each time you reassess, compare the results to the previous assessment, then look for improvement actions that align with your team's maturity stage.

Actions to try

Esther Derby and Diana Larsen's Agile Retrospectives pattern sets a good structure for the use of metrics:

  • Set the Scene
  • Gather Data
  • Generate Insights
  • Decide What To Do
  • Close the Retro

Set the scene

Ask the team to talk about the improvement actions that they did this Sprint. Ask questions like:

  • What were the actions?
  • What did they expect to happen as a result of these tasks?
  • What metrics were supposed to improve?

Gather data

Examine the metrics you used:

  • Did the metrics change?
  • Which ones changed?
  • How much did they change by?
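A simple way to answer these questions is to snapshot your chosen metrics each Sprint and diff the snapshots. The sketch below compares two hypothetical Sprints; the metric names and values are made up for illustration:

```python
# Hypothetical metric snapshots from the last two Sprints.
last_sprint = {"defects": 7, "velocity": 21, "cycle_time_days": 4.0}
this_sprint = {"defects": 4, "velocity": 23, "cycle_time_days": 3.5}

# For each metric, report whether it moved and by how much.
for name in last_sprint:
    delta = this_sprint[name] - last_sprint[name]
    direction = "up" if delta > 0 else "down" if delta < 0 else "unchanged"
    print(f"{name}: {direction} ({delta:+g})")
```

The point of the diff is to connect each movement back to the improvement actions the team took, not to judge the numbers in isolation.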

Generate insights

  • What metrics changed?
  • What metrics stayed the same?
  • Did they go up or down?
  • Did nothing happen?

Decide next steps

  • Choose some new actions
  • Select a metric that will show the action is having the desired impact

Don't jump to actions too quickly

Many teams jump from "what worked" to making decisions without understanding the causes of great outcomes, or poor ones, in a Sprint. Unpack what happened and identify the root cause to work out how to make successes repeatable.

Things to watch out for

  • Be careful not to choose “vanity” metrics.
  • Don’t use velocity to compare teams.
  • Don’t choose too many metrics. Choose one or two that make sense for the improvement you’ve chosen.

Actions to Try

  • Identify patterns you wish to repeat in every retrospective, and which aspects you want to change on a regular basis.
  • Observe the Sprint and then decide on a pattern that will allow facilitation and discussion around understanding the events of the Sprint.
  • Look for ways to repeat successes and minimise problems that occur in the Sprint.
  • It’s not enough to say “we collaborated well” in the Retrospective. Uncover what led to great collaboration so you can repeat it next Sprint.
