6.0.0 (May 13, 2024)

Sprint Metrics

For Scrum teams operating in sprints, the Commitment and Completed Work metrics are enough to calculate velocity. However, the velocity number alone does not tell the full story of the sprint. You need to know whether the team changed their Initial Commitment, by how much, and why: because of work added during the sprint or an estimation error. Knowing how many story points were not completed is critical for evaluating the sprint's success. Sprint Metrics addresses these questions by showing totals in story points (or time units or issue count), either as absolute values or relative to the commitment.
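The relationships between these numbers can be sketched in a few lines. The function and field names below are hypothetical, chosen for illustration rather than taken from the add-on's data model:

```python
# A minimal sketch of the sprint numbers discussed above; the field
# names and sample values are hypothetical, not the add-on's actual model.

def sprint_metrics(initial_commitment, final_commitment, completed):
    """Summarize a sprint in story points."""
    return {
        "velocity": completed,                           # classic velocity
        "scope_change": final_commitment - initial_commitment,
        "not_completed": final_commitment - completed,
        "say_do_ratio": completed / initial_commitment,  # relative to commitment
    }

m = sprint_metrics(initial_commitment=25, final_commitment=30, completed=24)
print(m["scope_change"], m["not_completed"], round(m["say_do_ratio"], 2))
# → 5 6 0.96
```

Even this toy example shows why velocity alone is not enough: a team can complete 24 of 25 committed points while still leaving 6 points of the final scope unfinished.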

image-20240430-132907.png

Also, as part of this new release, we are happy to introduce two new sprint metrics, Rollover and Total Scope Change, and provide an update on the Completed Work (initial) metric.

Sprint Rollover

When a team starts a sprint, it's common to have some planned issues started in the previous sprint (leftovers) and others that haven’t started yet (spillovers), which are then rolled over to the new sprint. It's typical for some issues to roll over from sprint to sprint, often getting lost in the workload.

To address this, we introduced the sprint Rollover metric. This metric sheds light on several critical aspects:

  1. How many story points were rolled over, affecting the team’s actual capacity for new work?

    image-20240430-133530.png

     

  2. How often did a particular issue roll over from one sprint to another?

     

  3. What proportion of the Initial Commitment was taken up by Rollover issues across a series of sprints? Is everyone satisfied with this? If not, what should be considered a healthy pattern?

     

  4. Were all incomplete issues transferred to the next sprint? Is everyone content with this?

By focusing on these questions, the sprint Rollover metric aims to provide clarity on how ongoing issues affect the project and assist teams in evaluating the effectiveness of their sprint planning.
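The core of the calculation can be illustrated with a short sketch. The issue keys, story-point values, and function below are hypothetical examples, not the add-on's implementation:

```python
# Hypothetical sketch of the Rollover idea: an issue planned in an earlier
# sprint that reappears in the current sprint counts toward its rollover.

def rollover(sprint_issues, previous_sprints):
    """Return rollover story points and per-issue rollover counts."""
    seen = {}
    for sprint in previous_sprints:
        for key in sprint:
            seen[key] = seen.get(key, 0) + 1
    points = sum(pts for key, pts in sprint_issues.items() if key in seen)
    counts = {key: seen[key] for key in sprint_issues if key in seen}
    return points, counts

points, counts = rollover(
    {"ABC-1": 5, "ABC-2": 3, "ABC-9": 8},            # planned this sprint
    previous_sprints=[{"ABC-1"}, {"ABC-1", "ABC-2"}],
)
print(points, counts)  # → 8 {'ABC-1': 2, 'ABC-2': 1}
```

Here 8 of the sprint's points are rollover, and ABC-1 has already appeared in two earlier sprints, which is exactly the kind of repeat offender the metric is meant to surface.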

Total Scope Change

The Estimation Change, Added Work, and Removed Work metrics contribute to the difference between the Initial Commitment and Final Commitment, giving a better understanding of why the scope of work changed. Recognizing this, we felt it was critical to introduce the Total Scope Change metric to address the following questions:

  1. By how many story points was the sprint scope affected?

     

  2. Which issues have led to these changes?

     

  3. What proportion of the Final Commitment was altered over a series of sprints? Is this an appropriate degree of change? What should be considered a healthy pattern?

The Total Scope Change metric is intended to provide a comprehensive overview of scope adjustments made during the sprint, allowing teams to better understand and manage changes to their project commitments.
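The arithmetic connecting the metrics can be sketched as follows. The sign conventions and sample values are assumptions for illustration:

```python
# Sketch of how Total Scope Change ties the commitments together;
# signs and sample values are assumed for illustration.

def total_scope_change(estimation_change, added_work, removed_work):
    # Final Commitment = Initial Commitment + Total Scope Change
    return estimation_change + added_work - removed_work

initial = 25
change = total_scope_change(estimation_change=2, added_work=8, removed_work=5)
final = initial + change
print(change, final)  # → 5 30
```

In this example the scope grew by 5 points overall, even though 5 points of work were removed, because additions and re-estimations outweighed the removals.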

Completed work (initial)

When reviewing the built-in velocity chart in Jira, if both the Initial Commitment and Completed figures are listed as 25 story points, it would imply a Say/Do ratio of 100%. However, it's possible that not all sprint goals were achieved. This could occur if issues were re-estimated or added during the sprint, affecting the final numbers on the chart. To address this, we introduced the "Completed Work (initial)" metric, which reflects the completion rate of the work initially planned.

Now, this metric works in two ways:

Without applying estimation changes to issues in Initial Commitment: This default mode provides a straightforward view of the percentage of initially planned work completed. It shows the success rate of the sprint goals, assuming that the Initial Commitment accurately represents them.

 

 

With applying estimation changes to issues in Initial Commitment: This mode adjusts the completion figures to include any changes in estimates during the sprint. It shows how many story points were completed from the initial planned scope, taking into account any changes to the initial estimates.

 

 

These modes take into consideration changes in scope and estimating practices that occur during the sprint, which helps to give a more detailed understanding of how much progress was made.
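The difference between the two modes can be made concrete with a small sketch. The issue records below, tuples of (initial estimate, final estimate, completed flag), are a hypothetical simplification:

```python
# Sketch of the two "Completed Work (initial)" modes; issue records are
# hypothetical: (initial_estimate, final_estimate, completed_flag).

def completed_initial(issues, apply_estimation_changes):
    """Completion rate of the initially committed scope."""
    if apply_estimation_changes:
        done = sum(final for initial, final, ok in issues if ok)
        total = sum(final for initial, final, ok in issues)
    else:
        done = sum(initial for initial, final, ok in issues if ok)
        total = sum(initial for initial, final, ok in issues)
    return done / total

committed = [(5, 8, True), (5, 5, True), (5, 5, False)]  # one re-estimated
print(round(completed_initial(committed, apply_estimation_changes=False), 2))
# → 0.67
print(round(completed_initial(committed, apply_estimation_changes=True), 2))
# → 0.72
```

The re-estimated issue (5 → 8 points) makes the second mode report a higher completion rate, which is precisely the nuance the two modes are designed to expose.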

Sprint Breakdown

Every issue in Jira comes with an extensive list of fields that provide additional context to the "work" described in the ticket. This context helps us mentally sort issues into the abstract folders and boxes where they belong. To bring this process into the Velocity Chart, we have added a Breakdown feature in the form of a table with collapsible sections. To view the sprint breakdown, click on a sprint or interval:

* If you need this feature in Burnup/Burndown charts, please let us know.

When broken down by issue parameters, the historical data shows the real picture of the selected interval on the chart, enriched with calculated metrics such as trend, average, and ratio (the latter is for Sprint Metrics only):

Users can manage metrics, chart type, and sprint breakdown configuration at the individual level. These user settings affect only their own view, so there is no need to change the shared chart settings. Adjustments are applied instantly:

 

 

When you select a segment in the breakdown table, you can view a filtered list of issues related to that segment. With this feature, you can use the chart to start a conversation, for example, during your retrospective meetings by bringing up a certain list of tickets:

Issue-level data

  1. The "Issue Type" in Jira allows for the breakdown of work types, identifying whether tasks are intended to deliver new user value, solve technical challenges, or address fixes.

  2. The "Status" represents the stage of the work at a specific time. The Initial Commitment captures the status at the start of the sprint; for other metrics, the status indicates where the work stood at the end of the sprint.

  3. "Priority," when used correctly, provides a perspective on the relative importance of the work. If not utilized effectively, all tasks may appear to have the same priority, making it difficult to analyze from this point of view.

  4. "Components" reflect the structural elements of a system, and in terms of delivery analytics, they can show where the focus is and the amount of effort allocated to different parts of a product or organization.

  5. "Labels" function similarly to components but are much more flexible. They are commonly used to represent business or organizational abstractions, allowing for more precise categorization and analysis of issues.

Org-level data

  1. In most cases, the board represents the team, making the team’s “scoreboard” accessible in charts that use Scrum or Kanban boards as data sources.

  2. Every issue is associated with a project. Therefore, when utilizing a Velocity Chart for a board that aggregates issues from multiple projects in Jira, you can view how many story points are allocated to each project.

  3. A breakdown by assignee provides individual insights into the delivery data, even when dealing with multiple teams.

  4. Epics can also span across multiple teams, allowing you to monitor the progress of these cross-team entities on a larger scale.

  5. Releases reflect a team's pulse, indicating how frequently it releases and how far it has progressed in the current release cycle. This insight gives an overview of the team's output rhythm and project milestones.

Benchmarking Chart

Just as you can't fully assess your health without comparing your temperature against the 36.6 °C mark on a thermometer, you can't evaluate the delivery health of a team, or across teams, without analyzing delivery data and setting benchmarks. This is essential for measuring the outcomes of coaching or self-organization practices.

So we weren’t surprised to receive numerous feature requests for completely new capabilities: detecting and monitoring healthy and unhealthy patterns, trends, and anomalies, as well as setting benchmarks and KPIs.

Fun fact: The word "benchmark" can be divided into two parts: "bench" and "mark." In this context, "bench" doesn't refer to a type of seating but rather to a flat surface or reference point used in measurements. "Mark" denotes the indicator or line made on that surface.

Our primary role as a reporting tool is to offer users a visual representation that helps them comprehend the data and serves as a starting point for meaningful conversations. That's why we've developed three new features for the Velocity charts and introduced a new Benchmarking chart equipped with an advanced statistics toolset.

Velocity Chart's New Capabilities

Using the median instead of the mean keeps outliers from skewing the "average," making the benchmark more precise.
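A quick illustration of why the median is steadier: one disrupted sprint drags the mean down sharply while barely moving the median. The velocities below are invented for the example:

```python
# Why the median makes a steadier benchmark: one outlier sprint skews
# the mean but barely moves the median (illustrative velocities).

import statistics

velocities = [24, 26, 25, 3, 27]   # one disrupted sprint
print(statistics.mean(velocities))    # → 21
print(statistics.median(velocities))  # → 25
```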

 

 

 

 

 

Relative target lines are an effective method for staying agile. They allow you to adjust your goals in relation to what you initially or finally committed to.
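A relative target line is just a percentage of the commitment rather than a fixed number; the 90% figure below is an arbitrary example, not a recommended value:

```python
# A relative target line tracks the commitment instead of a fixed number;
# the 90% figure is an arbitrary illustrative choice.

def relative_target(commitment, percent=90):
    return commitment * percent / 100

print(relative_target(30))  # → 27.0
```

Because the line is recomputed from each sprint's commitment, the target adjusts automatically as the commitment grows or shrinks.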

 

 

 

 

Setting the maximum chart height allows you to create dashboard gadgets with consistent heights, which makes visual pattern comparison easier.

 

 

 

 

Benchmarking Chart

This chart is similar to the Velocity Chart for Scrum Boards, but with a few notable modifications:

  • It includes advanced benchmarking tools alongside sprint metrics.

  • All teams are represented on a single chart, either as bars or lines.

  • Percentage values are used by default to prevent misinterpretation of absolute values.

 

 

 

Benchmarking Tool Settings

  • The average is calculated as a simple mathematical average.

  • The 25th percentile, median, and 75th percentile represent the lines below which 25%, 50%, and 75% of the values fall, respectively.

  • You have the option to display Deviation for one of the selected benchmarks. This will appear in the breakdown as spark bars, indicating the percentage of value deviation from the selected benchmark.

  • Users can manage the population sample by adding teams to the chart and selecting the number of intervals for calculation.

 

 

  • Adding a single team to the chart allows you to see how the "normal" Say/Do and other metric ratios vary in the team's pattern.

  • Adding two or more teams to the chart enables viewing metric patterns at the cross-team level.

  • The legend allows you to highlight or display any or all teams, which increases flexibility during live presentations.

  • The breakdown table includes benchmarks and deviations for the selected benchmark.
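The benchmark and deviation calculations described above can be sketched with the standard library. The sample ratios are invented, and this is only an approximation of the chart's statistics, not its actual code:

```python
# Sketch of the benchmarking calculations: percentile lines over a sample
# of per-sprint ratios, plus each value's % deviation from a benchmark.
# The sample ratios are invented for illustration.

import statistics

ratios = [80, 90, 95, 100, 70, 85]  # e.g. Say/Do % over recent sprints

# 25th / 50th / 75th percentile lines (quartiles)
q1, med, q3 = statistics.quantiles(ratios, n=4)

# Deviation of each sprint's value from the chosen benchmark (the median),
# expressed as a percentage, as the spark bars in the breakdown would show.
benchmark = med
deviation = [round(100 * (r - benchmark) / benchmark, 1) for r in ratios]
print(q1, med, q3)
print(deviation)
```

Note that `statistics.quantiles` interpolates between data points by default, so the lines generally fall between observed sprint values.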