Everything You Need to Measure Your Team's Agile Marketing Performance

by Andrea Fryrear, 8 min read

This year, our special guest star on all things Agile marketing, Andrea Fryrear, will be providing Agile marketing newbies with a monthly step-by-step guide to their first year as an Agile marketer. This post is the tenth in the series. Enjoy!


As marketing technology grows more sophisticated, we marketers are becoming ever more adept at measuring our work. We get insight into content performance on a piece-by-piece basis and track a prospect’s behavior from interest to purchase to advocacy.

Even social media channels, long a suspicious black box of “brand building,” have had their inner workings revealed by the bright light of marketing metrics. 


See "4 KPIs Every Creative Team Should Track" for tips on deciding what your marketing team should be measuring.


But what about Agile marketing?

Do we leave behind concerns about accountability and ROI when we embrace continuous flow or iterative delivery? Certainly not.

Just like their traditional counterparts, Agile marketing teams need a system for measuring their performance.

Unlike traditional marketing teams, however, Agile marketers concern themselves not just with the results of what they deliver, but also the way in which it was delivered.

Agile metrics can give us insight into how the process is working so we can clearly see areas for improvement and intervene before small hiccups become major roadblocks.

So in the final installment of our year of Agile marketing, we’re investigating the metrics that make high performance possible.

A Word About KPIs and ROI

Before we jump into exploring ways to measure Agile teams specifically, it’s important to remind ourselves that Agile marketing projects are still beholden to good old fashioned measurements of marketing success too.

Business metrics, KPIs (key performance indicators), and basic proof of return on investment (ROI) still matter in an Agile environment, so we have to keep an eye on those.

These data points are particularly important during an Agile marketing pilot, when you’re looking to prove the impact of a serious change, namely the switch to managing work using Agile principles.

If you haven’t previously been tracking basic performance metrics for your marketing, it’s going to be tough to quantify the effects of Agile marketing.

So even before you choose the right Agile metrics, make sure you’ve laid the foundation for measuring your marketing.

Agile Marketing Metrics

Now that we’ve covered the basics, it’s time to get into metrics for Agile marketing teams.

We’re going to cover five different options, most of which will apply to just about any kind of team. For those that are best suited to one particular methodology, we’ll be sure to note that during the explanation.

When choosing which of these to track, remember that it’s difficult to optimize something you don’t measure. But you also don’t want to waste hours pulling numbers.

Consider what you’ll do with the data you collect, and whether your current tool set provides you easy access to the information you’re after. If you don’t plan to act on your metric, or it’s complicated to collect, you may want to try another.

Velocity of an Agile Marketing Team


Velocity simply measures the amount of work that a team completes during a set amount of time, usually a sprint or iteration. It’s usually gauged in points, which means it requires the team to size each piece of work that they put into their sprint backlog.

Any Agile team that uses timeboxed iterations can measure their velocity. This means both Scrum and Scrumban teams can easily make use of this metric, while it doesn’t often work very well in a Kanban implementation.
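If you want to see the arithmetic in action, here’s a minimal sketch in Python, using made-up story point data (nothing here depends on any particular tool): tally the points for everything the team actually finished in a sprint, and optionally average across sprints for forecasting.

```python
# Hypothetical sprint data: story points for items the team actually finished.
completed_points_by_sprint = {
    "Sprint 1": [3, 5, 2, 8],
    "Sprint 2": [5, 3, 3, 2, 5],
    "Sprint 3": [8, 5, 3],
}

def velocity(points):
    """Velocity for one sprint: the sum of points completed in that sprint."""
    return sum(points)

for sprint, points in completed_points_by_sprint.items():
    print(f"{sprint}: velocity = {velocity(points)} points")

# A simple average smooths sprint-to-sprint noise when forecasting future sprints.
velocities = [velocity(p) for p in completed_points_by_sprint.values()]
print(f"Average velocity: {sum(velocities) / len(velocities):.1f} points per sprint")
```

Whether you use a script like this, a spreadsheet, or your Agile tool’s built-in report, the number only has meaning relative to that one team’s own history.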

At its core, the goal of measuring velocity is to see whether the team is improving over time, but it can be misused.

If velocity becomes a bludgeon with which to beat the team, they’re likely to start gaming their estimations so they appear to be doing more even if they aren’t. Make sure velocity is simply a health check for the team, not an unreasonable expectation.

Image source: Atlassian.com.

As you track velocity, look for erratic swings. If these become the norm rather than the exception, help the team investigate the root cause. Some possible retrospective questions might include:

  • Are there unforeseen challenges we didn't account for when estimating this work? How can we better break down work to uncover some of these challenges?
  • Is there outside business pressure pushing the team beyond its limits? Is adherence to marketing best practices suffering as a result?
  • As a team, are we overzealous in forecasting for the sprint?

Burndown Chart

Velocity is a trailing indicator—it won’t give you any data about an iteration until it’s already over. A burndown chart, on the other hand, gives you day-to-day data about how the team is doing throughout the sprint.

Like velocity, an effective burndown chart requires the team to estimate the work it takes on.

Suppose the team has committed to 160 points. As time goes on and they complete some of their work, the chart “burns down” until the sprint is over. At that point they’ll hopefully have zero work remaining and the bars representing outstanding points will have disappeared.
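If you’re curious what your tool is doing under the hood, the math is simply “points remaining at the end of each day.” Here’s a rough sketch with hypothetical numbers for that 160-point sprint:

```python
# Hypothetical sprint: 160 points committed over a ten-day iteration.
committed_points = 160
points_completed_per_day = [0, 12, 20, 8, 25, 15, 30, 18, 20, 12]

remaining = committed_points
for day, done_today in enumerate(points_completed_per_day, start=1):
    remaining -= done_today
    # The "ideal" line burns down evenly from the commitment to zero.
    ideal = committed_points * (1 - day / len(points_completed_per_day))
    print(f"Day {day:2d}: {remaining:3d} points remaining (ideal: {ideal:5.1f})")
```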

Burndown charts are highly useful on young Agile teams who need some guidance on how much they can really get done within each sprint. They can tell us if:

  • The team isn’t committing to enough work because their burndown chart “burns down” early sprint after sprint.
  • The team is committing to too much because they have multiple points left on their chart at the end of each sprint.
  • The team isn’t breaking work down into small enough pieces because the burndown line makes steep jumps rather than a gradual decline.
  • External interruptions are derailing the team because the vertical line representing points goes up mid-sprint as new work gets added.

Once a team stabilizes and delivers a steady amount of work, the burndown chart may become less useful. That said, it’s difficult, if not impossible, to reconstruct these numbers retroactively, so keeping them on the radar can help surface problems before they get out of hand.

Quarterly or Strategic Burndown Chart

Burndown charts provide insight into how things are going during a particular iteration, but we also need to make sure we’re incrementally moving towards bigger goals.

On a traditional Agile software team, this trajectory could be tracked via an epic or release burndown chart; on an Agile marketing team, we might prefer to call it a strategic or quarterly burndown chart.

This is a longer term look at what the team is doing and how it’s related to the marketing strategy over time.

Image source: Atlassian.com.

You can see in the chart above that a few new items have been added to the team’s larger backlog after each sprint. While we wouldn’t allow that to happen to a sprint backlog in the middle of an iteration, it’s completely acceptable in this context.

Over time managers and leaders may decide to add or remove some items based on what they learn, or based on changing organizational objectives. As that happens the strategic burndown keeps everyone aware of “the ebb and flow of work” within the long term strategy.

Monitoring the strategic burndown chart can give you insight into whether the team’s work is actively contributing to larger goals, or if they’re just churning through busy work.
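Conceptually, the strategic burndown is just the sprint-by-sprint remaining scope, adjusted for whatever gets added or removed along the way. A quick illustrative sketch with hypothetical numbers:

```python
# Hypothetical quarterly backlog: points completed and points added each sprint.
starting_scope = 400
sprints = [
    {"completed": 45, "added": 10},
    {"completed": 50, "added": 30},
    {"completed": 40, "added": 60},  # scope growing faster than delivery
    {"completed": 48, "added": 5},
]

remaining = starting_scope
for i, sprint in enumerate(sprints, start=1):
    remaining = remaining - sprint["completed"] + sprint["added"]
    net = sprint["completed"] - sprint["added"]
    flag = "  <- scope grew faster than the team delivered" if net < 0 else ""
    print(f"Sprint {i}: {remaining} points remaining (net burn {net:+d}){flag}")
```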

Red flags to watch for on this chart include:

  • Strategic forecasts aren't updated as the team delivers work.
  • No change is visible after several iterations.
  • Chronic scope creep, which may be a sign that marketing leaders don’t fully understand the problem the Agile marketing team is trying to solve.
  • Strategic scope grows faster than the team can absorb it, meaning they’ll never be able to complete a strategic goal.

Cycle Time, or “Hands on Keyboard” Time

So far our metrics have been mostly concerned with work at the team level, but we may also want to know more about how long it takes the team to finish smaller projects. Cycle time can deliver this insight.

Cycle time simply measures how long it takes something to get done from start to finish. Joel Bancroft-Connors of AgileConnection calls it “hands on keyboard time.”

Measuring cycle time is best done automatically via your Agile tool of choice, but even measuring manually alongside a physical task board will give you useful data.

Just keep track of how long a project/task/story takes to get into your “done” column once it’s pulled from the backlog. The average for all your work is your cycle time.
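In code, that boils down to recording two dates per item and averaging the difference. A minimal sketch with hypothetical work items and dates:

```python
from datetime import date

# Hypothetical work items: when each was pulled from the backlog and when it was done.
work_items = [
    {"name": "Landing page copy", "started": date(2018, 3, 1), "done": date(2018, 3, 5)},
    {"name": "Email nurture sequence", "started": date(2018, 3, 2), "done": date(2018, 3, 9)},
    {"name": "Webinar promo graphics", "started": date(2018, 3, 6), "done": date(2018, 3, 8)},
]

cycle_times = [(item["done"] - item["started"]).days for item in work_items]
average_cycle_time = sum(cycle_times) / len(cycle_times)

for item, days in zip(work_items, cycle_times):
    print(f"{item['name']}: {days} days")
print(f"Average cycle time: {average_cycle_time:.1f} days")
```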

It’s measured at the level of a user story (or a task or project, depending on how you structure your Agile board), so it’s a much more granular metric than anything we’d see on a burndown chart.

Cycle time also gets us much more immediate feedback, because we can see the results of any changes right away.

When we adjust the system, cycle time will either increase or decrease right away, so we know if the experiment succeeded or failed in a short time. Our goal is cycle time that’s both consistent and short, regardless of the kind of work being done.

Consistent cycle time means we can accurately predict when we’ll be able to deliver individual pieces of work, whether we’re using continuous flow or sprint-like timeboxes to handle larger projects.

Agile Team Happiness


Our final metric isn’t strictly related to the Agile process at all. Instead, it’s a look at the team’s happiness.

There’s no need for any fancy tracking system here. Just start your retrospectives by asking each member of the Agile team to score their happiness on whatever scale you choose. Then keep track of the average from one retrospective to the next.
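If you like, a few lines of code (or a simple spreadsheet) can keep the running averages for you. A tiny sketch, assuming a hypothetical 1-to-5 scale:

```python
# Hypothetical 1-5 happiness scores gathered at the start of each retrospective.
retro_scores = {
    "Retro 1": [4, 4, 3, 5, 4],
    "Retro 2": [4, 3, 3, 4, 4],
    "Retro 3": [3, 2, 3, 3, 2],
}

previous = None
for retro, scores in retro_scores.items():
    avg = sum(scores) / len(scores)
    trend = "" if previous is None else (" (down)" if avg < previous else " (up or steady)")
    print(f"{retro}: average happiness {avg:.1f}{trend}")
    previous = avg
```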

Happy teams create better marketing, which will deliver more satisfied customers, so this is in fact an important business metric.

But it’s also important to recognize that if all your process metrics are perfect but the team is deeply unhappy, you’re headed for some trouble. The team may be masking its issues while maintaining high performance, meaning burnout and high turnover could be on the horizon.

New team members invariably disrupt an Agile marketing unit, so we want to keep people happy and the team configuration stable whenever possible.

How Long to Track Agile Metrics

You may decide to track one or all of the metrics we’ve discussed so far, but whatever combination you select make sure you get more than a brief snapshot.

Track metrics for at least six months to identify larger trends.

That doesn’t mean you can’t take any action on your findings until six months have passed (that wouldn’t be very Agile). But this ongoing look at different data points can help you see how one impacts the others.

For example, take a look at these two graphs illustrating the planned to done ratio for a team and its happiness:

Image source: Atlassian.com.

The relationship between productivity and morale is obvious, and it’s likely this team has a tough retrospective coming up. But without the long-term holistic view of the team and its performance this interplay wouldn’t be clear.

Metric Pitfalls to Avoid

Before you start measuring your Agile marketing teams, I want to leave you with two warnings:

  1. Keep metrics within teams. Neither teams nor managers should compare metrics across teams, and that goes for everything from velocity to happiness. These metrics are for individual teams only.
  2. Measure, don’t target. When a measure becomes a target, it ceases to be a good measure. The goal of Agile metrics is to spot and understand trends, not to create magic numbers the team tries to hit at all costs.

To see the first post in our first year of Agile marketing with Andrea Fryrear series, click here.
