July 02, 2012

Why Do We Look At The Wrong Metrics?

Our laser-like focus on campaigns diverts us from measuring what really matters, doesn't it?

These days, everything we look at is campaign based ... and for good reason, because all we ever do anymore is manage campaigns.

Hundreds and hundreds of omnichannel campaigns!

We measure how each campaign performed.  We "optimize" each campaign, attempting to achieve the campaign nirvana!

Campaign optimization is a lot like trying to win at the stock market every single day.  You have an endless array of real-time information, you hone your trading skills, and yet, you lose money on 49.8% of the days you participate.

How do I know this to be true?

Take a look at the table above.  For each of the past twelve years, this business measured customer loyalty metrics, and counted how many new customers the business acquired.

Look at annual repurchase rates among the twelve-month buyer file ... they barely change, do they?  This is typical.  We don't make customers "more loyal", even though we all have loyalty programs.  Don't believe me?  Run your own table.  Take a look at what your data suggests, for the past year.
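The "run your own table" exercise needs nothing more than customer IDs and order dates. A minimal sketch of the repurchase-rate calculation, using made-up order data (the customers, years, and resulting rate below are illustrative assumptions, not figures from the table above):

```python
from collections import defaultdict

# Hypothetical order log: (customer_id, order_year).
# In practice, pull this from your order-entry system.
orders = [
    (1, 2010), (1, 2011),
    (2, 2010),
    (3, 2010), (3, 2011),
    (4, 2011),
]

years_by_customer = defaultdict(set)
for customer, year in orders:
    years_by_customer[customer].add(year)

# Repurchase rate: share of one year's buyers who buy again the next year.
buyers = {c for c, yrs in years_by_customer.items() if 2010 in yrs}
repeaters = {c for c in buyers if 2011 in years_by_customer[c]}
repurchase_rate = len(repeaters) / len(buyers)
print(f"{repurchase_rate:.0%}")  # 2 of 3 buyers repurchased = 67%
```

Repeat the calculation for each pair of adjacent years and you have the multi-year trend the post describes.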

Orders per buyer are fairly constant.

Items per order vary, in the opposite direction of price per item purchased.  This is also common ... when items become more expensive, the customer purchases fewer of them.  When items become less expensive, the customer purchases more of them.  If you don't believe this, run your own table, find out for yourself!

Value, defined as repurchase rate * orders per buyer per year * items per order * price per item, is reasonably constant.  In twelve years, this metric bounces around by about 8%, but doesn't fundamentally move, does it?
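The value definition above is just four numbers multiplied together, which makes it easy to compute for your own file. A quick sketch with illustrative figures (none of these inputs come from the table):

```python
# Annual value per twelve-month buyer, per the decomposition in the text.
# All four inputs are made-up illustrative figures.
repurchase_rate = 0.40    # share of last year's buyers who purchase again
orders_per_buyer = 1.8    # orders per buyer, per year
items_per_order = 2.5
price_per_item = 40.00

value = repurchase_rate * orders_per_buyer * items_per_order * price_per_item
print(f"${value:.2f} per buyer per year")  # $72.00
```

If each factor drifts only a few percent a year, their product barely moves either, which is exactly what the twelve-year table shows.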

Think of all of the campaigns executed over a decade, all designed to "optimize" customer performance ... all of the analytical and marketing anguish poured out over making sure that the "call to action" is exactly right.  Think of the shift in customer behavior between 2001 and 2012.

And yet, the customer behaves exactly the same.

Without a steady diet of new customers, this business is stuck, and will always be stuck.

Sort of like your business.

Unless your repurchase rates are 60% or greater, it is terribly hard to move customer loyalty metrics.  The business survives and thrives because new customers are acquired at rates greater than in prior years.

Or, the business survives and thrives because merchandise productivity is improved.  This can easily be measured through a comp segment analysis.
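The post doesn't spell out the mechanics of a comp segment analysis, so the following is a hedged sketch of one common interpretation: hold the segment definition constant across two years (say, all twelve-month buyers) and compare demand per customer, so that any change reflects merchandise productivity rather than file growth. Segment names and figures here are illustrative assumptions:

```python
# (customers, annual demand) for the same segment definition in two years.
last_year = {"12-month buyers": (10_000, 1_250_000.00)}
this_year = {"12-month buyers": (10_200, 1_300_000.00)}

for segment in last_year:
    n0, d0 = last_year[segment]
    n1, d1 = this_year[segment]
    comp = (d1 / n1) / (d0 / n0) - 1.0   # change in demand per customer
    print(f"{segment}: {comp:+.1%} demand per customer")  # +2.0%
```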

When we constantly look at response rates, open rates, click-through rates, conversion rates, landing page traffic, shopping cart abandonment, and a veritable plethora of campaign-based metrics, we miss the real story ... that everything we're measuring fails to describe actual, annual, customer behavior.

Why do we look at the wrong metrics?


  1. Hi Kevin,

    I completely get your point, but the counter-argument is that if most businesses need new customers, then campaign optimization is the low-hanging fruit. It's the stuff they can directly impact in real time.

    I've read your book and the methods are excellent. What I ended up using the method for in Excel was channel analysis (rather than individual campaigns). Comparing customers we could define as having seen social media activity over the course of a year against customers who hadn't helped us determine whether the channel was effective. It allowed us to understand, based on buying behavior, that social worked, even though direct response rates suggested it was a poor channel.

    The problem was that it was extremely laborious to pull the data together and not achievable at scale (it took months to define and get the data).

    So I think the answer to your question of why we look at the wrong metrics is that we can't get the data that makes your approach meaningful. Many businesses I've seen couldn't even pull together the spreadsheet you show at the top of this page, never mind break it down into a segmentation model they can then learn from.

    I think for data mining to be truly valuable, the information it produces has to feed an automated engine of communications across a lifecycle model. Channel analysis, for instance, or deep dives then become vital to that program. But until we get customers to that stage, and get them the right tools, I think we'll struggle to push them forward.

    Just my opinion, and I'm sure you have some good examples that prove me wrong, but ask yourself: is it tens of companies? Hundreds? Tens of thousands that you know of that can currently employ the methods you talk about and turn them into a win for their business?

    I am not advocating short term thinking - I agree with you. Just trying to illustrate why I think the problem exists.


  2. @Steve, just out of interest: do you work agency-side or in-house?

    The problems you describe sound similar to the difficulties I have (I'm ashamed to say the parts of my analytics toolkit that I actually use have hardly improved in the years I've been reading this blog).

    I tend to blame these problems on the difficulty agency employees have accessing certain types of data. If you're having the same trouble client-side, then I'll have to find a different excuse (or step up my game :-))

  3. This ends up being an issue of aptitude.

    I have clients that do not use this style of analysis, but there is one person within the company that has the aptitude to do the work. When you show this person how to do this, the person gets busy, heads to the order entry system, pulls data, and makes it happen.

    If you use a tool like Google Analytics, you have a different set of skills ... highly valuable, no doubt, but not the kind of skills that allow you to dive into unorganized data.

    I find that almost every company I visit employs somebody who has the aptitude to do this kind of work. I find that companies with annual sales > $100,000,000 tend to do this kind of work.

  4. Anonymous, 9:50 AM


    I am in an agency. It's not getting access to the data that I find the main issue; it's getting an output at scale. The data could tell you to do xyz to drive more customers, and if you can then simplify that and get it done at scale across the organisation, then that's great.

    But in a $100M company where only a couple of people have the aptitude to do the work, spreading the knowledge beyond them to the different departments whose buy-in you need to drive a real business impact is the real challenge. So you're talking about a big change-management job to get the right metrics followed. In my opinion, that's far harder than mining unorganized data (which, as we all know, isn't easy either). So there you have two reasons why the metrics aren't flying.

    We don't have to stop looking at data just because management isn't buying what is being sold.

    In 1999 at Eddie Bauer, my CEO and my VP entered my office, closed the door, and told me to no longer publish information like this. They told me it was dangerous that I was sharing such startling information, that bad news should only come from the top.

    Needless to say, I continued to share the information. I was not fired.

    You don't need to get buy in. You go get buy in.

    1. Easy to say "you go get buy-in." Harder to do. You obviously had strong data, but you also clearly understood the politics/situation/consequences at Eddie Bauer and played them to your advantage. Or, if you didn't, you were very lucky.

      Either way that's a talent most data miners don't have.

      You asked why these metrics weren't being followed, and I'm suggesting reasons. I am not disagreeing with your solutions, either, nor am I suggesting that we all give up on the education effort. As Stephane said, this is a great post.

      What I am also suggesting though is that there is another way that might alleviate the problem.

    2. I think most analysts have the skill, but it isn't a skill that is valued in the measurement community. The measurement community would rather badger people about the myriad ways that pie charts are abused.

      In 1992, I overheard my CEO tell my boss that I was a great analyst, but I had no business skills, and therefore, I wouldn't be considered for additional responsibility. So, I had a choice. I could continue to be an analyst, or I could change. I changed. It took nearly a decade of change to go from being an analyst to being a VP.

      We need leadership that helps analysts acquire these skills. You clearly have these skills, or you wouldn't articulate your points so well. Now we need to encourage others to broaden the metrics they use, and the communication skills required to sell the metrics and solutions. It can be done!

  6. Anonymous, 12:12 PM

    Great post, Kevin (as always!)
    So the needle doesn't move despite all the measurement and optimization efforts... Just playing devil's advocate here, but wasn't there a recession at some point? What if there had been no optimization efforts? Would this business still be alive? Just asking :)

  7. Sure, there would probably have been a decline in the performance of the customer file. No doubt.

    The problem is, of course, that your typical measurement expert isn't looking at enterprise-specific, annual metrics, and hasn't been given the tools to do so.

    I run into this all of the time. Online-specific campaign-based analytics look great, but fail to show that customer-focused annual metrics indicate that the total business (across all channels) has flat-lined.


