March 29, 2008


You are the web analytics expert at your company. On any given day, you expect 4% of your visitors to purchase merchandise.

Yesterday, your reporting suggests that the conversion rate was 6%. That's not an outcome you would normally expect to see.
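Whether a 6% day against a 4% baseline is genuinely surprising depends on traffic volume, which the scenario doesn't specify. As a rough sketch (the visitor counts of 100 and 10,000 are assumptions for illustration), a normal approximation to the binomial shows the same 6% reading can be ordinary noise on a low-traffic day and a clear outlier on a high-traffic one:

```python
import math

def conversion_z_score(observed_rate, baseline_rate, visitors):
    """Z-score of an observed daily conversion rate against the
    long-run baseline, via the normal approximation to the binomial."""
    se = math.sqrt(baseline_rate * (1 - baseline_rate) / visitors)
    return (observed_rate - baseline_rate) / se

# Low traffic: 6% vs. a 4% baseline is about 1 standard error away --
# well within day-to-day noise.
print(conversion_z_score(0.06, 0.04, 100))

# High traffic: the same 6% is many standard errors out, and worth
# investigating (in either direction).
print(conversion_z_score(0.06, 0.04, 10000))
```

A common rule of thumb treats |z| above roughly 2 or 3 as anomalous; the point is that the visitor count, not the percentage alone, determines which reaction is warranted.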

If you shared this finding with leadership, what would their response be?
  • Something is wrong. Did you screw up? Is something wrong with our reporting system? Go find out what went wrong and fix the problem.
  • This is great! I accept the results.
  • This is great! Now go find out why this happened. Can we do it again?
Of course, some context is required. If the employee routinely makes mistakes, if reporting is frequently wrong, or if the employee has an agenda to hype, the response from leadership is predictable.

But if the employee is trustworthy, and the source of the data is trustworthy, what response would you typically hear from your leadership team?

I am continually surprised that we, as honest, hard-working individuals, err on the side of something being wrong. If findings don't match our worldview, or don't fall into a pre-defined set of best practices established by industry veterans, we humans are often more likely to reject the findings than to accept them.

We look for all the reasons the findings could be wrong. When we exhaust those reasons, we demand that the findings be replicated via controlled tests. We advise folks not to listen to the individual who produced the unusual findings until the results can be validated. If the results are threatening to an industry, we'll discredit the individual who announced the findings, or block the access that individual might have to audiences who might want to hear about the findings.

The older the industry is, the more likely we are to hold on to what we knew to be true. Maybe we need to do the opposite, to openly consider other options. Maybe we can act before we have proof.


  1. Not sure that I wouldn't ask the analyst to check for other data anomalies - just to make sure I wasn't, say, missing data from a page that might tend to capture "hits" from customers who don't purchase. Were my visits down? Were they up? Do I see duplicate transaction records? Do the dollar amounts I see match complementary records from finance?

    The reason to ask these questions isn't that the analyst is an idiot. It's that online measurement systems generate a lot of data on visitors, and there are so many ways for that data to be wrong...and so few indicators that data is wrong OTHER than abnormal volumes, conversion rates, or customer pathing.

    Also, there may be some obvious reason why my site conversion rate would spike - from statistical noise to external factors...arrival of that "economic boost" check...some free press...or maybe that free trial offer on the homepage. Most of these factors can be accounted for...but they should be considered when reviewing the data.

    And I'd make sure I understood the why - or had a plausible explanation for it - before I ran to the VPs with it.

  2. Sure, that all works. Good suggestions.


