It might be the most asked question I get, and for good reason ...
Question: "Kevin, what the heck are you talking about when you say that you shouldn't test something that has a short half-life?"
First, let's define half-life: the amount of time it takes for a quantity to decay to half of its initial value. Wikipedia has a thorough explanation if you want the full story.
Now let's view this from a testing standpoint. Maybe you want to test a big green "sign up now" button instead of a small blue "click here for more information" button. You run a test, and you learn that the big green button improves conversion rate by 34%.
Do you assume that the big green button will outperform the control by 34% forever?
If you are measuring "half-life", you are measuring the amount of time it takes for the 34% lift to decay to a 17% lift ... or worse, to a 0% lift.
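To make the arithmetic concrete, here is a minimal sketch that projects how a lift erodes, assuming exponential decay. The decay model and the 3-month half-life are hypothetical numbers for illustration, not findings from any actual test.

```python
def lift_at(t_months, initial_lift, half_life_months):
    """Projected lift after t months, assuming the lift decays
    exponentially with the given half-life (a modeling assumption)."""
    return initial_lift * 0.5 ** (t_months / half_life_months)

# A 34% lift with a hypothetical 3-month half-life:
print(lift_at(0, 34.0, 3.0))  # 34.0 at launch
print(lift_at(3, 34.0, 3.0))  # 17.0 after one half-life
print(lift_at(6, 34.0, 3.0))  # 8.5 after two half-lives
```

Under this model, the big green button's advantage is cut in half every three months, which is exactly the decay the re-testing discussion below is meant to detect.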
So often, we test concepts that either have a minimal half-life, or are only valid for the time and audience when they were tested. In other words, the concepts we are testing are fleeting in nature; they have no staying power.
There are concepts that have significant staying power. In apparel merchandising, you know that women buy merchandise for men, so you can improve conversion (online) or reduce expense (offline) by exclusively advertising women's merchandise. This has a half-life just shy of infinity; you'll be more profitable for the duration of your career by testing strategies that capitalize on this well-known fact. We don't read about these strategies and test findings, because the strategies are so profitable that they yield a significant competitive advantage to the folks who possess the knowledge.
Many concepts have minimal staying power, and, by definition, a minimal half-life. We read about these strategies and test findings all of the time, because their short half-life limits the competitive advantage to the audience that saw the message during the timeframe when the test ran.
This is the phenomenon I refer to when I mention "half-life" in my writing. The best way to determine if you have a half-life issue is to re-test your strategy three months, six months, or twelve months later, observing if you still have a meaningful lift over your control group.
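If you run that re-test and the lift has shrunk, you can back out an implied half-life from the two measurements, assuming exponential decay. The numbers below are hypothetical, chosen to match the 34% example above:

```python
import math

def estimate_half_life(t_months, initial_lift, retest_lift):
    """Infer an implied half-life from a single re-test, assuming the
    lift decays exponentially (an assumption; real decay is messier)."""
    return t_months * math.log(0.5) / math.log(retest_lift / initial_lift)

# Hypothetical re-test: 34% lift at launch, 17% lift six months later.
print(estimate_half_life(6, 34.0, 17.0))  # 6.0 months
```

A six-month half-life tells you the tactic is worth running now, but not worth building a long-term strategy around.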