By David Sprinkle
A recent client analysis taught columnist David Sprinkle that marketers must trust the data, not necessarily their instincts!
I’ve done a bunch of pretty technical posts in the past few months, so today I thought I’d go in a different direction. My team recently put together an interesting piece of analysis for one of our clients in the hospitality industry, and while the lessons learned are pretty specific to them (and confidential in any case), I think the results reinforced a number of core lessons about analysis that are relevant to any organization trying to use digital data to improve its business.
Our client recently added a widget to their site that provides more-or-less real-time customer reviews from a popular vertical-related reviews site. Each review was a link, so visitors to the site could click through to the reviews site. The question we were posed was, might providing these off-site links be a bad idea? Once prospective customers clicked through to the reviews site, were they getting distracted or lured away by our client’s competitors and never returning to buy our client’s product?
I thought this was an interesting question. The way we approached the analysis was to compare the purchase behavior of visitors who clicked on those links to the purchase behavior of visitors who did not. We looked at seven-day and 30-day windows (using sequential segmentation) to see whether they eventually returned and bought. I, and most of my team, expected to see at least some drop-off in the conversion rate—the question was how much, and whether the benefit of those on-page reviews for visitors who saw the reviews but didn’t click on them might offset that overall drop in conversion rate.
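To make the shape of that comparison concrete, here’s a minimal sketch in Python with entirely made-up data. (The real analysis used sequential segmentation in an analytics tool, and the visitor records, field names, and numbers below are hypothetical, purely for illustration.)

```python
from datetime import datetime, timedelta

# Hypothetical visitor log: whether each visitor clicked the reviews widget,
# when they first visited, and when (if ever) they purchased.
visits = [
    {"id": 1, "clicked": True,  "first_visit": datetime(2013, 3, 1), "purchased": datetime(2013, 3, 4)},
    {"id": 2, "clicked": True,  "first_visit": datetime(2013, 3, 1), "purchased": None},
    {"id": 3, "clicked": False, "first_visit": datetime(2013, 3, 2), "purchased": datetime(2013, 3, 20)},
    {"id": 4, "clicked": False, "first_visit": datetime(2013, 3, 2), "purchased": None},
    {"id": 5, "clicked": False, "first_visit": datetime(2013, 3, 3), "purchased": None},
]

def conversion_rate(segment, window_days):
    """Share of visitors in the segment who purchased within the window."""
    window = timedelta(days=window_days)
    converted = sum(
        1 for v in segment
        if v["purchased"] is not None and v["purchased"] - v["first_visit"] <= window
    )
    return converted / len(segment)

# Segment on the click, then compare conversion within each window.
clickers = [v for v in visits if v["clicked"]]
non_clickers = [v for v in visits if not v["clicked"]]

for days in (7, 30):
    print(f"{days}-day window: "
          f"clickers {conversion_rate(clickers, days):.0%}, "
          f"non-clickers {conversion_rate(non_clickers, days):.0%}")
```

The whole analysis boils down to that kind of two-segment, two-window comparison; the interesting part isn’t the arithmetic but what the gap between the segments turns out to mean.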
What we found instead was an extremely pronounced trend in the opposite direction: visitors who had clicked on those reviews — while a small percentage of the overall visitors — were far more likely to come back and buy within both the seven-day and 30-day timeframes. They were also far more loyal customers according to a number of related metrics, like the number of visits to the site, average order value, and number of repeat purchases.
So to make a long story shorter, the results were the complete opposite of what we expected! And I think there are a few key lessons this analysis illustrated, which are worth keeping in mind for any kind of analysis that shows dramatic results.
Lesson One: Correlation Does Not Imply Causation!
We expected to find that action A (clicking through to the reviews site) caused a drop in B (the conversion rate). In fact, action A proved to be a good indicator of an increase in B. But an indicator is not a cause.
Based on our results, a literal-minded person might conclude that if they redirected ALL of their visitors to the reviews site, it would cause a huge jump in their conversion rate. But that’s obviously ridiculous. This brings me to:
Lesson Two: Understand Your Purchase Cycle
What we concluded was happening here is that visitors who clicked on the reviews are people who are further along in their decision cycle, and more serious about committing to buying our client’s product. In a sense, checking out the reviews was a way of doing due diligence and reinforcing a choice they had already largely made.
So while we originally thought we might be losing some conversions — and maybe, at the margins, we are — what we’re really seeing is a strong indicator of visitor intent to eventually buy. Our customers aren’t stupid, and they are most likely checking out third-party review sites and our competitors whether we make it easy for them or not! It’s much better to have a strong product and market it well than to pretend we can trap people on our website and trick them into completing our conversion funnel.
Lesson Three: So What?
This analysis was one of those things that makes you go “hmm.” Wow, an unexpected result. That’s interesting. But…so what? What action does this prompt us to take? If you’re not taking action on the results of your analysis, there’s really no point in doing it.
One obvious takeaway was that we shouldn’t remove the widget. Simple! But what else can we learn from this? Are there opportunities to tease out new insights about our target audience personas? Could we use these indicators of intent to target these highly engaged visitors with personalized content, remarketing, or promotions? Does answering this question spur us to ask new questions about whether we could slice and dice this audience even more finely, or look for other signals from the remaining visitors so we can target them better?
Certainly, this also drives home the importance of good reviews (and a strong product!) — maybe we need to check how we look on the third-party review site and make sure our business profile is as good as we can (ethically) make it. My key point here is, it’s especially easy when you deliver an unexpected result for everyone to say, “Wow, I learned something today” and then not act on it. Don’t let it happen to you!
Lesson Four: Trust the Data, Not Your Instincts
Virtually everyone involved in this project expected to see the same results. If I’d been the guy in charge of giving the green light to adding that widget, I would have probably been very cautious. I have to admit, there have been situations where someone came up with an idea that I shot down because I just KNEW it wouldn’t work.
Hey, I’m an expert at this whole site optimization thing, and even I can be dead wrong. So instead of smothering ideas at their birth, keep an open mind. When feasible, test those ideas and see whether your hypotheses hold up or not. You never know when an unexpected result will turn up that fundamentally changes the way you think about your business and your customers.
I hope this has been informative despite the necessary haziness around the details. If you have questions about the methodology or other thoughts on key lessons learned through analysis, leave comments!
An expert on analytics architecture and integration, David specializes in the innovative design and implementation of analytics solutions that deliver both global “big picture” insights and detailed performance metrics. David leads Acronym’s Analytics Practice as well as its Adobe Preferred Partnership, wherein Adobe subcontracts work to David’s team.
David also has extensive experience working with major analytics, bid management and reporting platforms, and is noted for his expertise in integrating such solutions into companies’ larger marketing and business infrastructures. David is a Certified Omniture Professional and a veteran industry speaker. His client portfolio includes such leading brands as Four Seasons Hotels and Resorts, SAP, The Tribune Company, HP, Scholastic and Humana, among others.