Are you sure your analytics mean what you think they mean?
I saw a post on LinkedIn the other day asking why B2B companies don’t talk to their customers.
They look at analytics, the post claimed, and they’d rather believe data from Google than talk to real people — or even listen to what they’re saying.
- Is this true?
- Is it unique to B2B?
- Do we trust data more than people?
- Is this an introvert vs. extrovert thing?
- Do we think data is more reliable than what real people say?
I’m imagining a conversation where someone says, “Why don’t you ask your customers what brought them to your service?” to which the “data-driven marketer” replies, “What good is that? They won’t remember. I can get that number from my analytics.”
But can you?
Do marketers believe that cold, hard data from analytics is more valuable than people’s admittedly squishy and fallible memories?
There’s certainly some truth to that, but attribution is hard to establish no matter how you go about it, and technology solutions won’t necessarily give you the right answer.
As I thought about this, a few contrasts came to mind.
- Intuition vs. measurement
- Statistics vs. anecdotes
- Self-reported data vs. observed data
- Story vs. analysis
In what situations can data mislead if we don’t have the human context? Here are some possible examples.
We usually assume that a high bounce rate indicates a bad user experience. But what if users found exactly what they wanted on that one page? Maybe it was all they needed.
Turning that on its head, people often read a high number of page views per session as healthy engagement, when it might mean visitors can’t find what they’re looking for.
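To make that ambiguity concrete, here’s a minimal sketch with entirely made-up session records. The field names are my own assumptions, not any particular analytics product’s schema, and estimating time-on-page for a single-page visit usually requires extra instrumentation (such as a heartbeat event):

```python
# Hypothetical session records; the field names are assumptions, not a
# real analytics schema. "dwell" is seconds spent on the final page.
sessions = [
    {"pages": 1, "dwell": 95},  # read one page thoroughly, then left
    {"pages": 1, "dwell": 4},   # left almost immediately
    {"pages": 7, "dwell": 10},  # many pages: engaged, or lost?
    {"pages": 2, "dwell": 40},
]

bounces = [s for s in sessions if s["pages"] == 1]
bounce_rate = len(bounces) / len(sessions)
pages_per_session = sum(s["pages"] for s in sessions) / len(sessions)

# Crude heuristic: a long dwell before a single-page exit may be a
# satisfied reader, not a failure. The 30-second cutoff is arbitrary.
satisfied_bounces = [s for s in bounces if s["dwell"] >= 30]

print(f"bounce rate: {bounce_rate:.0%}")          # 50%
print(f"pages per session: {pages_per_session}")  # 2.75
print(f"bounces that look satisfied: {len(satisfied_bounces)} of {len(bounces)}")  # 1 of 2
```

The headline numbers are identical whether those single-page visitors were delighted or gave up; only the dwell-time segmentation, or an actual conversation with users, hints at which story is true.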
If you measure the popularity of a page by its page views, it may be that lots of people land on it for some accidental reason and nobody is satisfied with what they find there. (For example, maybe a quirk in your search results sends people to that page.)
While we’re on the topic of search, heavy use of your site’s search function might mean it’s a valued feature, or it might mean your site is organized so poorly that people search because browsing fails them.
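Here’s another hypothetical sketch along the same lines. Assuming you log on-site searches along with the page each search was issued from (again, invented field names, not a real schema), you can at least ask whether people search because browsing failed them:

```python
from collections import Counter

# Hypothetical on-site search log; fields are invented for illustration.
searches = [
    {"query": "pricing", "issued_from": "/products/widgets"},
    {"query": "pricing", "issued_from": "/products/gadgets"},
    {"query": "pricing", "issued_from": "/"},
    {"query": "api docs", "issued_from": "/support"},
]

# If one query dominates and is mostly issued from deep pages, people are
# likely hunting for something the navigation should surface directly.
top_queries = Counter(s["query"] for s in searches).most_common(3)
from_deep_pages = sum(1 for s in searches if s["issued_from"] != "/")

print(top_queries)  # [('pricing', 3), ('api docs', 1)]
print(f"searches started beyond the homepage: {from_deep_pages} of {len(searches)}")  # 3 of 4
```

If one query dominates and most searches start from deep in the site, a prominent link in the navigation probably helps more than a better search box.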
Or let’s say you run a poll rating “overall satisfaction” with your site, and you get a good score. That’s great, but visitors might still be frustrated with specific key elements or functions.
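A toy example of why that happens, with invented names and numbers: a survey that only asks about the site overall can average away a badly broken task.

```python
from statistics import mean

# Hypothetical survey responses: one overall rating plus two task-level
# ratings, all on a 1-5 scale. Names and numbers are invented.
responses = [
    {"overall": 5, "find_docs": 5, "checkout": 2},
    {"overall": 4, "find_docs": 4, "checkout": 1},
    {"overall": 4, "find_docs": 5, "checkout": 2},
]

for field in ("overall", "find_docs", "checkout"):
    print(f"{field}: {mean(r[field] for r in responses):.1f}")
# overall: 4.3   find_docs: 4.7   checkout: 1.7
```

The overall score looks healthy while the checkout task is clearly failing, and you’d never know unless the survey (or a conversation) gets specific.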
The point is that you might be telling the wrong story with your data if you don’t have the human context. You need to check your assumptions against actual customer experience to make sure you’re not missing something.