Marketers are adrift in a sea of data. Do we need a bigger boat? Some of the most common online data pitfalls are easy to identify, but hard to avoid. In Part One of this two-part series, we look at why 72.8% of surveys aren't valid -- and the phenomenon of testing against the wrong metric.
Data Trap #1: Most surveys are junk, but some do real damage.
Now that every company is also a publisher, the survey is a set piece in content marketing. Unfortunately, most surveys aren't valid or well designed, but these days marketers know to view surveys skeptically until they understand the source, methodology, and motivation behind the research.
That healthy skepticism tends to break down when a survey is internal: polls of customers and prospects used to explore their needs, possible features, or shifts in strategy. Often these surveys are fielded specifically to support a favored business case within the company.
At issue is an outsized respect for numbers, combined with an inability to see anecdotes for what they are. Here’s how it often goes down:
A survey goes out. Response isn't great, but a hundred or so responses are collected. Whoever is responsible starts slicing up the sample by the people who matter: by role, company size, current vs. prospective customers, and the like. The small groups of respondents in each of these slices aren't large enough to be statistically valid, but once the figures are there in black and white they become impossible to discount. "Forty percent of our respondents want something" is powerful stuff... even if what we're really saying is, "five people want this."
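To see why those slices mislead, here's a rough sketch using the standard normal-approximation margin of error for a proportion (the numbers are illustrative, not from any survey in this story):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion
    (normal approximation; rough for very small n)."""
    return z * math.sqrt(p * (1 - p) / n)

# "Forty percent of respondents" sounds precise, but with only
# a dozen people in a slice the uncertainty swamps the finding.
for n in (12, 100, 1000):
    moe = margin_of_error(0.40, n)
    print(f"n={n:4d}: 40% +/- {moe:.0%}")
# n=  12: 40% +/- 28%
# n= 100: 40% +/- 10%
# n=1000: 40% +/- 3%
```

With twelve respondents, "40%" really means "somewhere between roughly 12% and 68%," which is no basis for a strategy shift.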
Then there’s the power of the anecdote, that story from customer service, a blog comment, or email that resonates. Anecdotes are best used to humanize a statistical reality. “Joyce lost her health care in 2007 when she contracted external lung syndrome…etc, etc.” The danger is when there’s no corroborating evidence to back up the anecdote. It’s a part of our psychology to latch onto stories, especially if they agree with our beliefs. So, one comment becomes the rallying cry, and its impact has more to do with the status of its champion than with any sort of formal inquiry.
I watched this happen years ago at a company that was debating the length of online articles. Two camps emerged (long vs. short) and a survey was fielded to readers. What emerged was a split picture. When asked specifically about length, most people found the articles too long. However, their answers to a question about overall value suggested that a high level of article detail was an important benefit and differentiator. In other words, articles were too long, unless the topic interested the reader. This is a more nuanced finding than "people like shorter articles," so it lost the political battle, at the expense of the readers and, ultimately, the company itself.
Data Trap #2: We don’t have enough conversions, so…
Sample size is a problem for marketers who test. They'd like to compare results based on a hard metric like conversion, cost of acquisition, or customer lifetime value, but there isn't enough volume. This is particularly common in B2B, where closed deals are rare and communication between sales and marketing is endangered.
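A back-of-the-envelope power calculation shows how fast conversion volume runs out. This is a standard two-proportion sample-size formula with illustrative numbers, not anything specific to the companies discussed here:

```python
import math

def required_n_per_arm(p_base, lift, z_alpha=1.96, z_power=0.84):
    """Rough per-variant sample size for detecting a relative lift
    in conversion rate (normal approximation, 95% confidence, 80% power)."""
    p_test = p_base * (1 + lift)
    p_bar = (p_base + p_test) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p_base * (1 - p_base)
                                       + p_test * (1 - p_test))) ** 2
    return math.ceil(numerator / (p_base - p_test) ** 2)

# Detecting a 20% relative lift on a 2% conversion rate takes
# tens of thousands of visitors per variant. Clicks are plentiful;
# conversions aren't, which is why testers reach for proxy metrics.
print(required_n_per_arm(0.02, 0.20))
```

For a B2B site measured in closed deals rather than visits, those numbers are simply unreachable, which is what pushes teams toward the proxy metrics described next.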
The solution for many is to look at more plentiful measures that they hope correlate with financial metrics. The most common choices tend to be search and email clicks. That’s a problem because very often the aspects of an ad or email that generate a click aren’t those that make a sale down the road. This leads to optimization efforts that actually depress conversion in favor of more page views, clicks, etc.
For some, the answer has been a middle ground. Some companies have found that signs of true engagement are a better indicator of future sales than traffic measures. Examples include time on site, repeat visits, white paper downloads, the reading of certain pages, widget interaction, email sign-ups, and podcast/webinar attendance.
The next post will cover three other potholes to avoid: our lust for quantity, disrespect for our own time, and belief in the omniscience of the voice of the customer.