
Thanks to the 'trackability' of digital media and the rise of Big Data, more and more companies are hoping that decisions they once made on gut instinct or educated guesstimates can and will be made on hard data.

Which, in theory, is a good thing: data-driven decisions should enable businesses to understand the dynamics in their market and use that knowledge to better serve their customers.

No metrics, 'mo problems

The data behind data-driven decisions, of course, is of limited use in its raw form. You'll probably never find a company where an executive made a decision because he personally sifted through 5m tweets.

Instead, we define metrics and build software that turns data into numbers that humans can easily digest. Unfortunately, given the huge volumes of data businesses increasingly have access to, one of the biggest challenges companies face today is choosing the metrics worth adopting. In many cases, the ideal metrics simply don't exist. In the social media realm, for instance, establishing metrics has been a real source of frustration. Complicating matters: companies will in many cases have to wait for the owners of first-party data, like Facebook, to make their dream metrics a reality.

The math behind the metrics

The fact that there is a real desire for better metrics in most channels and markets doesn't mean that half-decent metrics don't exist. They do, and as imperfect as many of them are, the good news is that companies are using them to build businesses capable of making more informed decisions.

There's bad news, however: the math behind the metrics is often flawed. What's worse: this is true even for common metrics that are seen as being as basic as they come.

Take, for instance, customer lifetime value (CLV). This is a common metric that can be, and frequently is, used in a number of markets, but regardless of the market you look at, you're bound to encounter companies calculating customer lifetime value in the most flawed way possible: taking total revenue and dividing it by the total number of customers.
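To make the flaw concrete, here's a hypothetical sketch in Python (the figures are invented for illustration) of the "total revenue divided by total customers" calculation and how it misleads:

```python
# Hypothetical illustration of the flawed CLV calculation described above:
# total revenue divided by total number of customers. This measures revenue
# per customer *to date*, not lifetime value, so a wave of new sign-ups
# drags the figure down regardless of what customers are actually worth.

def naive_clv(total_revenue: float, total_customers: int) -> float:
    """Revenue per customer to date -- often mislabelled 'lifetime value'."""
    return total_revenue / total_customers

# 100 mature customers worth $1,000 each, plus 900 brand-new customers
# who have only spent $10 so far:
total_revenue = 100 * 1_000 + 900 * 10  # $109,000
print(naive_clv(total_revenue, 1_000))  # 109.0 -- nowhere near either group's real value
```

The single "average" hides the fact that the customer base is really two very different populations.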

Such a flawed approach isn't seen everywhere; fortunately, many companies are more sophisticated than that. But that doesn't mean the approaches they take are much better. A company might, for example, calculate average revenue per user (ARPU) on a monthly basis and multiply it over a desired period (e.g. 36 months) to estimate the value of a customer over that period. But if that company has a lot of new customers, for instance, the value will naturally be skewed and therefore of limited use.
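The ARPU projection just described can be sketched the same way (again, all figures are hypothetical):

```python
# Hypothetical illustration of projecting monthly ARPU over a fixed horizon.
# If this month's user base is swollen with new customers who have only had
# a partial month to spend, the projection is skewed downwards.

def projected_customer_value(monthly_revenue: float, active_users: int,
                             months: int = 36) -> float:
    """Average revenue per user this month, extrapolated over `months`."""
    arpu = monthly_revenue / active_users
    return arpu * months

# 500 established customers spending $50/month, plus 1,000 new users who
# spent only $5 in their first partial month:
revenue = 500 * 50 + 1_000 * 5                   # $30,000
print(projected_customer_value(revenue, 1_500))  # 720.0
# Established customers alone would project 50 * 36 = 1,800 -- the new
# cohort more than halves the estimate, which is why cohort-based
# calculations are generally preferred.
```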

Moving too fast

This raises the question: if lots of companies can't get metrics like CLV and ARPU right, what are the odds that they'll get more complicated metrics, both existing and not-yet-developed, right?

Arguably, part of the problem is that we place more value on the numbers than on the process by which those numbers are calculated. The math is not sexy; the data to which the math is applied and the numbers it produces are, so that's what we focus on.

But that's dangerous given the velocity at which data is being collected today and the credibility that businesses increasingly give to metrics in the decision-making process. After all, decisions based on numbers that don't mean anything, and that are the product of a flawed calculation, could very well prove to be worse than decisions made on instinct. So as companies seek the answer to the question "What should we do?" in data and the numbers that are generated by data, they might want to slow down and make sure that the math behind the numbers is really as solid as they've come to believe.


Published 11 February, 2013 by Patricio Robles

Patricio Robles is a tech reporter at Econsultancy. Follow him on Twitter.


Comments (13)


Malcolm Duckett, CEO at Magiq

Patricio, you are poking at a key issue here - too often we see people making decisions on the basis of seriously flawed numbers - or, worse still, no numbers at all!

But your suggestion that getting an accurate CLV is a problem is no longer true... cloud-based marketing automation solutions (like Magiq) will happily record every sale from every visitor and record CLV as part of an individual's profile - and then let you target them based on this (and a lot of other parameters)... and with prices from $800/month, price is no longer a barrier either.

BigData without the headaches....

about 4 years ago


Andy Headington, CEO at Adido Limited

Really interesting post Patricio, and one which I think needs more discussion these days. The way that metrics are calculated is key, and it is usually best done with plenty of time and experience by those involved, to make sure that the right things are being looked at in the right way.

The other issue to throw into the digital mix, especially, is the accuracy of even the raw numbers. Having used multiple analytics packages over the years, even comparing apples with apples is sometimes hard due to the never-ending list of technical considerations we have to deal with (e.g. browsers, devices, multiple logins, antivirus/malware programs, etc.).

There is a lot to be said for experience and gut instinct, even in the digital world.

about 4 years ago


Paul Mead, Founder & Managing Director at VCCP Media & VCCP Kin

You have touched on one of the biggest opportunities for marketers, in my view. The problem is - it's not sexy! A dashboard or a new tech solution often feels like the answer, but in many cases that is simply prettier window dressing. You can make bad data look amazing! The harder part is a) getting the measurement right (i.e. looking at what each action you measure is worth relatively, and how closely this aligns with how you actually make money as a business) and b) whether or not you can act on this data (i.e. if you're not going to do anything with it, what's the point?).

I wrote a post for Econsultancy on "Measuring Value" back in October http://goo.gl/lTrL9

about 4 years ago


Brewster Barclay, Consultant at B. F. Barclay & Associates

Patricio, I would like to agree with Annabel's second point, even the raw data is suspect because of the vagaries of tracking technologies which have not changed substantially in 15 years and which are made worse by the mobile web. 7 years ago we did the grunt work of comparing the request data for two different analytics tools with the Clickstream Technologies data, which was 99.9% accurate because of the technology used, and found differences of up to 10% between the two tools and with the "correct" data.

about 4 years ago


Malcolm Duckett, CEO at Magiq

Brewster, if you made it within 10% you were working with products at the better end of the market :-)

But to be fair, these days in-page capture can be 100% accurate - we proved this for a major windscreen replacement company last year by showing that the analytics and the back-end order processing matched 100% (every order).

So do not despair, and choose your technology well, because it has changed substantially in 15 years - some of our patents are only about four years old...

about 4 years ago


Stefan Elliott, Managing Director at Six Serving Men

I believe that the challenge here is not getting the data. Rather, it's translating the mass of data into the right context.
To rehash an overused marketing buzzword: is the maths relevant? The great thing about digital is that you can measure everything; the worst thing about digital is that you can measure everything!
- It's important that you have a degree of trust in the validity of the data, but it will never be 100%.
- CLV is a model, and like any model you use it will be wrong - it just depends how wrong! Models can and should be recalibrated as you develop your understanding. Accepting that you are in a Do, Review, Refine process means that you start from where you are and develop.
- Too often data is misrepresented as a "fact" or presented out of context. Many "stroke me" metrics are used to garnish a set position, rather than to explore or test a hypothesis. Andrew Lang's quote - "He uses statistics as a drunken man uses lamp posts - for support rather than for illumination" - is becoming ever more apt.

Outside the marketing department, it's increasingly important that marketers are seen to be providing maths, not myths, and analysis, then action.

about 4 years ago


Deri Jones, CEO at SciVisum Ltd

Great to raise awareness. Sometimes marketers' approach to numbers lacks what sixth formers are taught these days: 'critical thinking'!

What's that Dilbert cartoon? The one where the boss (unusually) asks where the numbers came from, and Dilbert responds: 'let's continue anyway as if that mattered'...

about 4 years ago


Alec Cochrane, Head of Optimisation at Blue Latitude

This is an interesting conversation, but it does overlook a point, I feel. The problem isn't the accuracy of the data, but the precision of it.

In a way, it doesn't matter if the number you are seeing is out by lots as long as when you do the measurement again you get the same figures. That way you can change things and see the impact.

By focussing on the accuracy of the data you can often lose sight of the reason that you are getting the data in the first place: insight into the user, and implications of recommendations on ways to optimise and improve. Instead, the person who goes with 'gut instinct' can often win out, because they discredit you for your lack of accuracy rather than you winning out on the strength of your precision.

Without wanting to promote my blog too much - here is something I wrote on it a couple of years ago:


about 4 years ago


Deri Jones, CEO at SciVisum Ltd

Alec, you're right that it's not wrong to have data you know is NOT accurate but that serves as a useful benchmark, because you are careful to measure it the same way each week.

But there is a huge risk: all too often that data becomes treated (perhaps elsewhere in your company) as true and accurate; and business decisions are made on it.

So it's vital to label, big and bold on the graph, that the data is not accurate.

I see this very often in website load testing. Because it's hard and time-consuming to create a load test that genuinely loads a site in the same way as real traffic at the latest peak, and because, due to project delays, the slot for load testing is often too short, the hard-pressed tech team will commonly run a load test that is full of unrealistic assumptions.

And then eCommerce Directors make decisions based on those numbers, that in reality mean nothing!

So this problem of data not being accurate or relevant is particularly rife in the join between IT and marketing teams.

about 4 years ago


Deri Jones, CEO at SciVisum Ltd

Out of interest: does anyone here ever see detailed load test numbers when load testing happens?

Or is a pass/fail the limit of the detail passed to marketing?

about 4 years ago


Malcolm Duckett, CEO at Magiq

...and then there is the recognition that the numbers are valueless unless you are able to use them to DO something...

about 4 years ago


Anna Lewis, Google Analytics Analyst at Koozai

This is a big issue that needs to be raised often. I wrote a bit of a slant on this over on SEO Chicks after I got frustrated with survey results on adverts not actually being decent data - http://www.seo-chicks.com/2766/beef-arsenic-and-web-analytics-getting-the-right-numbers.html

It's even more important when you're trying to use the data for business decisions. Whenever you're using numbers, they need to be correct, the right metrics for the situation, and based on a big enough sample to make decisions on.

about 4 years ago


