The difference between success and failure is often in the details.
This is why the virtues of testing and optimizing are continually extolled on platforms which claim to promote best practice.
Northern & Shell is one of the UK’s biggest publishers, owning titles including the Daily Express, Daily Star, OK!, New! and Star.
Until May this year it also operated three TV stations: Channel 5, 5* and 5USA.
So at a time when publishers are struggling to adapt to the new digital world, it’s worth taking note of the way in which N&S is attempting to monetise the massive amount of user data it collects.
Stats the way I like it.
There’s been a shift in command here at stats HQ, which has left this guy (lifts hand from keyboard briefly to point at self) in charge.
So from now on you’ll be seeing a few changes around here. Number one: more terrible puns.
And that’s about it. Wouldn’t want to rock the boat too much. Now on with the statalaunch!
Creating and promoting an infographic is a tried-and-tested technique, undertaken for all sorts of reasons.
But successfully getting visual content out there, online, can be more complicated than it seems.
Today sees the release of Econsultancy’s sixth Conversion Rate Optimization Report, in association with RedEye.
The report looks at the types of conversion strategies and tactics organizations are using, in addition to the tools and processes employed for improving conversion rates.
It is based on an online survey of over 1,100 client-side and supply-side digital marketers and ecommerce professionals, the highest number of respondents in the survey’s history.
Here are four findings from the report...
One of my favourite talks from last week’s Festival of Marketing was by David McCandless from Information is Beautiful.
McCandless is an independent data journalist and information designer. His passion is visualising information. Wait, come back!
Communicating data in its raw form can be incredibly difficult to do and the results are often not worth the effort. Graphs and charts are boring and don’t necessarily convey their intended insight.
McCandless and his team have a mission to distil the world’s data, information and knowledge into beautiful, interesting and above all, useful visualisations, infographics and diagrams.
A/B testing seems simple: you pit page 'A' against page 'B' and see which one performs better. But figuring out whether your results actually mean anything is quite complicated.
Luckily, great minds have been working on this problem for a long time and have developed data science techniques to help.
But to benefit from their work, marketers have to understand the problems and know where to find the solutions.
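To give a flavour of the kind of technique involved, here is a minimal sketch of one standard approach: a two-proportion z-test that asks whether B's conversion rate differs significantly from A's. The function name and the example numbers are illustrative, not taken from any report.

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate differ
    significantly from variant A's? Returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # erfc(|z| / sqrt(2)) gives the two-sided tail probability
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# 2.0% vs 2.4% conversion on 10,000 visitors each
z, p = ab_test_significance(200, 10000, 240, 10000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Note that even a seemingly healthy 20% relative lift here comes out just above the conventional p < 0.05 threshold, which is exactly the sort of trap the "great minds" have worked out how to avoid.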
Achieving a single customer view is supposedly the ultimate goal for digital marketers, but is it really feasible?
Building this complete, joined-up view of the customer is a costly and time-consuming project, and while some have made decent progress, others still have a mountain to climb.
A new report published today by Econsultancy and Innometrics investigates the current landscape for businesses seeking to obtain that single customer view, and looks at how brands are approaching the gap between digital and retail, among other challenges.
The Single Customer View report features in-depth opinions from senior-level executives working within ecommerce, online and marketing departments, from companies including Mothercare, Camelot, myHermes, EE, Clarins, Rank, Occam and Seren.
Here’s a brief summary of three topics investigated in the report...
In my previous posts about A/B testing, I made the case that you need to consider the math behind A/B testing, or risk having invalid, or even wrong, results.
My first suggestion is to use sample sizing, but that requires a lot of tests.
Here's how to do something similar without nearly as many.
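For readers who want to see what "sample sizing" means in practice, here is a rough sketch of the standard power calculation for a two-variant test: how many visitors each variant needs before you can reliably detect a given lift. It is an illustration of the general technique, not the specific method from these posts; 1.96 and 0.8416 are the usual normal quantiles for 95% confidence and 80% power.

```python
import math

def sample_size_per_variant(baseline, mde, z_alpha=1.96, z_power=0.8416):
    """Approximate visitors needed per variant to detect an absolute
    lift of `mde` over a `baseline` conversion rate, at 95% confidence
    and 80% power (standard two-sided power calculation)."""
    p1, p2 = baseline, baseline + mde
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / mde ** 2)

# e.g. to detect a lift from 2.0% to 2.5% conversion:
print(sample_size_per_variant(0.02, 0.005))
```

The numbers get big fast: detecting that half-a-percentage-point lift needs roughly 14,000 visitors per variant, which is why sample sizing demands so much from a test before it can start.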
Almost two decades ago, Jeffrey and I started evangelizing the notion that your conversion rate is a measure of your ability to persuade visitors to take the action you want them to take.
Good companies know how to persuade visitors, but legendary companies better understand their visitors and their desires, and do more than simply satisfy those desires.
Great companies find ways to delight them along their journey. This is sometimes labeled as 'flow' in the UX world.
In other words, conversion rate optimization is a critical discipline, but by itself, will it be able to transform a good company into a legendary one?
Everybody talks about the need to provide quality content on your site if you want to rank well in searches. But how do search engines identify quality content?
Successive Google algorithm updates (culminating in the recent Panda 4.1) aim to refine results so that they match the intent of the search query and deliver the most comprehensive, accessible and well-written answer.
A/B testing is now an integral part of digital marketing.
But the tests can produce the wrong results if they are not conducted correctly. Here is part one of a three-part series about how you can use data science techniques to avoid making big mistakes with your A/B tests.