Companies are collecting more data than ever about how their users interact with their websites, and thanks to sophisticated yet easy-to-use tools, techniques like A/B testing are accessible to even the smallest of businesses.
But when it comes to creating great user experiences, are companies being blinded by data?
The difference between success and failure is often in the details.
This is why the virtues of testing and optimizing are continually extolled on platforms which claim to promote best practice.
Though A/B testing seems simple in that you pit page 'A' against page 'B' and see which one performs better, figuring out whether your results actually mean anything is quite complicated.
Luckily, great minds have been working on this problem for a long time and have developed data science techniques to help.
But to benefit from their work, marketers have to understand the problems and know where to find the solutions.
In my previous posts, I made the case that you need to understand the math behind A/B testing, or risk invalid or even misleading results.
My first suggestion is to use sample sizing, but that requires a lot of tests.
Here's how to do something similar without nearly as many.
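Sample sizing means working out, before the test starts, how many visitors each variant needs. A minimal sketch of that calculation, assuming a fixed two-sided 5% significance level and 80% power (the function name and the figures in the usage line are illustrative, not taken from the article):

```python
from math import ceil

def sample_size_per_variant(baseline, lift):
    """Approximate visitors needed per variant for a fixed-horizon A/B test.

    baseline: current conversion rate (e.g. 0.05 for 5%)
    lift: relative improvement you want to be able to detect (e.g. 0.20 for +20%)
    Assumes a two-sided test at alpha=0.05 with 80% power.
    """
    z_alpha, z_beta = 1.96, 0.84  # z-scores for alpha=0.05 (two-sided) and power=0.80
    p1 = baseline
    p2 = baseline * (1 + lift)
    p_bar = (p1 + p2) / 2  # average rate used for the variance estimate
    n = ((z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)) / (p2 - p1) ** 2
    return ceil(n)

n = sample_size_per_variant(0.05, 0.20)  # detect a 20% relative lift on a 5% base rate
```

With these illustrative numbers the answer is a little over 8,000 visitors per variant, which shows why fixed sample sizing is demanding for lower-traffic sites.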
Having recently published an article about why email isn’t dead, I thought it would be useful to round up some case studies to help marketers inject some life into their own campaigns.
Hopefully they will provide some inspiration for marketers who are in the process of testing their own email messages.
The Student Room Group is one of the UK's largest sites for students, and has experienced plenty of success with a continuous CRO (conversion rate optimisation) strategy.
I spoke to director of optimisation Pete Taylor to find out more about the company's conversion optimisation strategies, what's proved to be effective, and which tools are the most useful...
For the first time in four years, satisfaction with conversion rates has increased.
90% of companies now claim that CRO has increased in importance, with 59% claiming it’s crucial to their marketing strategy.
The aim of this report is to provide data and a framework to help companies invest their time and resources as effectively as possible, by examining which methods and processes are most likely to yield results.
For a brief overview, RedEye has produced this infographic…
Over a quarter (28%) of companies are satisfied with their conversion rates (either 'very' or 'quite' satisfied), up six percentage points since 2012 and the highest level since 2009.
Additionally, around three-quarters (73%, up from 65% in 2012) indicate they have seen an improvement in conversion rates in the last 12 months.
The fifth annual Conversion Rate Optimization Report, produced in association with RedEye, also found that the proportion of organisations who say they experienced an increase in sales conversion rates has significantly gone up, from 60% in 2012 to 70% this year.
The research, based on a survey of almost 1,000 client-side and agency digital marketers, revealed that A/B and multivariate testing, using multiple methods to improve conversion and having a structured approach are among the seven factors most correlated with improved conversion and sales...
Unless you’re a few years behind the times, you will be aware that Obama’s re-election campaign was a success.
But what is less well known is the detail of the testing process behind the email strategy that helped to raise more than $500m in online donations.
At Searchlove this morning Obama's director of digital analytics Amelia Showalter gave an insight into the A/B tests that optimised the campaign's fundraising emails and the lessons that the digital team learned as a result.
Showalter said that in a tightly fought election Obama’s campaign team knew they would have to top the $750m raised in 2008.
Providing tailored product recommendations is a proven way of boosting online sales, with two-thirds of companies (66%) stating that personalisation improves both customer experience and business performance.
Speaking at a Screen Pages ecommerce event recently, Emailvision personalisation director Neil Hamilton ran through some best practice tips for how to create effective homepage product recommendation banners.
The effectiveness of these blocks can be improved using personalisation, whereby the products shown are specifically tailored to the customer based on their past on-site behaviour.
Our new Realities of Personalisation Report, published in association with Monetate, found that just 30% of businesses currently personalise their websites based on a visitor’s previous behaviour, so a majority of businesses are yet to implement the technology.
A/B/n and multivariate testing are among the most important CRO (conversion rate optimisation) activities for continually improving your website, yet some find them difficult to get started with.
In this post I’ll share three questions we hear time and time again from our clients when they are just starting out with A/B and multivariate testing.
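One question that reliably comes up when starting out is how to tell whether the difference between two variants is statistically meaningful. A minimal sketch using a two-proportion z-test (the function name and the conversion figures are illustrative, not from the article):

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test.

    conv_a, conv_b: number of conversions in each variant
    n_a, n_b: number of visitors shown each variant
    Returns the z statistic and the two-sided p-value.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, built from erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = ab_test_z(200, 4000, 250, 4000)  # 5.0% vs 6.25% conversion
```

With these illustrative figures the p-value comes out below 0.05, so the difference would count as significant at the conventional 5% level. Note this assumes you fixed the sample size in advance rather than peeking at results as they arrive.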
Are your landing page or product page images big enough to achieve the best conversion rate you can? We’ve seen a wide variety of marketers testing image size these days, including B2B, ecommerce and media sites.
I’m not talking about allowing your visitors to click to enlarge images. I’m talking about blowing up the size of your hero shot (the most important image on your page) so it’s much, much bigger.
Here are three examples from very different marketers to inspire you.
Be sure to share them with your design and testing team.