In the past few months I've rounded up a number of case studies which show the benefit of A/B testing for web optimisation.

However it's been pointed out to me that I've neglected to mention how site owners should go about setting up an A/B test in the first place.

So I spoke to four optimisation experts to find out which tools they use on their own sites or when setting up tests for their clients.

Read on to discover their recommendations, or for more information check out our blog post that shows how small copywriting changes can lead to big increases in conversions.

Furthermore, for an in-depth look at this topic download the Econsultancy/RedEye Conversion Rate Optimization Report 2013.

The research looks at the types of conversion and measurement used, as well as the tools, strategies and processes employed for improving conversion rates.

1. Which software do you use most often for A/B testing and why?

Stuart Mcmillan, deputy head of ecommerce at Schuh

Google Analytics and Optimizely: GA for test planning and hypothesis generation, and Optimizely for running the tests. I probably don’t need to explain why we use GA for analytics. We chose Optimizely as a testing partner for a number of reasons:

  • The cost structure is transparent and compelling.
  • The user interface is well thought out, so it’s easy to operate. It also works by creating jQuery and allows you to see what it’s generating, so you can tweak the code as necessary.
  • They’re a great bunch, a pleasure to work with, and always happy to give support.
  • We use the Optimizely Platinum package, which gives us role-based permissions. This feature gives us the ability to allow pretty much anyone to create tests, but only more experienced testers can put the tests live. It’s my stated intention to democratise testing and have as many people as possible creating tests.

Finally, I should mention that one of the things we do as part of site optimisation is to use WhatUsersDo to test a page or journey and help come up with test hypotheses.

Alex Harris, optimisation expert and founder of AlexDesigns.com

The software usually comes down to the size of the company and its budget. For most clients we use Visual Website Optimizer and Optimizely. But many medium to large companies are using Hubspot and some use Marketo.

We also do a lot of A/B testing using email systems like MailChimp and iContact. For small lead generation and landing page projects we also use LeadPages and Unbounce.

Matt Lacey, head of optimisation at PRWD

We use Optimizely for pretty much all of our A/B testing. Having considered other testing tools, we found that Optimizely’s speed of deployment, flexibility, control and advanced feature set made it a clear choice for our business.

The tool itself is really intuitive to use. Getting simple experiments up and running is a quick process. As well as a solid basic feature set, there is a wide range of advanced configuration settings that allow you to carry out some really clever experiments.

My personal favourite feature is the ease with which we can push test data into Google Analytics and create custom segments for test participants to evaluate their behaviour in greater detail. 

Dan Barker, independent e-business consultant

I mostly use Optimizely, Google Content Experiments, tools that are built-in to the ecommerce platform, or whichever other tool the client happens to be using. In general, what you do is more important than the tool you use, but there are pros/cons of each:

  • Optimizely is very straightforward and easy to use, you get a 30-day free trial, it integrates with analytics packages very simply, and it's suitable for people to self-serve even if they have very little technical know-how.
  • Google Content Experiments is good largely for landing page optimisation. Google used to have a separate set of A/B (& multivariate) testing tools called Google Website Optimizer, but it was killed off and partially folded into Google Analytics, rebadged as 'Content Experiments'. Unfortunately, it was the worse of the two tools that was folded in, meaning that Content Experiments is only really appropriate either for very small sites, or for landing pages on large sites. You need a tiny, tiny bit of technical know-how to set it up, but it presents the results really nicely and lets you segment them by different audience types, different marketing channels, etc.
  • Top-secret Google Content Experiments Cheat. If you have a decent front-end developer, Google's Content Experiments does have a neat extra feature that makes it much more flexible: the Content Experiments API. There's some decent info on that here and here.

2. If you had zero budget for A/B testing an ecommerce site what option would you recommend? 

Stuart Mcmillan, Schuh 

Optimizely’s basic package is inexpensive and it also offers a free trial. I’d say run a trial; hopefully you’ll be able to get a result that can demonstrate an ROI for the basic package cost.

Google Content Experiments is also a great free product and it’s incredibly powerful. 

Alex Harris, Alex Designs LLC

I tend to recommend Visual Website Optimizer for any budget. I like that it's easy to use and that you can set up the pages to track revenue and other KPIs. Plus the heatmaps and flexibility of the product make it a winner. 

Matt Lacey, PRWD

One of the great advantages of improving onsite conversion is that it can reduce the acquisition cost of paid advertising per conversion, allowing you to reduce costs, or do more for the same budget.

Therefore, I would make the case to temporarily divert some paid search/advertising budget into on-site conversion optimisation, allowing you to demonstrate the value of testing and ultimately send traffic to a better converting site. 

Dan Barker

Here are a couple of options:

  • Use Google's Content Experiments.

It's free and, even if less flexible than other tools 'out of the box', you can still do loads with it, learn a lot, and - perhaps - build a case to get some budget for further testing.

  • Do more research.

User testing, surveys, analysis using your analytics tools. A/B testing helps you to understand which changes have had a positive/negative impact on your results, and therefore which to keep and which to ditch. Doing a decent amount of research, and analysing what's happening on your site/in your marketing properly can often reap similar rewards in understanding which changes will have positive impact.

  • Do some 'pseudo tests' instead.

Though people don't get excited or talk about it as much as A/B or multivariate testing, it's fairly easy to do tests without any technology. For example, let's say you're a retailer with 1,000 products. There are lots of things you could test related to those products: copy, imagery, pricing, etc. It's really easy to test lots of that with no need for technology.

For example, pick out your top 20 "most viewed" products, rewrite the copy for half of the top 20 with a particular hypothesis in mind (example: "Our product copy is really short - writing 500 words per product instead of 30 may increase sales"). Publish the new copy, and then compare any increase in view:buy ratio for those products.

Compare that increase against the other half of your top 20 products (your control group), and also compare the performance before and after the change for each of the groups. 

This is obviously nowhere near perfect, but if carried out carefully this kind of thing can give you the ability to test hypotheses without much cost or technology.
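Dan's pseudo-test can be sketched in a few lines. The numbers below are invented for illustration; the point is the shape of the comparison: measure the view:buy ratio before and after the change, and use the untouched half of the top 20 as a control for seasonality and other noise.

```python
def buy_rate(views, buys):
    """View-to-buy conversion ratio."""
    return buys / views if views else 0.0

def group_uplift(before, after):
    """Relative change in conversion between two (views, buys) totals."""
    rate_before = buy_rate(*before)
    rate_after = buy_rate(*after)
    return (rate_after - rate_before) / rate_before

# Hypothetical aggregated totals (views, buys) for each half of the top 20.
test_group = {"before": (10000, 300), "after": (10200, 368)}    # new copy
control_group = {"before": (9800, 290), "after": (10100, 302)}  # old copy

test_change = group_uplift(test_group["before"], test_group["after"])
control_change = group_uplift(control_group["before"], control_group["after"])

# The control group's change approximates seasonality and other noise;
# subtracting it gives a rough estimate of the copy's own effect.
print(f"test group change:     {test_change:+.1%}")
print(f"control group change:  {control_change:+.1%}")
print(f"estimated copy effect: {test_change - control_change:+.1%}")
```

With these made-up figures the test half improves far more than the control half, which is the signal you would be looking for before investing in proper testing technology.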

3. What is the most surprising outcome you've seen from an A/B test? 

Stuart Mcmillan, Schuh      

We carried out a test on our tablet site product page call-to-action, as we wanted to check the wording of our ‘check & reserve’ CTA. We found that the less ambiguous term ‘are these in my local store’ dramatically increased the number of people using the feature.

However, the really interesting part was that we actually got more people ‘adding to bag’ too. By clarifying one button, we made the other CTA more user-friendly by removing the ambiguity.

Alex Harris, Alex Designs LLC

The most surprising thing I have seen from doing A/B testing is that the client needs to consider seasonality and make changes quickly to run a lot of tests.

It is not helpful to just run one or two tests over a period of a few weeks. You will get the biggest lift from segmenting your traffic and doing many tests at once. The more tests you can run at once the quicker you can make changes and iterate using the feedback.

Once you have a working testing strategy you can then expand to testing AdWords, Facebook marketing and email sends. By creating a clear map of the winning (or losing) elements, you can then apply those same findings to your banner ads, emails and social media.
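Running many tests at once, as Alex suggests, requires splitting traffic consistently so a returning visitor always sees the same variation of each test. A common generic approach (not specific to any tool mentioned here; the function and test names are illustrative) is deterministic bucketing by hashing a stable user ID together with the test name:

```python
import hashlib

def assign_variant(user_id, test_name, variants=("control", "variant")):
    """Deterministically assign a user to a variant of a given test.

    Hashing user_id together with the test name gives each test an
    independent split, so the same user can land in different arms of
    different concurrent tests without any stored state.
    """
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same arm of a given test.
assert assign_variant("user-42", "usp_bar") == assign_variant("user-42", "usp_bar")
```

Because the assignment is a pure function of the ID and the test name, no cookie bookkeeping is needed beyond a stable visitor ID, and adding a new concurrent test reshuffles nothing in the existing ones.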

Matt Lacey, PRWD

In 2013 some of our most surprising results came as we tested the impact of Unique Selling Proposition (USP) bars, usually with three key messages below the primary navigation.

In one example we saw a lift of 4% in sales across the site for a large European retailer, simply by changing the messages in the USP bar. The key to this test was that the messages that won had come directly from the mouths of customers, from user research prior to developing the hypotheses for the test. 

Another of my favourite tests from 2013 was one we ran to find out which features of an educational subscription service were the most compelling to students, with the aim of increasing subscriptions.

We ran an experiment with five versions of a feature table where we simply changed which features were included in the Free or Premium service.

By implementing the winning version we lifted premium subscriptions from this page by 185%. Even more valuable than the lift in conversions, we learned a lot about the value of different features to users.

Dan Barker

Here are three general things that often come up:

  1. The most aesthetically pleasing version of something is often not the version that performs best in tests.
  2. Sometimes big changes have very little impact, particularly where visitors are highly motivated to buy or where the brand is very strong. User experience sometimes matters less than you'd expect.
  3. There are piles and piles written about A/B testing buttons and checkout processes, but quite often it's the bits before that which make the bigger difference (and that includes the marketing channels that brought traffic).
Published 13 January, 2014 by David Moth @ Econsultancy

David Moth is Editor and Head of Social at Econsultancy. You can follow him on Twitter or connect via Google+ and LinkedIn

Comments (6)

Matthew Lawson

I use Maxymiser as we found the reporting to be more robust than other providers and we were able to reconcile the numbers to our analytics provider. So test the testing solution.

You need confidence in the numbers and doing this reconciliation is important to have confidence in the conclusion. You'll be surprised how many people don't do this.
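The reconciliation Matthew describes can be automated with a simple tolerance check. This is a minimal sketch; the 5% tolerance is an arbitrary example, not an industry standard, and the counts are invented:

```python
def reconcile(tool_count, analytics_count, tolerance=0.05):
    """Return True when two traffic counts agree within the tolerance.

    Some discrepancy is normal (ad blockers, sampling, differing session
    definitions), but a large gap means one set of numbers can't be trusted.
    """
    if analytics_count == 0:
        return tool_count == 0
    discrepancy = abs(tool_count - analytics_count) / analytics_count
    return discrepancy <= tolerance

print(reconcile(9800, 10000))   # 2% gap  -> True, acceptable
print(reconcile(7500, 10000))   # 25% gap -> False, investigate
```

Running a check like this per test, before reading any conversion result, catches tracking problems early rather than after a conclusion has been drawn.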

Jonathan Welsh

I think the most effective ecommerce platform for small business is Magento. Magento has a free edition for small businesses and lots of free extensions for providing a great shopping experience. That said, "small business" is a very vague term: if we are talking about sole proprietorships - craftsmen, artists, artisan food makers - Magento is really too much to learn and handle. Beyond the platform, there are a few things an ecommerce website needs in order to be a success. Here are a few tips: http://www.cuecommerce.com/top-5-areas-to-test-for-ecommerce-sites

Tom Waterfall, Director of Optimisation Solutions at Webtrends

At Webtrends we’ve found that our more mature conversion rate optimisation customers require a solution that addresses the need for both simple and very advanced testing and personalisation projects.

One of the more surprising results I’ve seen recently from a simple multivariate test was run on the mobile site product pages of one of our retail customers. Among the many different variables/factors tested, the most influential element was reducing the font size of the price roughly five-fold. Surprisingly, this drove up mobile sales by a statistically significant 3%. Relative to desktop, it is still fairly early days for testing on mobile and tablet, and we often see some very surprising results on these devices.

In terms of a more advanced test, the most surprising result I’ve seen recently has got to be the result we saw when revealing ‘persuasion’ messaging on the vehicle results page of a car rental site e.g. “x visitors are currently looking at this pick-up station” and “this pick-up station was last booked y minutes ago”. These small pop-outs were tested in isolation and in combination – and both resulted in a significantly negative impact on bookings. No matter which way we sliced the data (device, source, vehicle type, and more), the impact was negative. There are plans to follow this up and augment the test in different ways to explore this behaviour further.

On a final note, I couldn’t agree more with Matt Lacey – USP testing is critical and the results can vary quite significantly depending on the position of the elements, messaging, in combination with the vertical.

Gabriel

We've seen this a lot:

"The most aesthetically pleasing version of something is often not the version that performs best in tests."

We have a client who is constantly trying to create a better-looking subscription form that beats the control version (which is not that aesthetically pleasing), but the control always wins without any problems.

We've written about the experiment here: http://www.activecampaign.com/blog/3-lessons-we-learned-from-our-clients-about-design-vs-content/

Prashant Chambakara

Good article. A/B testing, also known as split testing, is a method of website optimisation. It can be used when there's less traffic, whereas companies with huge traffic tend to prefer multivariate testing. I've written about my experience with A/B and multivariate testing here:
http://tech.pro/blog/1710/ab-and-multivariate-testing--a-quick-go-through

Christian Virtsetti

To optimise our commercial site I use Google Analytics and AdWords, since Analytics lacks some functions; I take those from Google AdWords instead - for example, traffic source.
