How do you know your website pages are performing as well as they could be if you're not testing them to make sure? You don’t, is the honest answer. 

You could be missing out on a lot of potential sales, shares, clicks, or whatever it is you’re trying to get out of your visitors.

But don’t despair, because it really isn’t a very complicated process to test and tweak your pages to get the most out of them. 

The simplest way to achieve this is through A/B testing, otherwise known as split testing, and that’s what I’ll be covering in this post. 

What is A/B testing?

In the simplest terms: You take a page on your site that you want to improve (version A) and create a tweaked version of it (version B). You then run both pages simultaneously and see which one performs better. 

The idea is that half of visitors will be shown version A – otherwise known as the control page – and half will be shown version B, the variation page. 

At the end of the test period you can look at the results for both pages and see which one performed better. 

If the tweaks yielded the desired result then you can take down the original page and make the updated one a permanent fixture. 
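Under the hood, that 50/50 split is usually done by bucketing each visitor deterministically, so a returning visitor always sees the same version. Here's a minimal sketch in Python of how such an assignment might work (the experiment name and the exact 50/50 split are illustrative assumptions, not how any particular tool implements it):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "product-page-test") -> str:
    """Deterministically bucket a visitor into A (control) or B (variation).

    Hashing the user ID (rather than picking at random on every visit)
    means a returning visitor always sees the same version, which keeps
    the test results clean.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "A" if bucket < 50 else "B"  # 50/50 split
```

Because the bucket depends on both the user ID and the experiment name, running a second experiment reshuffles visitors rather than showing the same people the variation every time.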

In my recent post about A Hume I provided a few different examples of A/B testing the clothing retailer had carried out, including updating one of its footwear product pages with new copy and images. 

Below are the before and after pictures for that experiment:

The original page (A)

[Image: the original A Hume product page]

The tweaked page (B)

[Image: the tweaked A Hume product page]

As you can see, A Hume made some fairly simple changes to the product page, but the results were significant. 

This particular A/B test resulted in almost a 15% increase in add-to-basket levels for the modified page.

Why do you need it?

There’s an old saying, "Art is never finished, only abandoned."

The idea is that you could go on tinkering with a book or a painting or a song indefinitely, making slight improvements but never really ‘finishing’ it because there’s always something else that could be done. 

This analogy works for websites, too, except you don’t have to ‘abandon’ them like you do with an art project. Unlike a novel, you can easily make changes to a web page once it has been published. In fact, you should make a point of it. 

Why? Because if you’re not testing your pages you can’t be sure they’re performing their best. And if your website pages aren’t performing their best then you might not be making as much money as you could be. 

Ultimately A/B testing can lead to increased revenue, whether that’s directly through sales or by increasing customer engagement or content shares. 

And given that it’s relatively cheap and easy to administer, why wouldn’t you do it? 

Things to bear in mind...

Hopefully I have you on board now when it comes to the merits of A/B testing, so let’s take a look at some of the key things to bear in mind when carrying out your own experiments. 

Know exactly what you’re trying to measure

Having vague intentions of making your site work better is a great start. Hey, at least you’re actually thinking about testing and improving it, which is more than you can say for a lot of brands. 

But before you get down to the nitty-gritty you need a solid plan. 

Pick a specific page, decide what you want to improve, and then based on that you can work out what you’re going to measure to see whether page B is an improvement or not.  

A typical example might be a product page with a low conversion rate, with one obvious thing to measure being the number of people who add the product to their basket when they visit the page. 

The important thing is to make sure you know exactly what success looks like before starting any tests. 

Start with one variable

Keep things as simple as possible to begin with by only tweaking one element of the page, such as the copy or the imagery or the call to action button. 

If the initial test works then you can experiment with further tweaks, but always start off simple or you’ll lose sight of which changes are actually having an impact. 

In the example from A Hume I mentioned above, the brand first tested out the updated copy, saw a positive result, and then moved on to testing new images and saw an even bigger change. 

Testing in this staggered way means you have greater control over the results. 

Don’t drag out tests for too long

Lenny Kravitz may well believe "it ain’t over ‘til it’s over", but if he were a digital marketer he might have added the following line to that chorus:

…provided you don’t go on for more than, say, two months, because then you might begin to skew the results of your A/B test and in any case shouldn't you really be getting on with other things by now?

I mean, clearly there’s a reason I’m not a songwriter, but you get the point: any longer than a couple of months and your results could start losing their validity. 

Two months is a long time in the fast-moving digital world, particularly with monthly fluctuations in traffic and so on, so too many variables start coming into play after a while. 

That said, you’ll want to run the test for at least a week or two to get any kind of significant result. 

Never stop testing

Back to the ‘art is never finished…’ analogy: a website is never ‘complete’. You got a positive result after making some changes? Great. Now test something else. Test the same page again, even. 

Always be tweaking and tinkering with your site to make sure you’re getting the absolute best performance from it. Even if pages are performing well, you might be able to get even more out of them with just a few simple changes. 

Think of your website as an infinite work-in-progress and never pass up an opportunity to improve it. 

Tools that can help

All of these tools are either free or very reasonably priced, so even small businesses should be able to use them. 

Google Analytics

It goes without saying (but I said it anyway) that Google Analytics is your friend when it comes to A/B testing, particularly in the planning stages when you’re trying to work out which pages you want to target. 

Optimizely

You don’t need to be a techie to use this tool. It easily integrates with most analytics packages and it’s extremely intuitive so you’ll be able to get your experiments running in no time. 

The tool makes it really easy to tweak your pages without having to always rely on developers, as you can see from the screenshot below. 

[Image: the Optimizely web page editor interface]

There is a free starter plan with limited features, but the premium package is relatively inexpensive. And it’s always easier to ask for budget if you can show some kind of return on the free version first.   

Visual Website Optimizer 

Visual Website Optimizer (VWO) enables you to tweak, optimise and personalise your website, again without the need to rely on tech-savvy developers. 

There’s a free trial, but with plans starting from $49 per month this platform should be well within most people’s price range. 

This site also has a number of great free tools you can use without an account, including its Landing Page Analyzer that rates your page based on five different parameters. 

There’s also a tool called the A/B Testing Duration Calculator. You can probably guess what this is for already, but essentially it helps you decide how long to run a split test for in order to get the most significant result. 
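For the curious, the maths behind duration calculators like this one boils down to a standard sample-size formula for comparing two conversion rates. Here's a rough Python sketch (the 80% power and 5% significance defaults are common statistical conventions, not VWO's documented internals):

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline: float, mde: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Rough visitors needed per variant to detect a relative lift `mde`
    over a `baseline` conversion rate, using a two-sided test.
    """
    p1 = baseline
    p2 = baseline * (1 + mde)          # conversion rate if the lift is real
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n = ((z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2) / (p2 - p1) ** 2
    return ceil(n)

def days_to_run(daily_visitors: int, baseline: float, mde: float) -> int:
    """Split the total required sample across the two variants."""
    total = 2 * sample_size_per_variant(baseline, mde)
    return ceil(total / daily_visitors)
```

For example, detecting a 20% relative lift on a 5% baseline conversion rate needs roughly 8,000 visitors per variant, which is why low-traffic pages can take weeks to test.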

[Image: the VWO A/B Testing Duration Calculator]

Then there’s the A/B Testing Significance Calculator, which tells you whether the results of your split test are worth having a celebratory drink over. 
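If you want to sanity-check a result yourself, the textbook approach behind calculators like this is a two-proportion z-test. A minimal sketch (this is the standard statistical test, not necessarily VWO's exact method):

```python
from statistics import NormalDist

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value from a two-proportion z-test.

    Returns the probability of seeing a difference at least this big
    if A and B actually convert at the same underlying rate. A value
    below 0.05 is the usual bar for calling the result significant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # rate if A and B are identical
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))
```

So if version A converted 100 of 1,000 visitors and version B converted 150 of 1,000, the p-value comes out well under 0.05 and the celebratory drink is justified.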

[Image: the VWO A/B Testing Significance Calculator]

Conclusion: get addicted to testing

I can’t stress this enough: you can read all the so-called expert opinion about digital marketing best-practice you want, but the only real way to find out what works for your site and your customers is to keep on testing and optimising. 

With the tools I mentioned above, A/B testing costs relatively little time or money, yet the impact on your revenue line over time could be extremely positive. 

Get addicted to testing your site and always believe there is something that could be improved (because there definitely always is). 

If there’s anything I’ve missed in this guide, or if there’s a particular A/B testing tool you feel deserves a mention, please let me know in the comments below. 


Published 26 November, 2015 by Jack Simpson

Jack Simpson is a Writer at Econsultancy.


Comments (1)

Keith Freeman, Technical Lead: Frontend & Mobile at Venda

Nice summary. It's worth pointing out that with any 3rd party script you should consider the performance (site speed) impact it's having. And by that I mean the user's perceived load time (when content appears), not just the total page load. You don't want to end up lowering conversion due to site speed while trying to improve it through interface changes.

- Check what happens when the service is down. In many cases these scripts are synchronous but will time out. You can test using Chrome extensions like SPOF-O-Matic, which lets you simulate a 3rd party script going down.

- Avoid testing during peak trading periods; why add the risk?

- Make sure the impact of the script fits within your performance budget (the metrics you use to keep the site fast).

Site performance should be part of every conversation when adding/removing features and scripts.
