
Here, we speak to Lovefilm's digital and usability product manager Craig Sullivan about the work the DVD rental company has been doing in multi-variate testing (MVT).

Craig is a bit of an evangelist for MVT - a hot topic right now - and has some great insights into when and how best to use it, how it differs from A/B testing and how to interpret results.

-----------------------

So how long has Lovefilm been testing for?

We've been A/B testing all sorts of stuff for a few years now, but we recently found a really good multi-variate testing solution from Optimost (www.optimost.com) that has been getting us good results. We're trying to do loads more tests because the gains are so fundamental to our bottom line. It's also useful for us to assess the impact of changes we sometimes make, rather than trying to guess the outcome.

-----------------------

What benefits does A/B testing get you?

Well, A/B testing is what most people start with and it’s great for testing big, simple, different things; usually versions of a page or an element on a page.

The upside of A/B is that the tests aren't complicated to build and run; it is really good for redesigns, say a new home page or checkout process; and you can get results fairly quickly if you have enough site traffic. When showing results, people grasp the concept readily.

The downside is that everyone has variations in traffic. The time of day, day of the week, seasonality, different promotions, TV adverts and search traffic are all factors in your traffic mix. Running A/B tests at two different times may get completely different results: one week ‘version A’ wins and the next week ‘version B’ wins, so you need to be aware of the traffic composition, which can cause problems getting the right result.

Also, A/B tests usually test only a few variables at a time, so your test throughput is limited. Another issue is that when you get the result, how do you know which elements worked?

A whizzy new widget on the homepage might increase conversion by 10%, but the copy text you used might drop it by 5%. You might think you are winning because of the net 5% lift, but you can’t see how the individual components are contributing. The final downside is that you can’t always work out which elements, say in a new design, are the ones driving a change in your KPIs.
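To make that arithmetic concrete, here is a minimal sketch with invented numbers (not Lovefilm's figures), treating the two effects as additive for simplicity:

```python
# Toy numbers only: a baseline conversion rate plus per-element effects,
# treated as additive for simplicity.
baseline    = 0.040   # 4% conversion on the current page
widget_lift = +0.10   # the new widget alone would lift conversion by 10%
copy_lift   = -0.05   # the new copy alone would drop it by 5%

# An A/B test bundles both changes into one challenger page, so you only
# observe the blended result, never the individual contributions.
challenger    = baseline * (1 + widget_lift + copy_lift)
observed_lift = challenger / baseline - 1
print(f"observed lift: {observed_lift:.0%}")  # +5%, a single blended number
```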

-----------------------

So how does MVT help?

You can test large numbers of variables at a time; potentially millions of them. If you imagine breaking a new homepage design into all its elements - the copy, buttons, graphics, products, widgets - then MVT allows you to try lots of potential replacements for each of these elements.

You could try six different button designs, 10 bits of copy text, five product pictures and four different layouts. MVT lets the best combination of these elements bubble to the top of your results. You can also design tests across several pages, which lets you obtain serious improvements to signups, sales, checkouts etc.
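To get a sense of the scale, here is a minimal Python sketch of the full-factorial arithmetic; only the element counts come from the example above, and the variant names are made up:

```python
from itertools import product

# Hypothetical variants for each page element, using the counts above.
buttons  = [f"button_{i}"  for i in range(6)]   # 6 button designs
copy     = [f"copy_{i}"    for i in range(10)]  # 10 bits of copy text
pictures = [f"picture_{i}" for i in range(5)]   # 5 product pictures
layouts  = [f"layout_{i}"  for i in range(4)]   # 4 layouts

# A full-factorial MVT covers every combination of these elements.
recipes = list(product(buttons, copy, pictures, layouts))
print(len(recipes))  # 6 * 10 * 5 * 4 = 1200 candidate page designs
```

In practice MVT tools typically use fractional designs rather than serving every one of those 1,200 recipes, but the arithmetic shows how quickly even a modest test spans a large design space.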

The thing I love about this kind of testing is that you can see how weak or strong the interactions between elements are - this builds up a really useful knowledge base of the interactions that resonate with your visitors or customers.

-----------------------

What benefits are you getting from MVT?

There are obvious financial benefits - because of our fast growth as a business, we drive a lot of traffic to our site. We recently did a test with 192 elements and found two major changes that increased conversion by 10%. This gets us substantial additional revenue each year.

The other brilliant side benefit is that time spent endlessly tweaking designs or choosing 'the best design' is now reduced. I spend less time in long pointless meetings arguing about whether we should use dropdowns or whether the button should have shadows or what colour an element should be - our customers let us put all the ideas into a bucket and see what things work. It really frees up more time internally to focus on what goes into a test, rather than coming up with one single idea.

Optimost has been really great for us because the overhead for our developers in setting up a test is minimal. They help us prepare and run tests and have dedicated staff for each stage of the process - account manager, analysts, QA testers and web developers. It means that I can work on the test designs and results without worrying about our ability to execute them. We used to do far fewer tests when we carried this overhead ourselves, and a higher test throughput gets us better results, more iterations and more money!

-----------------------

How did you sell MVT internally, and what sort of budget did you need to allocate to it?

The first thing that we did was to identify a page where many business units would gain from optimisation. We then contacted Optimost and asked to conduct a pilot test.

This allowed us to see how the process worked, what internal resource was required and how the sales pitch translated to actual operational experience for all the people involved.

In terms of budget, we knew what the monthly costs would be and so picked a piece of nice, yummy, low-hanging fruit to test with. Our ROI exceeded the costs by many multiples.

My advice to people is to ask vendors for a free test (or conduct your own, using tools like Google Website Optimizer) and make sure you evaluate all aspects of the supplier, the tool, and the process and methodologies used for the entire lifecycle of testing.

-----------------------

How do you prepare for these kinds of tests?

We have a lot of inputs to the test. These are things like web analytics data, usability testing, customer surveys and a whole heap of competitive and analytical data from various sources.

We also have great people here in the company bursting with great ideas for these tests... It really is nice to have space for everyone’s ideas now, rather than relying on what Avinash Kaushik calls the HiPPO (Highest Paid Person’s Opinion). This shifts the power over new ideas back to our customers - they show us what works best for us - and everyone is happy.

We are also learning to be less conservative when testing, trying out very different things or radical new elements. MVT frees us up not only to try lots of things but also to try wide variations in the creative, sometimes with surprising results.

The concept of 'no such thing as a dumb idea' comes from doing these tests and it is hard to predict the outcomes. I estimate that I probably guess the results wrongly over 80% of the time - I am not the customer here!

-----------------------

So how do you run tests on your site?

We have a complete brainstorm of all the ideas and select the candidates. We then issue a proposal to Optimost who then formally respond. After we approve, the test is built (very quickly), QA tested and launched on the site.

Optimost have a great methodology and work ethic: they monitor tests very closely, suggesting where we might drop things that aren't working, adding new elements or changing things based on early-stage feedback. If you are testing 2m page elements, you need help whittling down the poorly performing creatives, and this is where having dedicated analysts cuts the time on tests. For each item dropped in a 2m variable test, you halve the size of the test and the time needed to get statistically significant results.
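As a rough illustration of why pruning matters, here is a toy Python sketch with invented element counts (not from an actual Lovefilm test); dropping one of two variants of an element halves the combination space, and with it the traffic needed for significance:

```python
from math import prod

# Invented example: how many variants each page element has in a test.
variants = {"headline": 4, "hero_image": 5, "button": 2, "offer_copy": 3}

def combination_count(variants):
    """Total page recipes in a full-factorial test."""
    return prod(variants.values())

print(combination_count(variants))  # 4 * 5 * 2 * 3 = 120 recipes

# Analysts spot that the second button design is underperforming and drop it.
variants["button"] = 1
print(combination_count(variants))  # 60 recipes - the space is halved
```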

After the results are in, we present the feedback to the business owners and feed this into the next iteration. It will take a long time before we think we'll even be close to optimal in key areas on our site and anyway, customers and our offers change constantly so this kind of testing becomes something you just do forever.

-----------------------

How long do you need to run these tests for?

This is down to your traffic figures, the mixture of promotions you are running and any patterns or seasonality on your site. We know that weekend traffic is very different for customer acquisition work than weekdays so our tests always cover at least one full week. For your site, the mix may be completely different. If your traffic is low then it will take a long time to run a large test so having someone to help whittle the test size down is extremely important.
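For a rough feel of how traffic translates into test length, the sketch below uses a standard two-proportion sample-size approximation (95% confidence, 80% power) with invented numbers; it is not Lovefilm's or Optimost's actual methodology:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect a change in
    conversion rate from p1 to p2 (two-sided z-test approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta  = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Invented numbers: 3% baseline conversion, hoping to detect a 10% relative lift.
n = sample_size_per_variant(0.03, 0.033)
print(n)  # roughly 53,000 visitors per variant

visitors_per_variant_per_day = 5_000  # also invented
print(ceil(n / visitors_per_variant_per_day), "days at that traffic level")  # ~11 days
```

With lower traffic or a smaller expected lift, the required run time stretches quickly, which is why whittling the test down matters so much.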

-----------------------

So where do you go from here?

Keep plugging away and evangelising about this form of testing. I'm working hard with people to get them into a mindset of variables rather than page designs - to get them to think about elements and their interactions rather than the page as a big lump of stuff.

We are also planning to do experience testing using MVT, where we optimise and present results depending on the visitor. For example, we may find that people who want to rent console games from us respond to a different customer experience than someone only interested in DVD rental.

We’re going to have a site that can dynamically serve the best designs tailored to the visit or promotional source, which is a pretty exciting area for me.

We’re also discussing running MVT on emails with Optimost, something that will help us make our marketing acquisition and retention spending work more efficiently.

-----------------------

What are the key lessons you've learnt?

MVT is easy and everyone should be doing it. There are solutions ranging from completely free up to thousands of pounds, so there is plenty of choice in the market.

The biggest things for me personally are how it has radically changed our design and approval process and how it has removed the guesswork from site changes. We shouldn't be trying to 'guess' customer behaviour; we just shrug now, accept that fact and get on with letting the customers settle it for us.

-----------------------

Published 24 June, 2008 by Richard Maven
