
Website optimisation teams are so much more effective when UX/Usability Consultants and AB/Multivariate Testing experts work closely together.  

We have seen first-hand the difference this makes to the conversion rate optimisation process.

Why is the relationship so important?

Few would dispute the importance of testing new website optimisation ideas (hypotheses) in a live environment with real users, using tools such as AB/Multivariate Testing software, before final implementation on the website.

But the optimisation ideas have to come from somewhere, and who is better qualified and has a greater understanding of user behaviour than the User Experience or Usability consultant?

Hundreds of hours spent watching and listening to users, sharing their pain and delight during website interaction naturally results in a good understanding of what makes users tick.

As optimising a website is all about influencing user behaviour in line with business objectives, the usability consultant's role in the orchestration of AB/MVT experiments is vital.

A difficult relationship

A few months ago, at a London conference, a Multivariate Testing (MVT) salesman said to me: “We enjoy proving usability consultants wrong!”

Whilst this is quite an extreme viewpoint, it’s clear that not all AB/MVT professionals are sold on the idea of usability. And in a similar way, many of the usability consultants I have worked with over the years have been just as sceptical about AB/Multivariate Testing.

In a strange way it’s as if both parties see MVT as a substitute for usability, and as a result are quite defensive about the whole subject. In fact these two disciplines are totally complementary.

So, in an attempt to heal the wounds, let's do what any relationship counselling session would do first: air each other's issues and concerns.

So here are the kinds of comments MVT companies make about Usability:

  • “Usability testing is not statistically significant due to the small sample sizes” (the sketch below these lists puts some rough numbers on this).
  • “Participants do not behave naturally in a usability lab when there are cameras pointing at them and a moderator asking them questions”.
  • “It’s so expensive as you need to compensate participants”.

And here are some typical Usability and User Experience comments about AB/Multivariate Testing I have heard over the past few years:

  • “Where do the MVT companies get their optimisation ideas (hypotheses)? They don’t seem to be based on evidence or insights?”
  • “How can they create the design variations (recipes) for optimising the website if they don’t have User Experience Professionals, Interaction Designers or User Interface Designers? And they don’t do Usability?”
  • “Everything just seems very random; it’s as if they’re just throwing things against a wall and seeing what sticks”.
  • “How could a change to the colour of a button have resulted in a huge conversion rate increase unless there was an underlying issue with the button in the first place?”

(I realise each list is quite short at the moment so feel free to add your own comments)
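
On that first point about sample sizes, here is a minimal sketch (my own illustration, not from either camp, and the numbers are entirely hypothetical) of the two-proportion z-test that sits behind most AB/MVT significance claims. It shows why thousands of visitors can confirm a modest uplift while a handful of usability participants never could.

```python
# A rough sketch of the statistics behind the "small sample sizes" objection:
# a two-proportion z-test comparing conversion rates for a control and a variant.
# All figures below are hypothetical, purely for illustration.
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, erfc(abs(z) / sqrt(2))                   # two-sided p-value

# An AB test with thousands of visitors can confirm a modest uplift...
print(two_proportion_z_test(conv_a=300, n_a=10_000, conv_b=360, n_b=10_000))
# ...whereas 6-of-8 versus 4-of-8 usability participants cannot reach significance.
print(two_proportion_z_test(conv_a=4, n_a=8, conv_b=6, n_b=8))
```

With 10,000 visitors per recipe, 3.0% versus 3.6% comes out comfortably significant (p ≈ 0.02), whereas 6 of 8 versus 4 of 8 participants does not (p ≈ 0.3). That is exactly the MVT argument, and exactly why the qualitative insight benefits from quantitative follow-up.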

It’s just a question of language 

One of the reasons why Usability and MVT experts struggle to understand each other as well as they should is their use of language.  

They often talk about the same things in a completely different way. And this language barrier as well as their different viewpoints can again make it a little more difficult to get along.

The following are just a few examples of common terms used in the world of Usability and their MVT equivalent:

Usability language → equivalent MVT language:

  • Users → Visitors
  • Recommendations → Hypotheses
  • Design solutions → Recipes, Combinations
  • Issues → Pain-points, Blockages
  • User journeys, paths → Goal funnels, Fall-out reports
  • Personas → Segments
  • Improving the user experience / Making the site easier to use → Improving conversion rates / Reducing drop-off
  • Severity of issues → Business impact, Confidence levels

The following Wordle Word Clouds are snapshots of the Wikipedia page content for "Usability" and "Multivariate Testing".

Try finding the word “user” in the MVT word cloud, or “visitor” in the Usability word cloud! Also, “design” is so prominent in the Usability word cloud, yet so hard to see in the MVT one.

Wikipedia content word-cloud for "Usability":

Wikipedia content word-cloud for "Multivariate Testing":

The future 

UX/Usability guys are wising up to the idea of AB/MVT as a very useful tool for evaluating and fine-tuning all the great recommendations that come out of usability testing sessions and other qualitative research such as surveys and remote usability testing.

Websites such as whichtestwon.com have definitely helped raise the profile of AB and MVT in the user experience arena. The fact that the winning recipes are due to enhancements to the user experience just as often as to persuasive design patterns and messaging reinforces everything the UX and Usability consultants believe in, and makes them want to use the tool.

And the fact that the tools are so readily available to use increases their exposure to the wonders of MVT. Tools such as Google Website Optimizer and Visual Website Optimizer are so accessible, there is no excuse not to have a go and learn all about them, even if you don’t know your “recipes” from your “hypotheses”! Check out the Econsultancy MVT buyers guide.
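
And if you do want to have that go, here is a rough sketch of what any of these tools is doing behind the scenes. It isn't taken from Google Website Optimizer or Visual Website Optimizer (the recipe names, visitor IDs and 50/50 split are all hypothetical), but it shows the basic mechanics: bucket each visitor deterministically into a recipe, then tally visits and conversions per recipe.

```python
# A minimal, assumed sketch of split-test mechanics (not any particular product's API).
import hashlib
from collections import Counter

RECIPES = ["control", "new_button_colour"]   # hypothetical recipe names
visits, conversions = Counter(), Counter()

def assign_recipe(visitor_id: str) -> str:
    """Deterministically bucket a visitor so a returning visitor always sees the same recipe."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return RECIPES[int(digest, 16) % len(RECIPES)]

def record_visit(visitor_id: str, converted: bool) -> None:
    """Log one visit and, if it converted, one conversion against the visitor's recipe."""
    recipe = assign_recipe(visitor_id)
    visits[recipe] += 1
    if converted:
        conversions[recipe] += 1

# Hypothetical traffic, purely for illustration.
record_visit("visitor-001", converted=True)
record_visit("visitor-002", converted=False)
record_visit("visitor-003", converted=True)

for recipe in RECIPES:
    rate = conversions[recipe] / visits[recipe] if visits[recipe] else 0.0
    print(f"{recipe}: {visits[recipe]} visits, conversion rate {rate:.1%}")
```

Hashing the visitor ID, rather than picking a recipe at random on every page view, is what keeps a returning visitor on the same recipe for the life of the test.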

MVT companies are now showing a greater interest in usability, which suggests they are realising its importance, or at least have realised it's a good idea to talk about usability to prospective clients, to allay any fears that the MVT house isn't a user-centric company.

Conclusion

Usability and AB/MVT specialists can and should work together. The disciplines are totally complementary when the individuals from either camp accept the limitations of their own tools and learn to recognise the strengths of the other's.

Finally, working in silos is not the way forward. After all, we're all working towards the same goal: to make your online business more successful.

Please let me know your thoughts, and feel free to disagree with anything I have said. I would love to hear from MVT companies who are usability testing fans, and from usability people who like nothing better than split testing or MVT.


Published 26 August, 2011 by Chris Gibbins

Chris Gibbins is Director of User Experience & Optimisation at Biglight and a contributor to Econsultancy. 


Comments (7)


Richard

They absolutely should work together to help create a 'better' (both for the customer and the site owner) website. Usability may highlight the areas of a site that aren't working within a small sample of data; MVT should then be used to prove that it is a problem over a larger sample, and to help find the solution.

about 5 years ago


Andrew Lockley

Working in conversion rate optimization for SMEs shows me that A/B and usability are two sides of the same coin. I'd never want to finalise a change without testing it in combat. Usability is the quick and dirty intelligent guesswork that shortcuts to some good solutions; A/B is how you prove you're right - and get clients to sign cheques.

about 5 years ago

John D'Arcy

John D'Arcy, Practice Director, Analytics & Insight at Foviance

Chris. It's definitely been common for qualitative and quantitative researchers to be historically wary of each other but I think those barriers have come down and I haven’t met a UX consultant in the last 2 years who hasn’t wanted to complement their work with analytics data.

Though there are some great success stories out there for MVT it is actually still pretty new for most companies and lots of people I talk to have struggled to develop programmes that complement the strategic development of a site. Much of that strategy should come from the work of a UX consultant, providing ideas for site development from UX tests, expert audits and industry reviews. Then the optimisation specialist can get started, deciding with the help of analytics which tests are going to deliver the most uplift and (importantly!) which are practical to implement. I think the two disciplines working together delivers a much more robust approach to creating an optimisation programme and also it’s a lot of fun when you get a UX consultant and MVT specialist in a room sparking ideas off each other.

It's natural for MVT suppliers to be experts in their own technology but not always in complementary technology or techniques such as web analytics, UX and customer insight. So I find they sometimes miss the great ideas that a UX consultant and analyst working together will deliver. Without UX experts you are likely to miss some of the best opportunities for testing, but without analytics and MVT you'll be unlikely to understand the revenue opportunity, optimise as quickly and measure the results of any changes.

about 5 years ago


Lee Duddell

Chris

Great article. We've found that usability testing and content testing (MVT, A/B etc) work very well together. Usability testing provides the user insights that make content testing even more effective.

One of the challenges of content testing is in test design, i.e. what to test. While GA, Omniture etc can give you clues, they don't explain the WHY of user behaviour, so running content tests without the WHY (which is what usability testing reveals) can be a shot in the dark. Oftentimes it's based on hunches - usability testing means you can take the guesswork out of what to test.

almost 5 years ago

Chris Gibbins

Chris Gibbins, Director of User Experience & Optimisation at Biglight

Thanks for all your comments.
John, I totally agree. One of the most challenging aspects for us over the last couple of years has been in developing an inclusive Conversion Rate Optimisation process - one that accommodated all insights, including the rich insights and ideas from the user research and UX consultants.
And it's worth remembering that the MVT teams aren't only there to turn the knobs - they too have a wealth of valuable knowledge that should be brought to the table. Just got to make sure everyone’s sat at the same table!
Lee, yes, the 'what to test' is so important, and usability testing, especially when combined with analytics (and other insights) and a structured MVT process, makes testing far less random - and also less expensive.

almost 5 years ago

Chris Gibbins

Chris Gibbins, Director of User Experience & Optimisation at Biglight

Whilst we're on the topic of how to implement a successful CRO process, I thought I'd let everyone know that RedEye.com are hosting a webinar called "What Test Next?" on 6th October 2011.

almost 5 years ago


Craig Sullivan, Customer Experience Manager at Belron International

The answer is that you should use both.

Qualitative methods (diary studies, depth interviews, traditional tests, remote tests and session capture) are all harnessed alongside MVT and A/B testing in our toolkit.

The other people here nailed it correctly - the usability testing provides insight into the barriers, worries, fears and drivers for persuasion. The CRO tester then drives these inputs into test hypotheses. I've got a good slide on this I should share - where do my inputs from testing come from?

The one thing that UX can't do is account for the quantitative effects, at large volumes, of, say, the validation rules of a field.

A usability test of a postcode lookup system will tell you, for example, if most people can use the interface. It will not tell you if it validates the postcodes of most people correctly.

By using web analytics, session capture tools and complete user centered design lifecycles, we can use testing to give us an extra power up. A bit like the machines in Aliens - it's like wearing extra body armour when fighting for a good user experience that converts well.

Most of the problem these days is that a good user experience may not convert as well, for reasons that usability tests don't uncover. Is a usability test going to find a browser compatibility bug, for example?

Alas, MVT and A/B testing on their own will typically lack the direction and inputs they need, without the qualitative insight to provide steering.

Lastly - none of the UX and testing can be harnessed without people, without talent and without will.

almost 5 years ago
