
This week I'm writing for the SME audience on Econsultancy, a sweet homage to the much debated yet scarcely implemented topic of website testing. Testing should be an integral part of your strategy, with the testing plan being the culmination of analysis of web data and voice-of-customer feedback.

If you are not yet testing but are spending money on marketing and advertising, then read this post and do something about it. Testing is not the preserve of major brands with big budgets; it's the mental gap you need to cross, not the financial one.

Smack in the face with numbers

Amazon tested a form with two fields, two buttons and one link. Couldn't be simpler, could it? Yet the form was preventing customers from buying. After investigation, the designers changed one button from 'Register' to 'Continue'.

The results: the number of customers purchasing went up by 45%, and over 12 months this generated an additional $300m. Yes, I said $300m. Don't believe me? Read Jared Spool's article on User Interface Engineering.

Why should I test?

How does anybody know the best possible page layout and design to cater for the needs of thousands of different visitors? Unless you have a zero bounce rate and 100% click through, your page can be further optimised.

Plus personal opinion and politics always rear their ugly heads. With the best intentions, people often second guess what customers want because they “know the business”.

If your finance bods are tightening the purse strings (after all, it's still not rosy out there in the economy), then give them the basic equation. Take your worst-performing page that has high visitor numbers, then project the revenue uplift from conversion increases of 10%, 20%, 30% and so on. Show them the numbers.
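That back-of-envelope maths is a few lines of arithmetic. Here's a quick sketch — the visit, conversion and order-value figures are invented for the example, so swap in your own:

```python
# Hypothetical figures for one poorly converting, high-traffic page.
monthly_visits = 20_000
conversion_rate = 0.02          # 2% of visitors currently buy
average_order_value = 50.0      # revenue per order

baseline_revenue = monthly_visits * conversion_rate * average_order_value

# Project the monthly revenue uplift for conversion increases of 10%, 20%, 30%.
for uplift in (0.10, 0.20, 0.30):
    new_revenue = monthly_visits * conversion_rate * (1 + uplift) * average_order_value
    print(f"{uplift:.0%} conversion uplift -> {new_revenue - baseline_revenue:,.0f} extra per month")
```

Put those projected numbers in front of finance and the conversation usually gets easier.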

How should I test?

Knowing what to test requires analysis and an understanding of relevance. You might have a page with a 100% bounce rate - should you automatically invest in testing to improve it? Not necessarily. What if that page has only two visitors per month? Even with 100% conversion, the revenue return is minimal. What if that page is a static store information page with a contact phone number? Bounce could be a good thing, as you are directing people to the store.

So you need to find relevant pages. Build an engagement matrix for all site pages - look at visits, time on page, bounce rate, click through, conversion, revenue, average order value etc. Then filter with a relevancy formula:

Traffic * Bounce * Conversion * Average order value

Pick the pages that are losing you the most customers/money. Put these at the top of your testing plan.
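The relevancy formula above is simple to apply once you've exported your engagement matrix. A minimal sketch, using made-up page metrics (the URLs and numbers are purely illustrative):

```python
# Hypothetical page metrics exported from your analytics tool.
pages = [
    {"url": "/checkout/register", "traffic": 8000, "bounce": 0.60, "conversion": 0.02, "aov": 45.0},
    {"url": "/store-locator",     "traffic": 120,  "bounce": 0.95, "conversion": 0.00, "aov": 0.0},
    {"url": "/product/widget",    "traffic": 3000, "bounce": 0.40, "conversion": 0.05, "aov": 60.0},
]

def relevancy(page):
    # The article's formula: Traffic * Bounce * Conversion * Average order value
    return page["traffic"] * page["bounce"] * page["conversion"] * page["aov"]

# Rank pages by relevancy score; the top entries head the testing plan.
for page in sorted(pages, key=relevancy, reverse=True):
    print(page["url"], round(relevancy(page), 2))
```

A high score means high traffic, high bounce and meaningful revenue at stake - exactly the pages worth testing first.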

If you haven't already got a customer survey running, add one quickly. SurveyGizmo is an excellent low-cost solution, with a Pro licence at $49 per month. You can create sophisticated surveys with in-built logic and add them to your website, as well as emailing your database. There are other tools out there, but this one has the best recommendations (endorsed by @OptimiseOrDie at the Econsultancy Online Marketing Masterclass event in London). I've used it myself - it's intuitive and easy to publish surveys quickly.

Where do I start and what do I use?

Start with simple A/B testing. Your data analysis has told you which pages are losing you the most business. Your customer survey will highlight some of the potential reasons. Now create some hypotheses for testing.

e.g. customers said they don't like the registration form in the checkout; it's too complicated, and the bounce rate from this page is 60%.

Create two new versions and test against the current version:

  • Version one: clearer signposting & help information.
  • Version two: data capture fields in two columns instead of a scrolling page.
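Tools like Website Optimizer handle the traffic split for you, but the underlying idea is easy to sketch: deterministically bucket each visitor so they always see the same version on repeat visits. A hypothetical illustration (the experiment name and visitor IDs are invented):

```python
import hashlib

VARIANTS = ["control", "version_one", "version_two"]

def assign_variant(visitor_id: str, experiment: str = "checkout-registration") -> str:
    # Hash the visitor ID together with the experiment name so each
    # visitor is assigned the same version every time they return.
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

print(assign_variant("visitor-1234"))
```

Hashing rather than random assignment matters: a visitor who bounces between versions mid-purchase would muddy your results.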

You don't have to spend thousands doing this. Follow these six simple steps (even I've managed this!):

  1. Set-up a Google Website Optimizer account for free.
  2. Generate the testing code yourself.
  3. Give this to your developers to add to the page.
  4. Get your designer to create the test versions of the page.
  5. Tell Optimizer what the URLs are.
  6. Let the testing begin.

As you learn more, you can increase the sophistication of testing and embrace MVT (multivariate testing). A/B tests different versions of a page against each other, MVT tests multiple variations of elements within a page. MVT is ideal for pages with multiple elements such as text links, buttons, images, javascript functions etc.

Reports will show you which page is performing the best. You can then implement the winning page. The testing doesn't stop there, you can then create a new test for the same page and re-run the fun.
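Before implementing the "winning" page, it's worth checking the difference isn't just noise. A simple two-proportion z-test - a standard statistical check, not anything specific to Optimizer - sketches the idea, with invented results:

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is version B's conversion rate genuinely higher?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 5,000 visitors saw each version.
z = z_score(conv_a=100, n_a=5000, conv_b=145, n_b=5000)
print(f"z = {z:.2f}")  # above roughly 1.96 means significant at the 95% level
```

If the test hasn't reached significance, keep it running rather than declaring an early winner.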

Summary

  • Testing is a simple concept - you don't have to rely on an analytics specialist or agency
  • With simple steps you can do this in-house to prove the business case
  • Testing is an ongoing process – unless you’ve got 100% conversion, there is room for improvement
  • Focus on pages that can deliver the greatest financial uplift – that will keep the finance bods content
  • Use online forums, Google help centre & official Google Website Optimizer blog to get the basic facts and learn from others.

Don’t bite off more than you can chew

Analytics and testing experts are worth their weight in gold. However, many businesses struggle to take the first steps because they can't afford to pay a specialist. You can implement simple A/B and MVT testing yourself to prove the business case.

Once the business case is established and your Board is behind you, it then often pays to bring in the specialists to manage a more complex MVT program, integrating this with your analytics tools to drive customer insight and commercial value.

Good luck!

James Gurd

Published 23 November, 2009 by James Gurd

James Gurd is Owner of Digital Juggler, an ecommerce and digital marketing consultancy, and a contributor to Econsultancy. He can be found on Twitter, LinkedIn and Google+.


Comments (14)


Mark Bolitho, New Business Director - Ecommerce at more2

Excellent article, James - we've been banging this drum for some time now.

Just one thing I'd like to add: a business will never convert 100% of visitors, and there's likely to be a global maximum for the sector, so any cost/benefit analysis should take this into account.

The theory, though, is a good one. That said, I speak to people who run tests but who still have howlers on their sites, so it's clear that the initial analysis and, most importantly, the interpretation of the stats or situation is the key to success.

about 7 years ago

James Gurd

James Gurd, Owner at Digital Juggler

Mike - shameless marketing plug - the article is about website optimisation in relation to A/B and MVT testing, not load times, concurrent users and application stability. Please try to stay on topic.

Mark - thanks for the feedback. Yes, agreed that the chances of converting 100% of visitors are so remote it is not a realistic target. I just wanted to illustrate the point that everybody has an opportunity to optimise their website further. Amazon doesn't stop just because it has had a success.

Benchmarks are good when they are relevant to your brand, products and customers - I'm always wary of comparing yourself against an industry average that can be misleading.

For example, UK retail conversion (visit-to-order) averages around 2 to 3% - however, a premium brand offering products that are £400 + with an extensive store network is unlikely to achieve that high a conversion online, whereas a high street fashion brand is likely to achieve considerably more.

As with any change to a site that impacts your customers, you are right that analysis and interpretation are key.

Thanks

james

about 7 years ago


Mark Bolitho, New Business Director - Ecommerce at more2

Hi James

You're right about benchmarks - whilst the Internet is a great leveller in some respects, the gravitas of one brand will mean it will most likely out-perform a lesser known shop - and this is just one of the multiple influences on conversion rate.

But this isn't to say that lesser known brands / companies shouldn't bother to test - quite the opposite. And, the beauty of this methodology is that knowing when to stop becomes less of an issue if using a free tool, some best-practice theory and keeping opportunity cost to a minimum.

We've found that things are incredibly contextual, and I'd warn smaller outfits of the perils of mimicry - how many times have we seen the Amazon buttons crop up? What a big psychological turn-off this can be for customers - contrary to the aim of the exercise.

As with all things web, it's a learning curve and the article should serve as a good spur to get people on it - testing is the future so start now and get used to thinking this way.

about 7 years ago

James Gurd

James Gurd, Owner at Digital Juggler

Hi Mark

Yes, agree that context is important - that's why I can't stand the term "best practice". There is good practice and there is relevance, and each website owner needs to find what these are for their unique audience.

Thanks for the comment, useful additions.

My only worry with articles like this is that those not doing it think, "that sounds great", but then take no action and return to the status quo. I really hope it encourages at least one person to invest time in testing. I live in hope!

James

about 7 years ago


Mark Bolitho, New Business Director - Ecommerce at more2

Hi James

I think there's theoretical best practice, in so much as it's pretty widely documented these days that a cumbersome, long-winded checkout process isn't good, for example, or that navigation and filtering and calls to action should be relevant and clear.

It is, of course, ultimately the testing methodologies you outline that are the key to uncovering the exact contextual best practice.

The thinking will change, testing will become the new SEO over the next couple of years. There will be some that do it well, and many who will not.

It's great to see the Internet evolving - I wonder what will be next?

about 7 years ago

James Gurd

James Gurd, Owner at Digital Juggler

Hi Mark

I genuinely hope that testing takes more prominence in eCommerce budgets. I also hope that it doesn't become the media circus that is social media where everyone is suddenly an expert.

I've met a lot of people who claim to be testing/analytics experts, but actually only one of them has the commercial acumen to understand that you have to have a business case and prove the ROI of the investment - which means think big but start small.

If people spend money on what is practical instead of a massive consultative process with the only output being a document, then I think we will see more uptake.

Thanks for the discussion.

james

about 7 years ago


Mark Bolitho, New Business Director - Ecommerce at more2

Hi James

Yep, me too.

It's early days for us as a company and our new platform, but it's something I pursue with all our clients after they've been live for six months or so.

It's funny actually, as it's one of the main reasons why most of our clients chose us, yet only a small percentage of them have taken up the conversion testing after-sales service. The reason for this is budget nine times out of ten, so I think we've got the message across regarding the theory and potential.

As most of our clients have come from other providers, their budget has been focused on the new platform itself, and the testing element is something they've vowed to revisit once the re-platforming is done and the dust has settled.

It's paid dividends for those that have, which fuels our argument, and the others tell me they WILL be on it next year, as they realise they now need to work it into their marketing budget to try and gain an edge and make the marketing itself more effective.

We too, live in hope!

Cheers,

Mark.

about 7 years ago


dan barker

hi, Mark, how are you doing?

might be a silly suggestion, but why not add a series of post-launch tests as a default option in your replatforming package?

probably easier for a client to build that into their budget than have to go back a second time immediately after replatform for 'yet more' money?

dan

about 7 years ago


Mark Bolitho, New Business Director - Ecommerce at more2

Hi Dan, very well, thanks.

We considered that, but like to keep it modular and give people the choice - nothing is compulsory here (well not much, anyway)!

In our experience it's best to let things settle for at least 3 months anyway, and clients certainly wouldn't want to pay that far up-front.

As I said, we are finding that the take-up is happening. As James points out in his original post, the battle is first of all with changing the thinking to a testing mentality - the rest will follow, in my view. Once people are sold on an idea, they make moves to find the financial resource.

We've got the point across, and we're expecting next year to be a bumper one for testing.

Cheers.

about 7 years ago

Mark Patron

Mark Patron, Consultant and non-exec director at Patron Direct Ltd

Hi James,

Good article. More testing is probably the best investment online marketers can make. Just look at Amazon. The "Conversion Report" research we did with Econsultancy produced empirical evidence that more testing means better results.

Thanks for the article

Mark

about 7 years ago

James Gurd

James Gurd, Owner at Digital JugglerSmall Business Multi-user

Hi Mark

Most welcome - I like a quote that Joel from Dobbies gave me on Edison:

james

about 7 years ago


Mark Bolitho, New Business Director - Ecommerce at more2

Hi James

Now that's a whole new subject...the client perception is most often that a test should bring a positive result, as opposed to a negative one.

It takes something to convince them their money has been well spent in ruling something out, especially when the hypothesis for the test has come from the likes of you or I!

This has to be an open and up-front discussion ahead of embarking on any testing strategy, and a good, close working relationship with the client will be necessary. Managing expectations of testing is important - most won't be 'big' tests that will make them rich overnight, but improvement in small increments, sometimes with even a negative result.

If every test went like the Amazon one you cite in your original post, we wouldn't need to be having this discussion!

Mark.

about 7 years ago

James Gurd

James Gurd, Owner at Digital Juggler

Yep agree - setting expectations and not whipping people up into a frenzy of expectation for the gold rush is essential. That is where the experience and integrity of the person making the recommendations comes into play.

Thanks

james

about 7 years ago


One IT - NZ Cloud Hosting

What a great post! I didn't even think to do tests to make sure customers know what they're supposed to be doing.

almost 6 years ago
