Sarah Chambers is Site Operations & Development Manager at fashion retailer Radley and Co, focused on optimising the website to improve customer experience and conversion rates.

Her role involves overseeing a data-driven approach to website optimisation that combines analytics, usability, design and testing. We asked her a number of questions about her approach, and the company’s work with RedEye to improve conversion rates.  

The latest Econsultancy / RedEye Conversion Rate Optimization Report has found that companies are more likely to succeed if they have a structured approach to this discipline, with someone responsible for making this happen. What processes or frameworks do you have in place to optimise online performance?

Structure is definitely an important factor when it comes to how we improve conversion here at Radley.

When it comes to improving conversion there is so much to do, and it is difficult to know where to start. We follow a strict optimisation strategy: defining goals, conducting research, prioritising what to test (producing a testing roadmap), testing and, lastly, analysis for ongoing improvement. The customer experience is extremely important to us (if we can get this right, everything else will fall into place), so everything is centred on the user.

Having a structured ‘UX-driven’ approach to CRO not only ensures we continually focus on the customer, but has helped us stick to the testing roadmap and carry out the right A/B and multivariate tests on our website.  

Can you provide a little more detail about this approach?

Our structured approach consists of:

a) Investigating analytics, conducting ad hoc usability reviews and carrying out usability testing with eye tracking every 12 months.

b) Recording all insights and ideas in a single dashboard. This is very important: it ensures we don’t lose them, keeps them easy to reference and helps us prioritise testing effectively.  

Many of our ideas come from the insights in the first step. Others come from best practice examples we come across or read about, and others still come from the customer services team, who talk to customers every day. The key is holding everything in one place for one overall analysis.

c) We run a ‘what to test’ workshop every quarter, hosted by our agency, where we review latest test results, research insights and discuss and prioritise the exciting new test ideas.

d) The agency’s UX/UI designers produce the designs and present them in a test plan along with the hypothesis, conversion goals and so on.

We give approval and they implement and launch the tests. It’s really important to us that all test variations are designed by experts who understand UX Design and usability. This ensures our test ideas are given the best chance of succeeding.

e) Test results are presented to the business stakeholders, and we discuss how much improvement has been achieved (uplift and ROI). The most important conversion metrics for us are conversion rate, average order value and revenue per visitor, but we also look at micro-conversions such as engagement, because how our customers interact with us is equally important.

f) Immediately following test completion we launch the next test and continue the process.
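To make the metrics in step e) concrete, here is a minimal illustrative sketch (not Radley's or RedEye's actual reporting code; all figures are invented) of how conversion rate, average order value, revenue per visitor and uplift relate to one another:

```python
# Illustrative sketch with hypothetical numbers: computing the core
# conversion metrics for a control vs. variant comparison.

def metrics(visitors, orders, revenue):
    """Return conversion rate, average order value and revenue per visitor."""
    return {
        "conversion_rate": orders / visitors,
        "aov": revenue / orders,        # average order value
        "rpv": revenue / visitors,      # revenue per visitor
    }

def uplift(control_value, variant_value):
    """Relative improvement of the variant over the control."""
    return (variant_value - control_value) / control_value

control = metrics(visitors=10_000, orders=200, revenue=9_000.0)
variant = metrics(visitors=10_000, orders=230, revenue=10_810.0)

cr_uplift = uplift(control["conversion_rate"], variant["conversion_rate"])
print(f"Control CR: {control['conversion_rate']:.2%}")  # 2.00%
print(f"Variant CR: {variant['conversion_rate']:.2%}")  # 2.30%
print(f"CR uplift:  {cr_uplift:.1%}")                   # 15.0%
```

Revenue per visitor is often the most decision-relevant of the three, since a variant can raise conversion rate while lowering average order value.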

Our UX-driven approach to conversion rate optimisation has really worked for us at Radley because we’re such a customer focused company.

Having always been an advocate of user research as well as analytics, I felt it was vital that we also took a user-centric approach to A/B and multivariate testing.

It’s my responsibility to make sure our website provides the best possible customer experience. You wouldn’t be happy to know a customer received a bad experience in a traditional bricks and mortar store. A bad experience will no doubt cost you the customer and any future sales.  Why should online be any different?

Are there any particular technologies or processes which have proved especially successful?

We have used other tools in the past, but the Optimizely testing tool has been the easiest to use and the one we intend to stick to for the time being.

Eye tracking in the usability testing sessions was extremely useful for seeing what our users were not seeing. For example, this helped us to understand which information on the product pages needed greater prominence.

We then carried out a series of product page A/B/n tests where our agency (RedEye) completely redesigned the product pages and tested these out using the testing tool. We achieved a double-digit increase in conversion rate from this series of tests.
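Before declaring a win from a test like this, the observed difference needs to be checked for statistical significance. The sketch below shows one common way to do that (a two-proportion z-test); the numbers are invented and this is not RedEye's or Optimizely's specific methodology:

```python
# Hypothetical example: two-proportion z-test for an A/B test result.
# Figures are invented for illustration only.
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Control: 300 orders from 10,000 visitors; variant: 360 from 10,000.
z, p = two_proportion_z_test(300, 10_000, 360, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Difference is statistically significant at the 95% level")
```

Testing tools such as Optimizely perform this kind of calculation automatically, but understanding it helps when deciding how long to run a test before trusting the result.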

We learn a lot from all our A/B and multivariate tests in a similar way to how we learn from usability testing or surveys. And we have used all these findings to inform the recent replatforming and site redesign project and to help ensure we provide the best possible online experience for our customers.

Of course, the other extremely important factor to success is having a dedicated agency to implement all the tests. The right technology is important; finding the right people to manage the process is essential.

As well as the expertise we receive, we are also able to test without having to involve the IT/web dev team. They have enough to do already, and this approach reduces cost and risk while still optimising the site. They get involved only once we’ve proven the concept.

Can you talk more about the usability testing work you have done, and what actions have resulted from that?

It all started with the first round of usability testing with eye tracking, which we carried out in 2010 at our agency’s research labs. Analytics data had already highlighted where the pain points were on the site, but we had no real evidence to explain ‘why’ certain pages were causing users problems.

Going straight into A/B or multivariate testing at that point would have been like taking a shot in the dark with what to test, and would no doubt have cost us more time and money. By doing the usability work first we could experience first-hand where users were struggling. Often, it was with elements we just took for granted.

We found out so many things about our website. However, probably the most valuable outcome wasn’t just the list of findings and recommendations; it was the effect it had on the business stakeholders, who watched first-hand how their customers shopped on the Radley website.

It was a real eye-opening experience for everyone. And as the right people were in the room and got to see all the highlight clips, we got the buy-in we needed for the whole conversion rate optimisation process.

What work have you done to reduce checkout abandonment?

Funnily enough, following the usability testing, the first major piece of work we did was a checkout redesign using a UCD [user-centred design] approach. To do this we worked with RedEye to create prototypes for a number of different variants. These prototypes included advanced JavaScript software to fully replicate each proposed user interface.

RedEye then subjected the prototypes to further usability testing in order to make an informed decision about what changes would have the greatest impact on user experience and checkout conversion. Following the testing sessions, final amendments were made to the prototypes. 

This in-depth level of user testing ensured we were confident about what changes we were making. It can be scary when changing something as crucial as the checkout process. Integrating user testing into the development process was definitely vital to the success of the project.

RedEye was always on hand to offer advice and our successful collaboration ensured that we maintained a consistent user experience in these key functional areas of the site even though the look and feel had a radical change. 

How do you approach split- and multivariate testing? Has this been effective? 

Again, this is all part of our structured process. With all findings (including usability testing) held in one place we can assess exactly what needs to be tested and how.

We produced a full testing plan which allows us to use the results from each test effectively to plan the next test. We also hold workshops every quarter to discuss and prioritise what to test next.

To what lengths do you go to optimise for smartphones and tablets?

We have recently reworked our website to use responsive templates, with a view to providing an optimal viewing experience across devices. We plan to optimise this further in 2013.

Our recent How the Internet Can Save the High Street report suggested that retailers could do more to use digital to help increase bricks and mortar revenues. How does Radley ensure that online complements offline and vice-versa?

This is definitely important and something we plan to focus more on in the future. 

To what extent do you think retailers can get a single view of the customer, across online and offline activity?

The only answer to this is the integration of data, and a structured marketing strategy. Just integrating all online activity (web analytics, usability, A/B/MVT in one place) has had a huge impact on how we can communicate to users online. Add offline to the mix and your strategy is likely to improve even further.

It is very likely that many of our online customers are also offline customers. However, a multichannel customer should be treated differently to a single-channel customer, and vice-versa.

But the only way to know this for sure is through full data collection and analysis. We are already integrating offline research such as feedback from our customer services team into our findings when planning testing and it has proved to be very worthwhile.

What single piece of advice would you give to other online retailers looking to improve website conversion?

Test, test, test! If you’re not improving, you’re falling behind. Let the customer define the experiences that work for them, and use your findings to inform, influence and unite all decisions based on customer experience.