Here are a few highlights from the survey of nearly 900 client-side and supply-side marketers.
What methods are used to improve conversion rates?
Multivariate testing (MVT) continues to be the method bringing the most significant improvement and increased satisfaction rates, consistent with last year’s results.
However, multivariate testing is still struggling for adoption, currently used by only 30% of respondents.
A/B testing is unsurprisingly at the top of the pile, currently used by well over half of respondents.
Customer feedback also features prominently, something which Dan Barker, independent consultant, finds encouraging.
Customer feedback is the heartening point here: it’s ranked almost as highly as A/B testing as a method companies use to understand and improve their results.
In the past sometimes optimization was seen as almost synonymous with A/B testing, and it’s therefore good to see other tactics have grown to similar levels of use.
Customer feedback is also positive as, unlike A/B testing, it can capture the one or two really unhappy VIP customers and allow you to act on them, as well as providing the broad-brush 'aggregate level' results A/B testing allows.
Which of the following methods do you currently use to improve conversion rates? (Company respondents = 449)
What do companies test?
The chart below shows some interesting trends in testing. A large majority of respondents are testing those factors that are perhaps easiest to test (calls to action – 83%, page layout – 77% and copy – 72%).
Other areas that have a large impact on conversion, such as the checkout process (46%), are still being tested by fewer than half of respondents. We may speculate this is due to limitations of technology (CMS, third-party testing tool or in-house tech).
Only 32% of respondents are testing site search, an area that Suniel Curtis, Head of Analytics at Hays, thinks could offer competitive advantage.
It is understandable that the complex area of search functionality is only tested by a third of respondents, and of course not everyone has search.
With the vast choice consumers now have, optimising search and filtering is becoming a big issue so I see this rising as an area to test.
A further section of the report looks at the number of tests run per month and finds the optimum number (for clients’ perception of both ‘satisfaction’ and improvement in conversion rate) is three to five. Anything more or less and improvement falls.
As was the case in last year’s survey, there’s evidence that improvement and satisfaction increase as the number of testing methods used increases.
However, even among companies using more than seven methods, only 41% are satisfied with conversion rates.
Specifically for your website, what do you test? (2015 company respondents = 335)
Which methods are easiest to implement?
Website personalization and multivariate testing top the pile for ‘difficulty to implement’.
Of course, personalization is a broad church, but it’s perhaps no coincidence that two of the lesser seen methods of improving conversion rate (see chart above) are deemed difficult to implement.
Though few find usability testing 'very difficult to implement' (8%), it's surprising to see 44% deem the tactic 'quite difficult', given how mature the discipline is.
A further question in the survey asked about team structure and investment in conversion rate optimization.
The results show that organizations with dedicated resource are 16-20% more likely to see improvements in conversion rates.
Of course, there are a lot of qualifiers here, but this is a reassuring correlation all the same, given not all those investing in CRO have seen an uplift.
How difficult is it to implement the following methods for improving conversion rates? (Company respondents = 207)
For more on conversion strategies and tactics organizations are using, in addition to the tools, processes and resource employed for improving conversion rates, download the seventh Conversion Rate Optimization report, in association with RedEye.