Six months on, the effects of Econsultancy's site change are only now becoming clear.

This is what we have learned so far...

It has been a busy first half of 2015. Those who have been using us for a while may well remember that we relaunched our website back in late November. Some of you still may not have forgiven us for this; others thought it was a massive improvement.

It has definitely been the cause of much comment.

Why we redesigned the site

The rationale behind the change initially came from three things: aspects of our historical code base that were blocking development, fatigue with the old site's look and feel, and our move into Centaur Media's portfolio, which is bringing new thinking about the future of the service.

Customer experience is one of the cornerstones of our published Modern Marketing Manifesto, so having a website that was not responsive (or device-optimised), lacked usable functionality (such as site search) and didn't offer the architecture needed to present our full range of services was not a kind reflection of us 'walking the walk'.

The redesign provided the best first step towards thinking about how customers engage with touchpoints such as our events, training courses and bespoke consultancy products – a topic for follow-up posts appearing shortly.

Deciding on what to change

Econsultancy staff, our contemporaries at Centaur Media and our audience are never shy about giving an opinion, and finding a way through the differing views was difficult.

We worked with Foresee to understand the drivers of customer satisfaction with regard to our website, identifying the revised look and feel that would have the greatest effect on our initial goals. As Clare reported in her post back in November, a new personality was created, new functionality was built and a platform with which to grow was established.

So what happened?

Clare signed off her post by suggesting this was a work in progress. In truth, relatively little visible change has been completed since, beyond bug fixes and some business process amends.

Our partners at Foresee Results Ltd continued to record customer satisfaction metrics, while our own tracking predominantly relied on quantitative insights (engagement metrics, funnels, conversion goals etc.), along with reporting of both positive and negative feedback. A large proportion of this feedback was actually generated in response to Clare's original post.

The site's responsiveness was well received: mobile devices have seen a 29% increase in sessions, a reduction in bounce rate and increased session duration, and feedback has been very good. Our new search functionality also showed greater engagement, with reduced bounces and fewer exits.

However, it wasn't long after launch that we started to report minor drops in our core quantitative metrics – lowered visit engagement, falls in traffic volume (including issues attributable to SEO) and reductions in conversions – but nothing significant. And because the site change had taken some time (!), we found that the business goals had also shifted since the project's inception.

Those shifting goals were generating further changes to the site that coincided with the delayed launch, muddying the picture of what was causing the results we were seeing.
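
As an aside, "nothing significant" is a claim you can test. Here is a minimal sketch (in Python, using purely illustrative numbers rather than our real data) of a two-proportion z-test – a standard way of checking whether a dip in conversion rate between two periods is bigger than chance alone would explain:

    import math

    def two_proportion_ztest(conv_before, n_before, conv_after, n_after):
        # Two-sided z-test for a difference between two conversion rates.
        # conv_*: converting sessions; n_*: total sessions in each period.
        rate_before = conv_before / n_before
        rate_after = conv_after / n_after
        # Pooled rate under the null hypothesis of no real difference.
        pooled = (conv_before + conv_after) / (n_before + n_after)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_before + 1 / n_after))
        z = (rate_after - rate_before) / se
        # Two-sided p-value from the normal CDF, computed via math.erf.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical figures: a small post-launch dip in conversion rate.
    z, p = two_proportion_ztest(540, 18000, 500, 18200)
    print("z = %.2f, p = %.3f" % (z, p))  # roughly z = -1.44, p = 0.150

With these made-up figures the p-value is around 0.15, so a dip of that size on that much traffic could easily be noise – which is the kind of judgement we were making above.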

So what have we learned?

  • Customer experience/product management must be considered at a strategic level with clear lines of responsibility across all business functions.

In practice, using the platform as a work in progress has been tough. Initial bugs were fixed, yet usability fixes fell into discussion loops and lower-priority queues (note: as with our grey text!). Internal team changes made it difficult to uphold responsibility for customer experience. The process highlighted the real need for customer champions and a joined-up approach across functions.

  • Phase 2 never happens

Ambition must be reasonable. Make sure each project stage results in a well-thought-through end point for the customer, not what is convenient for the developers, project managers or senior management.

In our case, the fact that many of the cooler aspects of the original design were left on the drawing board – culled into a 'phase 2' list – demonstrates over-ambition, and we are the only people who know how great the site was planned to be. If scope is cut mid-project, customers know nothing of this (or of how good things might be later); they just have the results of what has been deployed today, while the business works on what is next.

  • Showcase and test your changes with actual customers to iron out this stuff before you launch

I’m not even going to comment on this one. We all know it. We don’t always do it. We should.

  • Keep a handle on project scope, and on how you will measure whether the project is working

The inevitably late delivery of a site redesign will coincide with changes to the business structure; it's impossible for it not to. In our case, our subscription business was changing, so those changes were unavoidably launched at the same time. Allowing changes outside the project scope that also affected customer experience made understanding cause and effect even harder.

Approaching post-site reviews

We turned to customer experience analysis to shed light on these issues.

We have continued working with Foresee in order to compare results before and after the site change, using the same triggers and question sets.

The results were exceptionally useful. They showed that we still scored well against market benchmarks, yet we saw no specific increase in customer satisfaction scores; in fact, on some indicators we were scoring worse.

Look and feel was still the number one aspect needing review, which is frustrating for our development team after they spent so long putting the current iteration together.

Alongside this analysis, individual customer journey testing was undertaken with WhatUsersDo. This reflected the issues identified through Foresee: the interface itself was problematic, and specific issues dogged most of the tested journeys.

We'll pick apart these insights – what we hoped would happen versus what actually happened – in further blog posts.

Customer Experience – rocket science?

Maybe not, but it is very hard to get right. As we have shown, the Econsultancy site redesign was an ambitious endeavour, and it was only meant as preparation for a much larger project. Keeping customer experience front of mind across the business has proven the toughest challenge over time.

The qualitative element of our analytics has proven invaluable, as the raw site metrics have not changed massively. We have seen less engaged visits (fewer page views per visit, less time spent on the site etc.), and many of these differences could be explained away by the differences between the new and old journeys.

It's only through having qualitative data that we can begin to understand the experiential aspects that have and have not worked, and most importantly why.

We’d be very interested in hearing your comments on this, after all it’s your experience that counts...


Published 5 August, 2015 by Ben Barrass @ Econsultancy

Ben Barrass is Head of Data and Analytics at Econsultancy/Centaur Media. You can connect with him on LinkedIn.



Comments (5)


Pete Austin, CINO at Fresh Relevance

Great article – I especially like "Phase 2 never happens", which is usually true but which people so rarely admit. Protip: it happens less with startups (possibly because they have less technical debt) and is a major reason why they can overtake established competitors.

Obligatory link to the Project Management Tree Swing Cartoon, for those few people who haven't seen it.
http://www.tamingdata.com/wp-content/uploads/2010/07/tree-swing-project-management-large.png

over 2 years ago


Kevin Barnes, Head of Ecommerce Trading & Development at Office Shoes & Offspring

Refreshing to get such a candid report on this type of project, well done.
Did you MVT your 'NEW' site on desktop and mobile devices prior to making the changes?
And if not, will you consider MVT for future updates? It can give a huge amount of statistical insight, and if you team up with a user-recording service you can drill down to individual experiences – though this is of course extremely time-consuming!

over 2 years ago


Tony Preedy, Marketing Director at Lakeland

Thank you for the candid article. Many of us have been through the same process and it's great to see such an honest appraisal.
Normalising KPIs for "before" and "after" when traffic trends by device type are in flux is extremely difficult; it makes detecting the signals in the noise very hard. My career to date has been spent trying to distinguish between causality and mere correlation, particularly when many use selection bias to favour data that 'proves' an argument while disregarding data that points in another direction. Similarly, luck or chance is frequently mistaken for skill. I'm guided by the work of Nassim Taleb and Nate Silver. Meanwhile, back to the task of making our business better and easier to use for our customers.

over 2 years ago


Deri Jones, CEO at SciVisum Ltd

Tony: nice to hear Taleb and Silver get a mention – too many eCommerce guys avoid the complexity/bother to 'distinguish between causality and mere correlation', as you say!

over 2 years ago


Donna Akodu, Marketing Planning Executive at Chartered Institute of Personnel and Development

Thank you for your insightful post. A lot of food for thought here. Loving the honesty – that's really refreshing.

over 2 years ago
