It has been a busy first half of 2015. Those who have been using us for a while may well remember that we re-launched our website back in late November. Some of you may still not have forgiven us for this; others thought it was a massive improvement.

It has definitely been the cause of much comment.

Why we re-designed the site

The rationale behind the change initially came from aspects of our historical code base blocking development, weariness with the old site’s look and feel, and our move into Centaur Media’s portfolio, which is bringing about new thinking on the future of the service.

Customer experience is one of the cornerstones of our published modern marketing manifesto, so having a website that was not responsive (or device optimised), lacked usable functionality (such as site search) and didn’t offer the architecture required to portray the range of services available was not a kind reflection of us ‘walking the walk’.

The redesign provided the best first step to begin thinking about how customers engage with touchpoints such as our events, training courses and bespoke consultancy products, a topic for follow-up posts appearing shortly.

Deciding on what to change

Econsultancy staff, our contemporaries at Centaur Media, and our audience are never shy about giving an opinion, and finding a way through the differing views was difficult.

We worked with Foresee to understand the drivers of customer satisfaction with regard to our website, identifying a revised look and feel that would have the greatest effect on our initial goals. As Clare reported in her post back in November, a new personality was created, new functionality was built and a platform with which to grow was established.

So what happened?

Clare signed off her post by suggesting this was a work in progress. In reality, relatively little visible change has been made since, beyond bug fixes and some business process amends.

Our partners at Foresee Results Ltd continued to record customer satisfaction metrics, while our own tracking predominantly operated on quantitative insights (engagement metrics, funnels, conversion goals etc.), along with reporting of both positive and negative feedback. A large proportion of this feedback was actually generated in response to Clare’s original post.

The site’s responsiveness was well received. Mobile devices have seen a 29% increase in sessions, a reduction in bounce rate and increased session duration. Feedback has also been very good. Our new search functionality demonstrated more engagement through reduced bounces and fewer exits.
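For readers curious how numbers like these are typically pulled together, here is a minimal sketch in Python (pandas) of comparing engagement by device before and after a relaunch. The CSV export, its column names and the launch date are assumptions for illustration only, not our actual reporting pipeline.

```python
# Hypothetical sketch: compare session engagement before and after a relaunch,
# split by device category. File and column names are assumed for illustration.
import pandas as pd

LAUNCH_DATE = pd.Timestamp("2014-11-25")  # assumed relaunch date

sessions = pd.read_csv("sessions_export.csv", parse_dates=["date"])
sessions["period"] = sessions["date"].map(
    lambda d: "post-launch" if d >= LAUNCH_DATE else "pre-launch"
)

summary = sessions.groupby(["device", "period"]).agg(
    sessions=("date", "size"),                 # session counts
    bounce_rate=("bounced", "mean"),           # share of single-page visits
    avg_duration=("duration_seconds", "mean"), # average session duration
)

print(summary)
```

A comparison like this only shows that the metrics moved, not why; that is where the qualitative work described below comes in.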

However, it wasn’t long after launch that we started to report minor drops in our core quantitative metrics – lower visit engagement, falls in traffic volume (including issues attributable to SEO) and reductions in conversions – but nothing significant. As the site change had taken some time (!), we found that the business goals had also shifted since the project’s inception.

These new goals were generating changes to the site that coincided with the delayed launch, making it murkier as to what was causing the results we were seeing.

So what have we learned?

  • Customer experience/product management must be considered at a strategic level with clear lines of responsibility across all business functions.

In practice, using the platform as a work in progress has been tough. Initial bugs were fixed, yet usability fixes fell into discussion loops and lower-priority queues (note: as with our grey text!). Internal team changes made it difficult to uphold responsibility for customer experience. The process highlighted the real need for customer champions and a joined-up approach across functions.

  • Phase 2 never happens

Ambition must be reasonable. Make sure each project stage results in a well-thought-through end point for the customer, not what is convenient for the developers, project managers or senior management.

In our case, the fact that many of the cooler aspects of the original design were left on the drawing board – culled into a ‘phase 2’ list – demonstrates over-ambition; we are the only people who know how great the site was planned to be. If scope is cut mid-project, customers know nothing of this (or how good things might be later); they just see the results of what has been deployed today, while the business works on what is next.

  • Showcase and test your changes with actual customers to iron out this stuff before you launch

I’m not even going to comment on this one. We all know it. We don’t always do it. We should.

  • Keep a handle on project scope and how you are going to measure how well it is working

The inevitable late delivery of a site redesign will coincide with changes to the business structure; it’s impossible for it not to. In our case, our subscription business was changing, so those changes were unavoidably launched at the same time. Allowing changes outside the project scope that also affected customer experience made understanding cause and effect even harder.

Approaching post-site reviews

We turned to customer experience analysis to shed some light on some of these issues.

We have continued working with Foresee, using the same triggers and question sets, in order to draw a comparison with our pre-launch results.

The results were exceptionally useful, showing that we still scored well against market benchmarks, yet saw no specific increase in customer satisfaction scores. In fact, across some indicators we were scoring worse.

Look and feel was still the number one aspect to review, which is frustrating for our development team after they spent so long putting the existing iteration together.

Alongside this analysis, individual customer journey analysis was undertaken with WhatUsersDo. This reflected the issues identified through Foresee: the interface itself was problematic and specific issues dogged most of the tested journeys.

We’ll look to pick apart these insights, comparing what we hoped would happen with what actually happened, in further blog posts. However, an example of the confusion is detailed here:

Customer Experience – rocket science?

Maybe not, but it is very hard to get right. As we have demonstrated, the Econsultancy site redesign was an ambitious endeavour, and it was only ever preparation for a much larger project. It’s keeping customer experience front of mind across the business that has proven the toughest challenge over time.

The qualitative element of analytics has proven invaluable, as the actual site metrics have not massively changed. We have seen less-engaged site visits (fewer page views per visit, less time spent on the site, etc.), and a lot of these aspects could be explained away through the differences in journey between the new site and the old.

It’s only through having qualitative data that we can begin to understand the experiential aspects that have and have not worked, and most importantly why.

We’d be very interested in hearing your comments on this, after all it’s your experience that counts…