Often marketers complain that ‘IT’ are too inflexible, too difficult to work with, and don’t understand marketing. But marketers too can be guilty of not understanding, or appreciating, what they deem ‘IT’.
A website’s technical infrastructure is a case in point. I believe it is a marketing issue, and a marketing responsibility, not an ‘IT’ one.
Are you long enough in the internet tooth to remember when Google first came on the scene? I remember the day I switched from Alta Vista as my default search engine to using Google.
And I remember why I switched.
Was it for the much-talked-about relevancy and accuracy of search results?
No. It was for two reasons. Firstly, the interface: a simple search box, with no ads or other clutter on the page (unlike Alta Vista). Secondly, the speed of the search results.
To this day Google still display how long a search query took in the search results. Yahoo! now do the same, and so do MSN.
(Indeed, most people think that Google’s search algorithm is the biggest barrier to would-be competitors. I don’t think so. I think their sheer computing power, and the speed at which they can process the huge volumes of data they deal with, is their real USP. We’ve all heard of Google search engineers, even met them. But how many Google network engineers do you know? I imagine they are VERY closely guarded secrets.)
Perhaps the difference in experience is somewhat less now that we’re mostly broadband-enabled, but back then speed was a real differentiator. And it still is – certainly for narrowband users, but even for us impatient web surfers with attention spans of milliseconds.
I’m sure we can agree then that something as apparently basic as site speed is hugely important to the user experience. But can we quantify this in any way? Can we put a ‘dollar value’ to it?
Earlier this year, we began noticing on E-consultancy.com that the site was grinding to a near halt at times of peak load, e.g. when Googlebot was spidering our site like a lunatic, or when we’d just sent an e-mail to 30,000 users. So we thought it was probably time we improved our load balancing, server configuration, caching and so on to improve performance.
Clearly this had a cost attached to it. So we wanted to understand the before-and-after return on investment of investing in ‘site speed’.
We used SiteConfidence (http://www.siteconfidence.co.uk/) as a tool to benchmark our site speeds (as experienced by users) before and after the infrastructure upgrades. Interestingly, at the time, I remember talking to the guys there and they were explaining how they were struggling to sell their service to marketing or commercial people (who they felt it was most applicable to) and were always sent off to ‘IT’ because it was an ‘IT issue’.
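SiteConfidence’s methodology is their own, but the underlying idea — timing how long a page takes to arrive as a user would experience it — can be illustrated with a rough client-side sketch. The function name and structure below are mine, not theirs:

```python
import time

def average_latency(fetch, attempts=3):
    """Call `fetch()` several times and return the mean duration in seconds.

    `fetch` is any zero-argument callable that performs the request,
    e.g. lambda: urllib.request.urlopen("http://www.example.com").read()
    (a hypothetical URL -- substitute the page you want to benchmark).
    """
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        fetch()  # perform the full request, including download time
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)
```

Run it against the same pages before and after an infrastructure change and you have a crude before/after comparison; a real benchmarking service would also measure from multiple locations and connection speeds.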
So here are some really simple metrics we looked at:
1. Number of new visitors referred to the site from Google before and after the changeover
2. Our site bounce rate (% of visits with only 1 page view)
3. The average number of page views / session
Why should these metrics (clearly business / marketing metrics) change as a result of the site’s speed? Well, because Google’s bot indexed deeper and wider and more frequently with our higher-performing site. Which meant in turn we got more pages indexed, were found on more searches, and so got more new traffic. And, for human users, the improved speed meant our bounce rate dropped and the average page views / session increased.
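The two user-behaviour metrics above are straightforward to compute from per-session page-view counts. A minimal Python sketch (the sample numbers are invented for illustration, not our actual data):

```python
def bounce_rate(sessions):
    """% of visits with only one page view."""
    bounces = sum(1 for views in sessions if views == 1)
    return 100.0 * bounces / len(sessions)

def avg_page_views(sessions):
    """Average number of page views per session."""
    return sum(sessions) / len(sessions)

# Hypothetical page-view counts for seven visits:
sessions = [1, 4, 1, 2, 7, 1, 3]
print(bounce_rate(sessions))    # 3 of 7 visits bounced, i.e. about 42.9%
print(avg_page_views(sessions)) # 19 page views over 7 visits, about 2.7
```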
Taking these factors together, our technical upgrade resulted in an almost 20% increase in page views and a 5% increase in subscriptions. It easily paid for itself.
So perhaps marketing and commercial people should take a closer look at their ‘IT’ set up?
Imagine going to the Ad Sales Director of a large online publisher who was selling out of inventory (a few of them around again these days)… you could say: “Do you want a load-balanced environment with edge serving and improved caching to improve site speed?” Or you could say: “Do you fancy 20% more inventory to sell?”
Without wanting to be accused of pointing out the failings of others, particularly anyone who might be considered a competitor of ours, take a look at Netimperative.com’s site traffic figures (I assume they realise these are public?): they appear to have fallen dramatically over the year. As a user of their site, I know this is nothing to do with the quality of the content and everything to do with deteriorating site speed. If that graph isn’t a business / marketing issue then I don’t know what is.