Ripples spread through the SEO community with the recent publication of a new Click Through Rate (CTR) study by an agency called Slingshot.

Why, oh why, is this important to anyone, you may ask? Let me enlighten you…

Ever since SEO was coined as a term, brands employing the services of agencies have asked what they are likely to get as a result of deploying it.

To answer this, agencies for the most part used to focus on the first and most logical impact of SEO: rankings.

In fact, in some cases a commercial model was built around it, whereby the agency was only remunerated for positions gained (typically first, second & third page, first to tenth, eleventh to 20th, etc… and in some cases the top three results only).

In those days only a brave (and foolish) man would forecast. Back then it was the wild west of SEO! Search engine algorithms only had a couple of factors influencing them. Some even returned results alphabetically.

One day you could rank top using white text on a white background; the next you'd be pipped by someone keyword stuffing a cloaked and meta-refreshed landing page.

Obviously, for a transactional site, or even just a brochureware site, ranking visibility was only the front line of measuring success, and it became logical to look beyond visibility to the visits that came from it.

However, very few providers shared any relevant data to make forecasting possible. Two notable exceptions were Wordtracker (still available today) and Overture, the latter returning paid search click data from its alliance with Yahoo.

It wasn’t until Google launched its keyword tool and traffic estimator as part of its AdWords offering that we got anywhere near accurate data to use. Even now that we finally had data, the insight we generated from it was still suitably crude.

You could now pick keywords based on their ‘search volume’. However, there was virtually no way to compare that search volume with the actual traffic measured by the site’s analytics.

One word of warning, and a common misapprehension of Google’s keyword tool data: it’s collected from Google’s entire advertising inventory. That means the returned counts include clicks on adverts from all around the web, not just search.

Then along came Google-powered AOL, which in August 2006 accidentally leaked millions of search records, giving those who queried the data a unique insight into click data by rank (original data revealed here & TechCrunch article from the time here).

Obviously, back in 2006 search results looked very different than they do now. It’s also worth pointing out the following: 1) users were much less internet savvy, 2) connections were much slower, and 3) AOL users were cretins!

Multiplying the search volume from Google’s keyword tool by the click through rate from the leaked AOL data suddenly gave us the ability to determine how much traffic you might get from any top ten position in Google.
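To make the arithmetic concrete, here is a minimal sketch of that calculation. The CTR-by-position figures are illustrative placeholders, not values taken from the AOL leak or the Slingshot study:

```python
# Minimal sketch of the volume-x-CTR traffic estimate described above.
# The CTR-by-position figures are illustrative placeholders, not values
# taken from the AOL leak or the Slingshot study.

ASSUMED_CTR_BY_POSITION = {
    1: 0.30, 2: 0.15, 3: 0.10, 4: 0.06, 5: 0.05,
    6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02,
}

def estimate_monthly_visits(search_volume, position):
    """Crude estimate: keyword tool search volume multiplied by the CTR for a rank."""
    return search_volume * ASSUMED_CTR_BY_POSITION.get(position, 0.0)

# A keyword with 10,000 monthly searches ranking third:
print(estimate_monthly_visits(10000, 3))  # -> 1000.0
```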

The trouble was that no one could actually tell whether these figures were wildly out: the only way to get impression data, and so work out a percentage click through rate against the traffic you were actually receiving, was through Pay Per Click, and that was like comparing apples and oranges, with numerous other caveats on top.

Google Webmaster Tools resolves this issue.

So what impact does this CTR study have on me?

Take a look at the data comparison table from the four different ‘studies’ below. The (conditional formatting) colour scheme allows you to easily see where the highest click through rates were (green) and the lowest (red):

Click Through Rate Study Comparison

Quite a change between 2006 and 2011. What’s most amazing is the difference in the total potential click throughs.

For the top three results, according to the Slingshot study, you can now only expect to receive 35.5% of the available traffic. This is as opposed to almost twice that (62.5%) in 2006.

Click Through Rate Study Summary

Equally impressive is the drop in the share of available searches captured by page one results (the top ten). Page one can now only expect to account for 52.4% of available searches, as opposed to a whopping 89.6% in 2006!

Qualifying the study data

Now that we are able to see impressions by keyword in Google Webmaster Tools, we can determine the actual click through rate of your keywords. I’ve always been wisely sceptical, so to check this data out I conducted three real case studies.

We took impression data from Google Webmaster Tools, visit data from Google Analytics and ran our own ranking reports (using SEO Gadget’s Keyword Tool, Trackpal and Raven Tools).
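If you want to replicate that check, the calculation boils down to visits divided by impressions per keyword. Here is a minimal sketch assuming both data sets have been exported to CSV; the file layout and column names are assumptions for illustration, not the tools’ actual export format:

```python
# Minimal sketch of the CTR check described above, assuming keyword-level data
# exported to CSV. The column names ("keyword", "impressions", "visits") are
# assumptions for illustration, not the exact headers the tools produce.
import csv

def ctr_by_keyword(impressions_csv, visits_csv):
    """Join Webmaster Tools impressions with analytics visits and compute CTR per keyword."""
    impressions = {}
    with open(impressions_csv, newline="") as f:
        for row in csv.DictReader(f):
            impressions[row["keyword"]] = int(row["impressions"])

    ctr = {}
    with open(visits_csv, newline="") as f:
        for row in csv.DictReader(f):
            keyword, visits = row["keyword"], int(row["visits"])
            if impressions.get(keyword):
                ctr[keyword] = visits / impressions[keyword]
    return ctr

# e.g. ctr = ctr_by_keyword("gwt_impressions.csv", "analytics_visits.csv")
```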

The data is interesting, but first here’s some relevant background info on the sample set. All figures are from August. Client A is in the travel sector, Client B is an e-commerce retailer, and Client C is in utilities.

There are obviously a number of caveats to take into account when qualifying this data against your client’s, not least the size of the data set and the stability of positional rankings. However, for non-brand terms the click through rates were much higher than expected.

Click Through Rate Case Study

Now comes the interesting bit: forecasting. Knowing this data from our clients, is it wise to anticipate such stonkingly good CTRs from existing visibility?

I’d suggest not if you are embarking on a relationship with a client afresh, but if you are already engaged and your work has achieved stable top three positions that earn these CTRs, then hell yeah, why not?

Forecasting just got easier, but it’s still the acumen and experience to anticipate your own SEO ranking capabilities that nails accuracy.
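That judgement call can be expressed as a simple rule of thumb: apply your own measured CTRs only where you hold stable positions, and fall back to something more conservative elsewhere. A hedged sketch follows; all figures are placeholders, not recommendations from the study:

```python
# Illustrative forecasting rule of thumb, not a method prescribed by the study:
# use a keyword's measured CTR only where its ranking is stable, otherwise fall
# back to a deliberately conservative placeholder figure.

def forecast_visits(search_volume, measured_ctr=None, position_is_stable=False,
                    fallback_ctr=0.05):
    """Forecast monthly visits for a single keyword."""
    if measured_ctr is not None and position_is_stable:
        return search_volume * measured_ctr
    return search_volume * fallback_ctr

# A stable top-three term with a measured 28% CTR and 5,000 monthly searches:
print(forecast_visits(5000, measured_ctr=0.28, position_is_stable=True))  # -> 1400.0
# A new, unproven term falls back to the conservative assumption:
print(forecast_visits(5000))  # -> 250.0
```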

You can download the Google CTR study on the Slingshot website.