A few weeks have passed since Google’s long-awaited and much-speculated Penguin 2.0 update, and with the dust beginning to settle, we took a look at its impact in the UK.
There’s been no shortage of hype in the run-up to Penguin 2.0, with everybody’s favourite Google spokesperson and distinguished engineer Matt Cutts describing the forthcoming update as ‘a big one’ back in March.
But, so far at least, has it lived up to its billing as Google’s most advanced piece of spam-fighting technology to date?
The winners and losers
To begin to answer that question, a logical starting point is to look at those who have suffered heavily as a result of the update, and those who have enjoyed an altogether more positive experience.
While all of Google’s major updates in recent years have come in a variety of flavours with a whole host of different objectives, the end result has followed a common theme: an abundance of smaller, lesser-known web sites in the ‘losers’ column, and a series of household names and other more significant sites under the ‘winners’ banner.
The immediate aftermath of Penguin 2.0, however, looks decidedly different. First of all, let’s take a look at the 25 biggest winners (see the Appendix for our methodology).
Here, we’re looking at the sites with the biggest positive change in click volume alongside each site’s Alexa Rank; a crude but indicative measure of the stature of the site in question (the lower the Alexa Rank, the more highly trafficked the site).
While some well-known names appear in the list (Teletext Holidays, Holiday Hypermarket, Tesco Bank et al.), it is also punctuated by a series of overtly spammy web sites and illegal movie portals (all of which are highlighted in yellow).
With almost a third of the winning sites falling under such a banner, it gives an unusual complexion to the update’s fallout.
This peculiar outlook is compounded further by an anecdotal check of those who appear to have lost out.
The losers list is not without the spam sites that defined the winners; however, some of the most spectacular fallers are major names. Icelolly.com has been heavily affected, losing its number one berth on major terms like ‘cheap holidays’ and suffering notably for the likes of ‘low cost holidays’.
CompareTheMarket meanwhile lost ground for an array of major insurance and financial terms such as ‘compare car insurance’, ‘travel insurance’ and ‘loans’.
The update’s overriding characteristic is that the big sites appear to have fared worse in many cases than the spammers. This point is underlined by comparing the Alexa Ranks of the biggest winners and losers.
Evidently, there is a huge gulf between winners and losers, with the winners, on average, appearing to be of a significantly lower calibre than the losers. This stands in stark contrast to major Google updates of the past.
So, has Google got it wrong?
Has the update had the reverse effect?
Given our findings, the conclusion that Penguin 2.0 has had the opposite effect to the one Google intended would be an easy one to arrive at. In reality, that would be somewhat wide of the mark.
The overtly spammy, often out-and-out black hat sites that litter the ‘winners’ list are not entirely new. The specific domains in question might well be appearing for the first time, but the tactics used to place them there are not.
Some very high profile search results have been riddled with such sites for months; and whilst some new contenders may have arrived in the midst of Penguin’s rollout, there is nothing to definitively suggest that they arrived as a direct result of any new algorithmic change.
With all that said, whilst Penguin 2.0 might not be guilty of inadvertently assisting out-and-out spam sites, it certainly cannot claim to have dealt with the situation sufficiently.
A search for ’12 month loans’, a term which according to Google carries more than 27,000 monthly searches, sums this up well.
10 results, and not one of them a perceivably legitimate web site. Unfortunately, and to Google’s undoubted frustration, this is not an isolated case, with hundreds of other high-profile SERPs, including ‘online casino’ and the now infamous ‘payday loans’, experiencing varying degrees of the problem.
In effect, whilst the update appears to have tackled a number of legitimate web sites engaging in not-so-legitimate SEO tactics, it has so far been staggeringly unsuccessful in taking down the very worst offenders.
A clue to the reason behind this may be found in the tactics used by such sites. Rather than relying upon the link networks Google has no doubt been cracking down on with months’ worth of data from its link disavow tool, most appear to utilise a variety of CMS exploits to secretly place links on the sites of unsuspecting webmasters.
Does this represent victory for the spammers? Temporarily, perhaps, but history tells us that Google will continue to refine its approach until it more closely reflects its ideal. The spammers may have won the battle, but the war is unlikely to be over.
And so, with further, deeper reaching iterations of the next generation Penguin only to be expected, what can we learn from this initial update? To answer that, let’s put the pure spam to one side and take a look at the true characteristics of Penguin 2.0.
What’s making this Penguin tick?
The presence of spam in Google’s results should not mask the key traits of the update. Understanding these may shed light on the factors likely to be refined and amplified in the coming months. And critically, understanding those will help to establish whether any evasive action is required for your own site before the next version strikes.
First and foremost, taking the sites that (for want of a better description) shouldn’t be ranking prominently in high-profile search results out of the equation, the outlook on the Alexa Rank front changes substantially.
These numbers look far more consistent with major Google updates of the past, with sites of an apparently higher calibre prevailing at the expense of less established domains.
But in terms of more tangible quality signals, what has changed in the Penguin 2.0 shake-up?
While Google’s interpretation of links, and therefore popular link building strategies, may have changed beyond recognition in recent years, links on the whole remain an integral part of the Google algorithm.
According to our Roadmap data, all of the top 10 signals which display the strongest correlation to top rankings are link related.
This latest update has not greatly altered the complexion of key ranking signals; however, the biggest emerging signal comes in the form of ‘linking root domains’, rising 10 positions on our signal significance chart.
The graph below outlines the relationship between a site’s volume of linking root domains and its ranking within the top 20 positions of Google’s search results following the update.
There is a clear correlation between the number of linking root domains pointing at a site and its ranking ability, particularly where the top 10 positions are concerned.
Evidently, there is a marked difference between the average volume of linking root domains across the winners and losers.
Sites fitting the winners’ profile have, on average, well over double the volume of linking root domains of the losers. Although the winners also lay claim to a higher number of links in total, the gulf here is much smaller in percentage terms.
Furthermore, the link-to-unique domain ratio of the winners is much lower than that of the losers. Generally speaking, the lower the ratio the healthier the link profile as it suggests a site’s links are spread more diversely across a greater number of sites.
Along with its rise as the biggest emerging signal of the update, this strongly indicates that Penguin 2.0 weighs unique linking root domains, rather than simply link volume in general.
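To make the ratio concrete, here is a minimal sketch of the calculation, assuming the only inputs are a site’s total link count and its count of unique linking root domains. The function name and all figures are invented for illustration, not drawn from our dataset.

```python
# Hypothetical illustration of the link-to-unique-domain ratio discussed
# above; all figures are invented for the example.

def link_domain_ratio(total_links: int, linking_root_domains: int) -> float:
    """Average number of links per unique linking root domain.

    A lower ratio suggests a site's links are spread more diversely
    across a greater number of sites.
    """
    if linking_root_domains == 0:
        return 0.0
    return total_links / linking_root_domains

# A "winner"-style profile vs a "loser"-style one (invented figures).
winner_ratio = link_domain_ratio(total_links=50_000, linking_root_domains=2_500)
loser_ratio = link_domain_ratio(total_links=45_000, linking_root_domains=1_000)

print(winner_ratio)  # 20.0 links per root domain
print(loser_ratio)   # 45.0 links per root domain
```

Here the second profile has fewer links overall, yet a far less healthy ratio, because they are concentrated on fewer domains.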
With that in mind, does this suggest that a frenzied acquisition of links on unique domains is the way to go? Of course, modern SEO is rarely that simple. To dig a little deeper, let’s take a look at where those links are coming from. In order to do this, links have been classified into three categories based on their Domain Authority.
- Low Authority Domains – DA 0-30
- Moderate Authority Domains – DA 31-50
- High Authority Domains – DA 51-100
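The bracket boundaries above translate into a simple classification. The sketch below is a hypothetical illustration of how a link profile might be bucketed by Domain Authority and summarised as percentages; the DA scores used are invented.

```python
# Bucketing links by Domain Authority using the brackets defined above.
# The example DA scores are invented for illustration.
from collections import Counter

def da_bucket(domain_authority: int) -> str:
    """Classify a linking domain into one of the three DA brackets."""
    if domain_authority <= 30:
        return "low"       # Low Authority Domains - DA 0-30
    if domain_authority <= 50:
        return "moderate"  # Moderate Authority Domains - DA 31-50
    return "high"          # High Authority Domains - DA 51-100

def bucket_percentages(da_scores):
    """Percentage of a site's links falling into each bracket."""
    counts = Counter(da_bucket(da) for da in da_scores)
    total = len(da_scores)
    return {b: 100.0 * counts[b] / total for b in ("low", "moderate", "high")}

# Invented example link profile.
print(bucket_percentages([12, 28, 35, 44, 47, 55, 72, 90]))
# {'low': 25.0, 'moderate': 37.5, 'high': 37.5}
```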
Interestingly, both winners and losers score almost identically in the low authority bracket.
However, the losers are characterised by a fatter middle, with 10% more of their links deriving from moderately authoritative domains.
Winners meanwhile comfortably outscore the losers where links from the highest authority domains are concerned, pointing to such links as a key to unlocking top rankings post-Penguin 2.0.
The graph below shows the average percentage of links possessed by winners and losers in the above classifications.
Anecdotally, there may also be greater significance to the losers’ relative abundance of moderate authority links. For a multitude of reasons, many concerted SEO campaigns do not target links in the lowest authority bracket.
Conversely, links in the highest authority bracket are seldom accessible through the more contrived link acquisition Google is explicitly attempting to stamp out. With that in mind, artificially created links are most likely to occur in that middle section due to a combination of desirability and accessibility, and could well be at the core of the losers’ recent demise.
Much deeper analysis is needed to prove that definitively, but a similar diagnostic assessment may well be advisable if you’ve been actively engaged in more traditional link acquisition initiatives and have suffered a ranking impact in the time since Penguin 2.0 struck.
So, what are we to make of all this?
It’s highly unlikely that Google’s work is finished with Penguin, and whatever it’s had to throw at the SEO community today, more will undoubtedly be on its way in the coming months.
Evidently, the importance of securing links from a diverse range of high quality sites is on the increase. But perhaps even more important than that is the method used to acquire them. Google’s much-peddled mantra of ‘build great content’ has become somewhat clichéd in recent years; even bordering on the embarrassing in times gone by where crude link spamming techniques were abundantly effective.
However, we’re now half-way into 2013 and anyone still living in 2008 (or even 2012 for that matter!) is unlikely to have a prosperous 2014. If ever there was a time that Google’s algorithm reflected its mantra, it’s now. The entirety of this piece of analysis could perhaps be summarised in the following piece of advice:
Engage in a robust, well rounded online marketing strategy, and don’t cut corners.
Note the use of the phrase ‘online marketing’, not ‘SEO’. This is pivotal. The genuine, best practice aspects of traditional SEO, such as sound on-page optimisation and engaging content (note: ‘content’, not ‘copy’), still undoubtedly apply.
However, today’s considerations for natural search success are increasingly transcending those of the past. Adjacent channels which may have historically appeared peripheral to a business’s ambitions in natural search are becoming increasingly integral.
PR and communication, in all its flavours (and I’d be inclined to include social media within that), matters more than ever. In actual fact, it goes even deeper than that. Potentially as deep and as fundamental as the business model.
Consider a scenario where customers cannot be acquired through natural search exposure derived via artificial quality signals (even if we may not be entirely ‘there’ yet) – what reasons would they have to do business with you, and not the competition? Matters as basic as price point, user experience and the very quality of the product (whatever that may be) may all require fresh consideration.
Leading the way in such areas will almost certainly manifest itself positively in terms of organic exposure in the long-run, not least through natural and non-engineered viral coverage and by enhancing the reception of any creative or PR-based promotional activity.
It has always been inevitable that a time would come when SEO’s traditional tricks of the trade would make way for a more varied and less contrived approach. If the SEO landscape of the last 10 years or so can be faulted for anything, it’s nurturing a culture of laziness and complacency.
Big brands haven’t necessarily had to innovate in order to prosper, and as a monopoly in the European search market at least, it’s incumbent upon Google to do something about that. Penguin 2.0 may not have taken us all of the way, but if nothing else, we’re another step down the road.
Appendix: defining winners and losers
Critical to our analysis was defining a list of winners and losers affected by Penguin 2.0. In order to do this, we took the top 100 positions for 500 high-profile terms spanning a range of verticals, including:
- Consumer retail.
- Finance and insurance.
Comparing ranking performance on 21st May (just prior to the update) and 24th May (just after the update), we assessed the estimated change in click volume per domain based on typical position-by-position click-through rates.
A ‘winner’ is defined as any domain which has gained at least 20% of its tracked click volume, and at least 3,000 clicks. Conversely, a ‘loser’ is defined as any domain which has lost at least 20% of its tracked click volume, and at least 3,000 clicks.
Whilst the click volumes and percentages quoted do not represent the performance of the entire domain (as this study is based only on 500 major terms), they nevertheless present an indicative picture of Penguin 2.0’s impact on each site.
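As an illustration of this methodology, the sketch below estimates click volume from rankings using a hypothetical position-CTR curve, then applies the 20% / 3,000-click thresholds. The CTR figures, term volumes, and function names are invented placeholders; only the thresholds come from the definition above.

```python
# Hypothetical position-by-position click-through rates (top 5 shown only;
# the curve and the example data below are invented for illustration).
CTR_BY_POSITION = {1: 0.33, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06}

def estimated_clicks(rankings, search_volumes):
    """Estimate a domain's click volume from its position on each tracked term."""
    clicks = 0.0
    for term, position in rankings.items():
        clicks += CTR_BY_POSITION.get(position, 0.0) * search_volumes[term]
    return clicks

def classify(before, after):
    """Apply the 20% / 3,000-click rule to label a domain."""
    change = after - before
    if before > 0 and change >= 0.2 * before and change >= 3000:
        return "winner"
    if before > 0 and -change >= 0.2 * before and -change >= 3000:
        return "loser"
    return "neither"

volumes = {"cheap holidays": 100_000}
pre = estimated_clicks({"cheap holidays": 1}, volumes)   # ranked 1st pre-update
post = estimated_clicks({"cheap holidays": 4}, volumes)  # ranked 4th post-update
print(classify(pre, post))  # prints "loser"
```

A domain that slips a few positions on a single high-volume term can therefore cross both thresholds at once, which is consistent with the heavy falls seen for major names above.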