
Google's Panda update was designed to weed out spam and content farm material, thus improving the quality of Google's index and SERPs.

Many sites caught in Panda's grip claim that they were unintended victims of the update, and have sought ways to recover.

Many have been unsuccessful in reestablishing themselves with Google, but according to the Wall Street Journal, one publisher may have found the secret to recovery.

That secret: break up a site zapped by Panda into lots of smaller sites. HubPages, one of the 'content farms' hurt most by Panda, reportedly lost some 50% of its traffic following the update.

But by reorganizing its content into subdomain-based sites, a tip the Wall Street Journal says was provided directly to HubPages by Google's Matt Cutts, the company is seeing "early evidence" that such a move can bear fruit:

The HubPages subdomain testing began in late June and already has shown positive results. Edmondson’s own articles on HubPages, which saw a 50% drop in page views after Google’s Panda updates, have returned to pre-Panda levels in the first three weeks since he activated subdomains for himself and several other authors. The other authors saw significant, if not full, recoveries of Web traffic.

The idea behind breaking up into subdomains is simple enough to understand. By giving each HubPages author his or her own subdomain, it may be possible for Google to evaluate content on a more granular level.

Instead of Google punishing all of the content on the hubpages.com domain because some of it is lacking, subdomains that it determines have quality content can be treated better than subdomains that don't.
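To make the mechanics concrete, here is a minimal, illustrative sketch of the kind of URL rewrite such a reorganization implies. The /author/slug path structure and the author extraction below are assumptions made purely for illustration; HubPages' actual URL scheme isn't documented here.

from urllib.parse import urlsplit, urlunsplit

def to_author_subdomain(url):
    """Rewrite a path-based author URL onto a per-author subdomain.

    Hypothetical example: http://hubpages.com/janedoe/my-article
    becomes http://janedoe.hubpages.com/my-article.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    parts = [p for p in path.split("/") if p]
    if len(parts) < 2:
        return url  # nothing to rewrite
    author, rest = parts[0], "/".join(parts[1:])
    return urlunsplit((scheme, f"{author}.{netloc}", "/" + rest, query, fragment))

print(to_author_subdomain("http://hubpages.com/janedoe/my-article"))
# -> http://janedoe.hubpages.com/my-article

In practice the rewrite would live in the site's routing or web server configuration rather than in a standalone script, but the principle is the same: each author's content gets its own hostname for Google to judge on its own merits.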

Obviously, it would be foolish to assume that subdomains are a Panda panacea. Publishers without quality content probably won't see any benefit from a subdomain reorganization, assuming it offers any real benefit in the first place.

And if everyone rushes to employ the subdomain strategy in an effort to trick Google, it's likely that the strategy won't work.

Beyond this, there are also questions about how Google is working with publishers to deal with the fallout from Panda. The Wall Street Journal article creates the impression that Google engineers, and Matt Cutts himself, have been providing advice to HubPages, which reportedly generates $10m per year in revenue through AdSense.

Many publishers that claim they were unfairly harmed by Panda have not been able to obtain any explanations or assistance from Google. If HubPages' plight is actually receiving special attention, one has to wonder why.

At the end of the day, of course, Panda reminds us that Google isn't perfect. When it makes big changes, the consequences can be equally large for publishers in ways that are both good and bad.

For publishers dependent on Google traffic, trying anything and everything may be the only option available, even if it does cause headaches along the way.


Published 14 July, 2011 by Patricio Robles

Patricio Robles is a tech reporter at Econsultancy. Follow him on Twitter.


Comments (4)


Will Wynne, Chief Executive at Arena Flowers

An interesting post...I'd seen this suggestion but does this approach not simply amount to saying "your site is tarnished, don't bother trying to recover from it, you're better off binning it and starting a new site"? In Google's eyes a subdomain is a separate site, requiring new PR to be built up (excuse the shorthand of "PR" but you see what I mean)...ie none of the link equity of the domain will pass unless you 301 from old site to new site, or link heavily from the main domain...both of which would be clear signals to the Goog that that is what you are trying to do.

In the same way that corporate structures often have multiple separate companies to parcel off risk (ie if one company in the group goes under, it doesn't take the others with it), using subdomains could be a way of siloing ranking risk...the issue to me seems to be that there is then loads of extra work involved in getting all the subdomains to rank...furthermore, if the subdomains are all built in a similar way and using similar templates they could then still be equally affected by Panda anyway...

Anyway, I don't mean to be overly negative on this one...I just think it may be more of a "technically this might be possible but it's not really that practical without loads of effort and even then possibly not" type solution, rather than a sure fire winner.

A Panda solution would be nice. But sadly for now I think the following still applies: http://www.youtube.com/watch?v=X21mJh6j9i4
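On the link equity point Will raises: the usual way to carry it over is a permanent (301) redirect from each old path-based URL to its new subdomain home. Below is a minimal sketch using Python's standard library; the hostname and path layout are assumptions for illustration, and in practice the redirects would normally be configured in the web server itself rather than in application code.

from http.server import BaseHTTPRequestHandler, HTTPServer

MAIN_HOST = "hubpages.com"  # hypothetical old host used for illustration

class RedirectHandler(BaseHTTPRequestHandler):
    """Send permanent (301) redirects from old path-based URLs to their
    per-author subdomain equivalents, so existing links can carry over."""

    def do_GET(self):
        parts = [p for p in self.path.split("/") if p]
        if len(parts) >= 2:
            author, rest = parts[0], "/".join(parts[1:])
            self.send_response(301)  # permanent redirect
            self.send_header("Location", f"http://{author}.{MAIN_HOST}/{rest}")
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()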



Stuart Bradford

This sort of article really concerns me. I completely understand the marketing requirement to get the highest rankings possible. However, as Will mentions above, creating subdomains feels like a lot of work for a quick win. If everyone starts creating subdomains, Google will just start penalising them.

Instead I would focus attention on good content, usability, web accessibility and markup.

With the introduction of HTML5, CSS3 and RDFa I believe we are going to see a real change for the better, i.e. a well-built site with well-written, unique content will rank higher.



Malcolm Slade, SEO Project Manager at Epiphany Search

This recovery that HubPages is seeing: is it happening on the core domain, or are the sub-domains themselves generating the traffic?

The general Panda fix is to "cut the crap" so fragmenting a site could be a way of doing this. "I have identified this area as low value so therefore I am going to move it off the main domain to a sub-domain".

I'm not sure this technique will be applicable to most people, and I would still go down the route of de-indexing your weak content, enriching it, then re-introducing it.

Would be funny to see competitors breaking their sites into 30 sub-domains and disappearing into oblivion though.

Have a good weekend everybody

Malcolm (@seomalc)
SEO Project Manager
Epiphany Solutions Limited



Dave Wieneke, Director of Digital Strategy - ISITE Design at www.UsefulArts.us

Will and Stuart nail it - let me lend my voice as someone who managed millions against one of the most gamed keywords anywhere.

Breaking up sites into hub sites is a slope. Since you immediately lose page count, domain age, and links to the old domain, you're behind competitors. So now one must "play catch up".

Putting those hub sites on totally different domains might boost their interlinking. But wait, Google can see they're sharing the same IP address. So, better to rent lots of cheap servers on different IP blocks so they seem independent.

And, maybe using variations of the owning corporation's name in the DNS record. Or, I've actually seen firms...form new corporations...to help them keep status with Google and AdWords.

See the slope? Stay off the slope.

If Panda burnt you - starting over clean may be the only choice. But, in my experience, architecting domains for Google can put one back on the road to trouble.

Getting burnt twice could be career limiting, so proceed with care.

