CEO at Econsultancy
06 April 2009 14:22pm
Have a look at this graph showing Econsultancy's referrals from Google natural search over the last few months.
You can see that at the beginning we were getting around 5,000 referrals a day. This nosedived towards the end of December 2008 before picking up again 10 weeks later in late Feb 2009. Last week you can see the referrals falling off a cliff again...
The first drop is due to our site migration and its SEO impact, which is extensively documented. This, whilst frustrating, was at least understandable given the size of the changes we made.
The latest nosedive is more of a puzzle to us as we're not sure what we've done to cause it. So a prize to whoever can help us figure it out!
(It doesn't appear to be a penalty, as we're still ranking for a search on our own name etc. - just a massive 'downgrading' of our domain.)
The only 2 things we can think of at the moment are:
1. Adding 'no follow' to our blog comments and forum posts
In accordance with what we felt was Google's view on best practice, we recently changed it so that ALL blog comments and forum posts on our site (including the archives) now have the 'nofollow' attribute on links. This is to help deter link spam posted for SEO reasons and, as this is not Econsultancy "editorial content", perhaps we shouldn't be passing PageRank. But many of these links are also internal to our site. Could the big change in PageRank/linkage data somehow have freaked Google out?
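For anyone curious what that change amounts to in practice, here is a minimal sketch of a filter that adds rel="nofollow" to links in user-generated HTML. This is not Econsultancy's actual implementation - just an illustration, and a production version should use a proper HTML parser rather than a regex:

```python
import re

# Match the attribute portion of an opening anchor tag.
ANCHOR_RE = re.compile(r'<a\s([^>]*?)>', re.IGNORECASE)

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to anchor tags in user-generated content.
    Tags that already carry a rel attribute are left untouched."""
    def fix(match):
        attrs = match.group(1)
        if 'rel=' in attrs.lower():
            return match.group(0)  # don't clobber an existing rel attribute
        return '<a rel="nofollow" ' + attrs + '>'
    return ANCHOR_RE.sub(fix, html)

comment = '<p>Nice post! See <a href="http://example.com">my site</a>.</p>'
print(add_nofollow(comment))
# -> <p>Nice post! See <a rel="nofollow" href="http://example.com">my site</a>.</p>
```

Applied site-wide, including to the archives, this is exactly the kind of sudden change to internal and outbound link weighting the comment above describes.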
2. Geotargeted page content is being seen as duplicate content?
We've also recently started to personalise the content of pages further according to the user's geography. So, on the same URL, you might see something that is similar to, but different from, what a user somewhere else in the world sees. For US users we use US spelling and pricing in dollars, for example. Could Google somehow see this as duplicate content?
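The geo-personalisation described above boils down to one URL rendering locale-dependent fragments. A minimal sketch, where the locale table, country codes, and prices are all made up for illustration:

```python
# Illustrative only: same URL, but the rendered fragment varies by the
# visitor's country. The locale data and prices below are hypothetical.
LOCALES = {
    "US": {"currency": "USD", "price": "$495"},
    "GB": {"currency": "GBP", "price": "£295"},
}
DEFAULT = LOCALES["GB"]

def render_price_line(country_code: str) -> str:
    """Return the pricing fragment for a visitor's country, with a fallback."""
    locale = LOCALES.get(country_code, DEFAULT)
    return f"Buy the report for {locale['price']} ({locale['currency']})"

print(render_price_line("US"))  # -> Buy the report for $495 (USD)
print(render_price_line("FR"))  # unknown country falls back to the default
```

The duplicate-content worry follows directly: a crawler fetching the same URL from a US datacenter and a European one would receive near-identical pages that differ only in these fragments.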
Feel free to post any thoughts as replies to this. As ever, we're keen for others to learn from our mistakes!
Head of SEO & Social Media at Make It Rain Ltd
06 April 2009 15:34pm
Your 2nd theory might be the answer.
What does the Googlebot see? If it's different from what users see then they might have, however harshly, penalised you for suspected cloaking. That would be very harsh, but Google only thinks in binary after all. I would have thought you'd be able to appeal this and get a resolution very quickly, as there's nothing black hat going on.
06 April 2009 16:34pm
The Googlebot sees exactly the same as users see - but different users see different things. This is pretty common, e.g. showing things in different languages based on geography.
However, I don't know 'where' the Googlebot 'is'? In the US, in the UK, both? Presumably both. Just possibly it's getting confused by seeing two versions of the page which are very similar and freaking out as a result?
But we're certainly not cloaking or showing anything to Google that's any different from what users see. In fact, we're making the user experience much more relevant and better which Google encourages...?
When you say 'appeal' - who to, and how? Over 3 years ago I suggested that Google should provide an appeals mechanism, but I don't think it exists. There is the 'reconsideration request' in Webmaster Tools, but there is no guarantee of a reply and no timescale for when anything will be done, so it is completely opaque.
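On the "what does Googlebot see, and where is it crawling from" question: one concrete check you can run against your server logs is Google's own documented two-step verification - reverse-DNS the requesting IP, confirm the hostname is under googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A sketch, with the DNS lookups injectable so the logic can be exercised without network access:

```python
import socket

def is_genuine_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Google's documented check for verifying a crawler log entry:
    1. reverse-DNS the IP; the host must end in googlebot.com or google.com;
    2. forward-resolve that host; it must map back to the original IP.
    Lookup functions default to real DNS but can be stubbed for testing."""
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or (lambda host: socket.gethostbyname_ex(host)[2])
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    return ip in forward_lookup(host)

# Stubbed example (no real DNS): a consistent googlebot.com record verifies.
print(is_genuine_googlebot(
    "66.249.66.1",
    reverse_lookup=lambda ip: "crawl-66-249-66-1.googlebot.com",
    forward_lookup=lambda host: ["66.249.66.1"],
))  # -> True
```

This won't tell you which datacenter a crawl came from, but it does let you separate genuine Googlebot requests from spoofed user-agents before comparing what each was served.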
web coordinator at s&p
06 April 2009 17:00pm
I have seen a drop in referrals from natural search from Google since predictive search terms were implemented last week.
06 April 2009 17:03pm
Without having access to your data or indeed working at Google I can't be sure that this is the problem but it might be. Would seem very harsh and as you say it's providing increased relevancy, Google's mantra.
You could try twitter.com/mattcutts . I've seen him reply to webmaster questions on Twitter before, so he might help you out. If you don't get a reply, you could try rolling back the changes and see if that helps when you get recrawled - differential diagnosis, House style.
06 April 2009 17:04pm
@ Katie - that's interesting. I don't *think* that is what's happening for us as we have noticed our actual rankings (and hence referrals) drop quite suddenly irrespective of what is being searched for. But I can see that predictive search could encourage people to search on the same head end terms (and so 'long tail' players could suffer).
06 April 2009 17:08pm
@ Luke. Yes, we've tried the Matt Cutts on Twitter route. Bet everyone does!
I'm loath to start changing things which we fundamentally don't think are wrong in the first place - and that goes against what Google would say, I'm sure: rather than try to play the SEO game, you should focus on your users and the customer experience. That doesn't quite cut it, of course, when your search traffic dries up!
Also, all the suggestions so far are, at best, informed guesswork. I'd rather make changes based on fact!
Technical Project Manager (MBA, MBCS, CITP, CEng) at Naxtech.com
06 April 2009 18:10pm
It's not easy to say by looking just at the surface of the issue. If we had your web server logs and spent some time analysing your site and Google Analytics then I could make a more informed recommendation.
You could potentially look into duplication of content: look at the use of identical titles across the site, as well as what is served to US and UK visitors, if you serve the final page/code directly.
Remember that different pages of the site might have been read/scanned by different Google servers, in different datacenters. Depending on which one you "ask" you might get a different answer. So, it may not even be your fault (not yet anyway!).
Without knowing more about the site it's hard to make an informed suggestion, so I'd say give it a bit more time and see if things get back to normal again.
I know this is not a solution per se, and it's hard to recommend something without spending some time analysing things, but I hope the above helps.
E-Business Consultant at Dan Barker
07 April 2009 08:35am
Morning, Ashley - how are you?
Is this across all terms, or some specific 'big' terms for which you've slipped in rankings?
Does it affect traffic from all countries? E.g. you can set up an 'advanced segment' for the USA and another for the UK; then in your Google traffic report you can see how those segments shift, rather than just looking at the top line.
Another odd thought - do you have partnerships with any universities? (I know, for example, that the IDM does.) I'd guess most students are now on Easter break, which could affect that.
I hope that helps - looking forward to further info.
07 April 2009 09:32am
@Dan - it's across pretty much all terms, though ironically we're still top of Google for a search on SEO best practice - gotta love that!
Terms where we've gone from no.1 to pretty much nowhere are longer tail ones like 'digital marketing jobs', 'online pr and social media', 'email marketing training'.
All countries are affected similarly.
I don't think the uni thing is relevant - it's not the search volume/demand that's the problem; we've just dropped suddenly in the rankings and are therefore getting way fewer referrals.