There were some very good discussions late last year about geo-targeting neutral domains, focused on pointing Google Webmaster Tools at multiple XML sitemaps and country-specific folders.

The next problem is how to target multiple languages in the same country for a multi-lingual website. I think I have found the answer...

For those unaware, the consensus is to get a neutral domain like a .com, set up country-specific folders and create a separate country-specific XML sitemap for each folder. So a site targeting Ireland and the UK would have a folder and sitemap for each.

You then go into the Webmaster Tools portal, set up each folder as a new website, add geo-targeting, then point the tools at the sitemap. The consensus is that it works with higher indexation though at times a drop in rank. Splitting link equity across the folders from the top of the domain is the probable cause and this can be managed.
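As a sketch of this set-up (the domain and folder names here are invented for illustration), the sitemap index off the root would point at one sitemap per country folder, and each folder would then be verified and geo-targeted as its own site in Webmaster Tools:

```xml
<!-- hypothetical sitemap index at http://example.com/sitemap_index.xml -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/ie/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/uk/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```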

The next problem to be faced is how to target multiple languages in the same country, or across many countries, for an international, multi-lingual website. What if you are targeting Switzerland, with its multiple languages, and also the US with both Spanish and English? To answer the question we need to understand the fundamental trigger that Google uses to categorise a site as being in a particular language.

A friend asked me to look at their multi-lingual website the other day and I think I discovered the exact trigger.

The Google Translate website says: "Google Translate is an automatic translator -- that is, it works ... using state-of-the-art technology instead ... we feed the computer billions of words of text, both monolingual text in the target language, and aligned text consisting of examples of human translations between the languages", and so on.

I have to say, from what I have seen, this is complete rubbish. It feeds a myth that Google looks at the entire website before classifying it as being in Italian, French etc. Up until now, at least, that is what we have assumed. However... it's simpler than that: it has absolutely nothing to do with what a user sees on the web page.

Then there is the domain signal. Having a .uk domain says "in the UK", not "in English". It's even simpler than that: Google determines the language of a website by the Title and/or the Meta-description.

Have a look at these search results. The first thing you notice is that the Titles and Meta-descriptions are in Italian. The second is that Google is offering to translate every page. Click on any one of the search results, however, and you will see the page it wants to translate is in English. You see the same thing with the French-language portion of the website.

I looked at the code for language qualifiers and found none, so the obvious conclusion is that the Title and/or Meta-description help Google determine the language. Google goes no further, ignoring the headings, menus and body copy of the website.

So, if you have an international website targeting multi-lingual countries like Switzerland or Canada, here is your best practice:

  1. A neutral domain, verified in Google Webmaster Tools.
  2. Country-specific folders, set up and targeted by country in Webmaster Tools.
  3. XML sitemaps off each country folder, each referenced in Webmaster Tools.
  4. Language targeting in the title and meta-description through a language subfolder in each country, e.g. the French and German parts of your Swiss targeting: <title>Parlez-vous français | Widgets Inc</title> plus a meta-description in French, and <title>Sprechen Sie Deutsch | Widgets Inc</title> plus a meta-description in Swiss German.
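To make point 4 concrete, here is a minimal sketch of the head of a hypothetical French-language page in the Swiss section (the path, company name and copy are all invented):

```html
<!-- hypothetical page under /ch/fr/ targeting French speakers in Switzerland -->
<head>
  <title>Parlez-vous français | Widgets Inc</title>
  <meta name="description"
        content="Widgets Inc : des widgets de qualité, livrés partout en Suisse.">
</head>
```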

You don't even need to translate the content. That is unless you want them to read it :)

Julian Grainger

Published 16 March, 2010 by Julian Grainger

Julian Grainger is an internet consultant and contributor to Econsultancy.


Comments (15)


Rob Mclaughlin, VP, Digital Analytics at Barclays

Fantastic advice - thanks!

over 8 years ago


Andy Jaeger, Marketing Manager at Nursing and Midwifery Council

Have you got a view on language tags? We're currently building a multi-language section of our website, and wondered if you thought using the lang attribute in the header was useful and whether it would improve search rankings.

over 8 years ago


James Royal-Lawson

I wrote about search engines and how they deal with the language of your site a few weeks ago, although I didn't specifically talk about this issue: namely, how you deal with multiple languages on the same (neutral TLD) site.

Note too that language detection varies a bit between the search engines - for example, Bing makes use of the content-language meta tag (as does Yahoo, kind of).

Andy: the lang attribute should always indicate the default language of the content on the page. This is important from a web standards viewpoint. If you have other languages on the same page, the tags that surround that content should have an "overriding" lang attribute.
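James's two points might be sketched like this in HTML (the languages and text here are just examples):

```html
<html lang="fr"> <!-- default language of the page's content -->
  <head>
    <!-- content-language hint used by Bing (and, kind of, Yahoo) -->
    <meta http-equiv="content-language" content="fr">
  </head>
  <body>
    <p>Bonjour tout le monde.</p>
    <!-- content in another language overrides the page default -->
    <blockquote lang="en">Hello, world.</blockquote>
  </body>
</html>
```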

over 8 years ago


Julian Grainger, Director of Media Strategy at Unique Digital

For Google the language metatag is akin to the expiry metatag: both are often left on the default provided by the content management system, so Google ignores them. I think the key takeaway for many of Google's on-page meta signals is that they only use meta that is actively managed. Yahoo, under the TrustRank system, would require language definition to take place prior to their seed page evaluation. It would have to be in the same language for it to work, so given this is content based it would have to be explicit, like the lang tag, or solely based on content.

over 8 years ago


Catherine Hibbard

Thank you for posting this.  I plan to call this to the attention of my participants in the technical writing training seminars I teach.  This question has been asked numerous times in the past.  Now I will provide a link to your posting for reference.

over 8 years ago



We have been doing this for a long time for client websites, with positive results. One tip I want to add here: get a few links to your language-specific web page from web pages in the same language. Results will be better.

over 8 years ago


Julian Grainger, Director of Media Strategy at Unique Digital

Thank you, Catherine. Laaki, that is great; can you show us some examples and results, please?

over 8 years ago


Alec Campbell

Fascinating research, and I am very surprised that Google would rely on such a simple trigger as the title tag to determine the language of an entire web page. However, this is really only going to be an issue in those instances where a company has translated the pages of a site but forgotten to translate the tags, which would be a massive oversight for any company engaged in a global search engine optimization initiative.

I would disagree somewhat with your conclusion which I think is an overgeneralization. For many companies, having country-specific websites (on country-level TLDs) is a better strategy than one "neutral" domain with country-level folders. Even though Google has recently decreased the importance of the TLD as a factor in ranking, having country-specific sites allows the company to better target and interact with customers in each country in which it has operations or sales. But this does of course depend on whether a company has the resources to manage several versions of their website. 

over 8 years ago



@Julian: I would love to! But as you might know, we are bound by NDAs.

over 8 years ago


Julian Grainger, Director of Media Strategy at Unique Digital

@Alec. My reasoning for a neutral domain comes from the advantages an aging link profile provides. If you have a fresh domain every time you launch into a country, you start from scratch each time. Any linking between sites to pass juice suffers co-citation discounts. Multi-lingual neutral domains also let you target across countries, so you can hit South America (except Brazil) in one stroke through a single Spanish-language folder. Country-specific TLDs are geo-signalled immediately and you lose all these options.

over 8 years ago


Julian Grainger, Director of Media Strategy at Unique Digital

@laaki Your website is full of client testimonials and logos of your customers, but you only seem to have one international customer, Arriva, and there is no evidence on any of your clients' websites that you are implementing this stuff. At the moment you look like blog spam. You only need to point us to the XML sitemap :)

over 8 years ago


Ashley Friedlein, Founder, Econsultancy & President, Centaur Marketing


A great article. Thanks. You’re one step ahead of us (that’s Econsultancy, this site) in as much as languages are *next* on our list after the initial country-level targeting split out.

You say that “The consensus is that it works with higher indexation though at times a drop in rank. Splitting link equity across the folders from the top of the domain is the probable cause and this can be managed.”

I was interested in the throwaway "this can be managed"… how?

As a bit of background…

We’re planning to create, broadly speaking, two views/versions of this site. One that is tailored for a US audience which will live under… and the other which is for a UK audience which will live under… 

This is to make the user experience more relevant for those audiences. For example, we don't offer public training courses in the USA, so there's not much point showing a great big list of 25 courses to that audience.

So we have a number of questions/challenges we're chewing over, as follows, and would love your input (and anyone else's):

1. “Where you are” – users and search engines

When we split out the sites we have the challenge of letting search engines know ‘where they are’ in the site (UK or US version) which you have brought up some interesting ideas on in terms of language (though both of ours are English) and using Webmaster Tools to geo-target by sub-directory. 

This is also a user interface challenge, of course: if someone from the US clicks through to a /uk/ URL how should we treat this? e.g. redirect to the nearest US equivalent? Or just flash up a message (like Amazon do) saying we think they might want to look at the US equivalent? Or just use flag icons or other country denominators to show to the user where they are and allow them to navigate away/around? 

Currently we're thinking the last option, though if the user comes through to an old-style "master level" URL then we'll redirect them to the correct /uk/ or /us/ version depending on where they are in the world (based on IP address).

2. Rankings and Google Juice

We currently have lots of inbound links, no problems with indexation, and good SEO rankings. So our concern is, when we ‘split out’ the sites into URL directory sub-sets, how we ‘(re)assign’ all that Google juice? Currently all the links go to non-country-specific URLs but all the new ones will have /uk/ or /us/ in them. So how best to redirect the spiders to the new URLs? 

It seems that the safest way would be to 301 redirect all our existing inbound links to the equivalent new pages in the *UK* part of the site so that we don’t at least lose our rankings there (as this is our main market). But then the US site will effectively be starting from scratch? But perhaps that’s the only way not to dissipate existing link equity across two sub-directories? 
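For what it's worth, the 301 approach described above could be sketched with Apache mod_rewrite rules like these (the paths are invented for illustration):

```apache
# .htaccess sketch: send old master-level URLs to their new /uk/ equivalents,
# leaving URLs that already carry a country folder untouched
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/(uk|us)/
RewriteRule ^blog/(.*)$ /uk/blog/$1 [R=301,L]
```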

The problem is that we assume the UK site will still outrank the US site, even for US searchers in the US, because Google will see the UK site as way more authoritative. Any ideas what best to do? Do we just accept that US users will still come to the UK site and we need to redirect them via the UI? Obviously we’ll need to focus on building a good link profile for the /us/ version of the site too. 

3. Duplicate content concerns

We have some content pages where the content for the UK and for the US will be significantly different, because what we have to offer to each audience is different. We think this will be fine from Google's point of view, as these are two different URLs with different content.

However, for an "international" flavour blog post the *same content* will appear on two different URLs. So do we have to worry about duplicate content here? And, if so, we're thinking it's less a case of getting *penalised* for doing this and more that one will rank better than the other…?

Currently we’re thinking that where the content is very similar across two URLs then we could use the canonical tag to tell Google which of the URLs to treat as the master one so there is only 1 URL and 1 version of the content in Google’s eyes. However, this has the downside that users might come through to the ‘wrong’ URL for their locale (see Point 2 above). 
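The canonical tag mentioned above is a single line in the head of the duplicate URL; here, with a hypothetical URL, the /us/ copy of a blog post pointing at the /uk/ master:

```html
<!-- placed in the <head> of the /us/ duplicate -->
<link rel="canonical" href="http://example.com/uk/blog/some-international-post">
```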

4. Indexation – with Googlebot as a “US user”

We've never had any problems with getting indexed. However, it seems clear that Googlebot crawls as an anonymous US user. So if we have content (like all our UK training courses) that appears to a UK user but *not* to a US user (i.e. Googlebot), will it get ignored…?

Our thinking is we'll be OK on this one as a) there are plenty of external links to our UK content which Google can follow to index it, b) internal links via sitemaps will also allow this, and c) sitemaps in Webmaster Tools should also ensure Google can see all the UK content.

over 8 years ago


Julian Grainger, Director of Media Strategy at Unique Digital

Hi Ashley. You don't ask easy questions, do you!

First the 'this can be managed' comment is about link consolidation into the sub-directories. There are some caveats:

1. I'm not saying there will not be loss. What I am saying is that it doesn't have to be tragic.

2. I'm assuming a site will have to change URLs by introducing country directories.

3. You are going to have to be ruthless in prioritising what you can lose in the short term.

4. Remember, root domain links will remain where they are, and these are likely to be the biggest chunk of your link graph.

Personally, though, I don't think the solution outlined above is appropriate in your case anyway (more on why later). This is for a later blog post as we are mid-project, but the plan we've put together for one company is:

1. Download and crawl all sub-directory links to gain anchor text and language.

2. Divide these into country, topic and language, and create a plan allocating them to each country directory.

3. Include a neutral, global content directory for multi-country content.

4. Redo the site and 301 redirect all pages according to your planned allocation. Add the XML sitemaps, remembering to include a global sitemap off the root as well as the sitemap index file showing the locations.

5. Geo-redirect users to country-specific content based on their IP address. What we are seeing is multiple countries for the website in the SERPs, with a gradual changeover to the correct country content as the link graph builds for each geography. After a while we will be able to take off the geo-redirect.
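The geo-redirect in step 5 can be sketched in a few lines of Python; `country_for_ip()` is a hypothetical stand-in for a real GeoIP lookup, and the folder map and sample IPs are invented for illustration:

```python
# Sketch of step 5: geo-redirect visitors to country-specific content by IP.
COUNTRY_FOLDERS = {"GB": "/uk/", "US": "/us/", "CH": "/ch/"}

def country_for_ip(ip):
    """Hypothetical lookup -- a real site would query a GeoIP database."""
    sample = {"81.2.69.142": "GB", "8.8.8.8": "US"}
    return sample.get(ip, "US")  # default to US, Googlebot's apparent location

def redirect_target(ip, path):
    """Return the URL a visitor should land on, given their IP and request path."""
    if path.startswith(tuple(COUNTRY_FOLDERS.values())):
        return path  # already on a country folder: don't redirect again
    folder = COUNTRY_FOLDERS.get(country_for_ip(ip), "/us/")
    return folder + path.lstrip("/")

print(redirect_target("81.2.69.142", "/blog/post"))  # -> /uk/blog/post
```

Once the correct country content is ranking in each geography, the redirect simply comes off.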

As I said, I'll blog about this when we have it cracked.

However ... in your situation I would not feel compelled to de-neutralize the domain and target countries using XML sitemaps and sub-directories. My reasoning is this:

1. For me, two countries in one language is precisely why we have neutral domains like .com. There is no penalty for operating one, and as long as the link geo-signals are correct you will rank as well as any country-specific domain.

2. You don't have any indexation problems.

3. You can use US spelling for US content.

4. You can still use country and global content directories on one sitemap to ensure you do not duplicate content.

5. Your issue is garnering geo-signals in your links and that does not require geo-targeted XML sitemaps.

6. If your UK training content is linked, Google will find it.

7. You don't want to lose your other English language viewers.

When I go home to NZ for a holiday I would still like to find this site on the .nz search engine. :)

over 8 years ago


Julian Grainger, Director of Media Strategy at Unique Digital

I see where you are going with this, but I think you might hit a point where splitting link equity across duplicate content means nothing ranks.

In terms of locale, .com is treated as US, so introducing a country domain like .uk will cause more of the pain you have already experienced. You could try local server locations for each country as another signal.

Another option might be to operate like a news service and feed content headlines into local headline pages. That way the content only lives in one location but is accessed from multiple geo-targeted summary locations, so your meta is really directing where the content shows at the summary level. This would mean only the entry pages to each topic (jobs, events, training) would be country specific and could be delivered based on IP. The actual content is delivered based on user journeys. Alongside a user choice to switch countries at the summary level, you ensure the bot will crawl to each country-delivered summary page.

You should cover your bases without increasingly duplicating pages. You could also do this without moving your entire website to sub-directories, thereby maintaining your link graph. And you can do it with one big sitemap in the neutral root and two geo-targeted sitemaps pointing to the summary pages only.

over 8 years ago




