
There is at least one constant in the everflux of the Search landscape: the importance of meta tags ebbs and flows. Is 2007 a year in which the value of meta tags increases again?

The halcyon days when some light on-page optimisation combined with clever use of meta keywords was enough for Search success have gone. Given the competitive organic search market we have now, it seems unlikely that we will see those days again.

However, with the brand new "noydir" meta tag from Yahoo, alongside "noodp", "y_key" and Google's "verify-v1", we have begun the year with meta tags having a very real impact on search.
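For reference, "y_key" and "verify-v1" are site verification tags that sit in the <head> of a page and carry a key issued by the respective search engine's webmaster tools. They typically take a form along these lines; the content values below are placeholders for illustration, not real keys:

  <!-- Yahoo site verification (placeholder key) -->
  <meta name="y_key" content="0123456789abcdef">
  <!-- Google site verification (placeholder key) -->
  <meta name="verify-v1" content="PLACEHOLDER-VERIFICATION-KEY=">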

NoYDir and NoODP are cousins. Each tag allows you to opt out of the search engines' current tendency to let a significant directory, such as the Open Directory Project, describe your site. The NoYDir tag, from Yahoo, is a way to pull your site away from the influence of Yahoo's own directory.
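In practice both directives live in the same Robots meta tag, so a page opting out of both directory descriptions might carry something along these lines:

  <!-- Ask search engines not to use ODP or Yahoo Directory descriptions -->
  <meta name="robots" content="noodp,noydir">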

Why could a directory description be a problem? If, back in 2002, someone somehow described your chain of luxury retail outlets across the United Kingdom as "an expensive clothes shop in Soho which imports leather handbags from Italy" in the Open Directory Project, then you would certainly be advised to implement NoODP. If you do not, or cannot, add NoODP to your robots meta tag then Google could describe your site in those very same words when a would-be customer searched on your brand. Ouch.

I have heard other "experts" recommend that site owners remove the Robots meta tag from their site. They argue that it is a waste of space and unnecessary. This is a mistake.

Currently the search engines assume they are allowed to visit your site. They also assume that they are allowed to use your content to build their search indexes. If you do not want search engines to do either of these things, you can use the Robots meta tag to say so.
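As a sketch, a page you would rather keep out of the results entirely could carry:

  <!-- Ask search engines not to index this page or follow its links -->
  <meta name="robots" content="noindex,nofollow">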

In the SME market, where web site changes are easy, fast and inexpensive, it is often a trivial matter to add or remove meta tags on pages. This is perhaps why some SEOs view the Robots meta tag as an easy-come, easy-go luxury. This is not the case for big brands with complex sites, often with expensive Content Management Systems, compliance issues and a workflow protocol.

When I am working with a client putting in a CMS or rebuilding their site, I always make sure they are aware of the Robots meta tag. Can you imagine spending tens of thousands on a CMS only to discover later that you cannot block search engines from a page intended only as an affiliate landing page because your system does not support the Robots meta tag? (The robots exclusion protocol may be a possible alternative in that example.)
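That alternative is a robots.txt file at the root of the domain, which blocks crawlers without needing any CMS support for meta tags. The path below is purely illustrative:

  # Robots exclusion protocol: keep all crawlers away from the affiliate landing page
  User-agent: *
  Disallow: /affiliate-landing-page.html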

The Robots meta tag could become dramatically more important overnight. The assumption that search engines can enter your site and index your content unless you say otherwise could be challenged.

In Belgium Google lost copyright cases and is no longer allowed to include some news sites in Google News. In the future, a future which might not be all that far away, sites may need to say "index,follow" in their Robots meta tag in order to be included in search engines at all.
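That explicit permission would be the mirror image of the exclusion directives above:

  <!-- Explicitly grant permission to index this page and follow its links -->
  <meta name="robots" content="index,follow">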

The Internet Archive is currently being sued by a website owner who argues that its spiders entered into, and broke, a contract when they indexed her site.

We might see other meta tags becoming more important in 2007 too. Back in 2004, during London's Search Engine Strategies, Matt Cutts announced that Google had looked at the language/country meta tags as one possible way to determine the target audience of a site. At that time Google's analysis concluded that the language meta tags were used too infrequently and were too prone to being incorrect.
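These tags were typically written along the following lines; the exact attribute names and values varied from site to site, which was part of the problem Google identified:

  <!-- Declare the language and region the page is written for -->
  <meta http-equiv="content-language" content="en-GB">
  <!-- Geo meta tag using an ISO 3166 country code -->
  <meta name="geo.region" content="GB">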

Three years is a long time on the internet and a very long time in Search. We are due to see better geographical targeting for organic search, and a return to language/country meta tags is a real possibility here.

Andrew Girdwood is head of search at Bigmouthmedia.

Published 26 March, 2007 by Andrew Girdwood
