Google, Yahoo! and Microsoft are teaming up to develop a unified way for webmasters to notify their crawlers of new and updated content.

The three search giants will support a new version of the Sitemaps protocol, which was first released by Google in June last year.

The move will give site owners a single, common system for notifying all three search engines' crawlers, rather than going through a separate process for each.

The three firms said they would work together to develop Sitemaps and publish enhancements on a jointly maintained website.

Danny Sullivan, editor in chief of Search Engine Watch, welcomed the move:

“This is a great development for the whole community and addresses a real need of webmasters in a very convenient fashion.

“I believe it will lead to greater collaboration in the industry for common standards, including those based around robots.txt, a file that gives web crawlers direction when they visit a website.”

The three search engines hope the move will provide their users with fresher results, while also cutting costs for sites with a large number of web pages and dynamic content.

Through the system, webmasters make an XML file available on their sites, which lists all of their URLs along with metadata that tells search engines’ spiders which pages to crawl.
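In practice, that XML file follows the Sitemaps schema published at sitemaps.org. A minimal sketch of one (the domain, date and values here are illustrative placeholders, not from the article) looks roughly like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child: the page's full URL -->
    <loc>http://www.example.com/</loc>
    <!-- optional metadata hints for crawlers -->
    <lastmod>2006-11-16</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- one <url> entry per page the site wants crawled -->
</urlset>
```

The optional tags are hints rather than commands: `lastmod` and `changefreq` help a crawler decide when to revisit a page, while `priority` ranks that page's importance relative to the site's other URLs.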

As a result, the search engines say, new content can appear in their indexes more rapidly, helping site owners drive traffic.


Published 16 November, 2006 by Richard Maven
