Google, Yahoo! and Microsoft are teaming up to develop a unified way for webmasters to update their crawlers.
The three search giants will support a new version of the Sitemaps protocol, which was first released by Google in June last year.
The move will create a common system for site owners to update the three search engines’ crawlers, rather than going through a separate process each time.
The three firms said they would work together to develop Sitemaps and publish enhancements on a jointly maintained website.
Danny Sullivan, editor in chief of Search Engine Watch, welcomed the move:
“This is a great development for the whole community and addresses a real need of webmasters in a very convenient fashion.
“I believe it will lead to greater collaboration in the industry for common standards, including those based around robots.txt, a file that gives web crawlers direction when they visit a website.”
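For context on the robots.txt file Sullivan mentions: it is a plain-text file placed at the root of a website that tells crawlers which paths they may or may not visit. A minimal illustrative example (the paths shown are hypothetical):

```text
# robots.txt — served at https://example.com/robots.txt
User-agent: *          # rules below apply to all crawlers
Disallow: /private/    # do not crawl anything under /private/
Allow: /               # everything else may be crawled
```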
The three search engines hope the move will provide their users with fresher results, while also cutting costs for sites with a large number of web pages and dynamic content.
Through the system, webmasters make an XML file available on their sites, which lists all of their URLs along with metadata that tells search engines’ spiders which pages to crawl.
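The Sitemaps 0.9 protocol defines this as an XML file, typically served from the site root. A minimal sketch of such a file, with a hypothetical URL and optional metadata fields filled in for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required element per URL -->
    <loc>https://example.com/news/article.html</loc>
    <!-- optional hints for crawlers -->
    <lastmod>2006-11-16</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The optional `lastmod`, `changefreq` and `priority` elements are hints rather than commands; crawlers may use them to decide how often to revisit a page.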
As a result, the companies say, new content can appear in their indexes more rapidly, helping site owners drive traffic.