
Every day, more and more web designers and developers are taking advantage of new technologies and tools that enable more enjoyable user experiences. From jQuery to Flex, designers and developers have no shortage of options for building more intuitive, responsive and efficient websites.

But many, if not most, of these options come at a price: an SEO hit. That's because it's difficult for search engines to make sense of content that is controlled by these technologies and tools.

AJAX, or asynchronous JavaScript and XML, is a favorite collection of client-side technologies that many designers and developers use to build interactive websites. But AJAX-based websites aren't very popular with search engines, because content controlled by AJAX is essentially invisible to them most of the time.

Google, however, thinks that "making this content available for crawling and indexing could significantly improve the web" and is looking to change that. The search giant has released a proposal for crawling AJAX-based websites. For non-techies interested in the 30,000-foot overview, it would work like this:

  • Some URL trickery would enable Google to identify AJAX pages that should be crawled.
  • Google would request said AJAX pages, and the web server would use a special browser (called a headless browser) installed server-side to render the page as a real user would see it, returning a "snapshot" to Google (a rough sketch of this flow follows the list).
  • The "snapshot" would be crawled by Google and the page could be included in Google's index.

It's a simple enough solution that seems entirely logical and viable. Unfortunately, the requirement that websites provide the headless browser is a big drawback that I believe will significantly limit how useful this proposal is. Google's desire to have websites provide the headless browser is understandable (it shifts some of the crawling process's resource burden from its own crawler to the website's server). But the reality is that not everyone who uses AJAX has the ability to install a headless browser on their server, even if they know about the possible SEO benefits available to them by implementing Google's final proposal.

That said, Google's proposal is a step forward. Right now, websites that make extensive use of AJAX and similar technologies are largely being left out in the cold, so just about anything is an improvement, and it's good to see Google making an effort to crawl the growing portions of the web that it currently can't reach.

Photo credit: adactio via Flickr.

Published 8 October, 2009 by Patricio Robles

Patricio Robles is a tech reporter at Econsultancy. Follow him on Twitter.

Comments (6)

Vincent Roman

AJAX requests and functionality do not need to automatically come at a hit to SEO. If you plan, code, and roll out your sites properly, the AJAX content can quite easily be indexed as if it has its own page. I would hate to say that it comes down to sloppy work if it does. It's a bit like worrying about Flash and SEO when you can build a properly SEO-ed HTML version of the same thing!
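One common way to achieve what Vincent describes is progressive enhancement (sometimes called "Hijax"): every piece of AJAX-loaded content also lives at a plain, crawlable URL, and JavaScript merely intercepts clicks to load it asynchronously. The rough browser-side sketch below uses today's browser APIs rather than the jQuery of the era, and the class name .ajax-nav and the #content container are hypothetical names chosen for illustration.

```typescript
// Sketch of the "give AJAX content its own page" approach: real, crawlable hrefs,
// with JavaScript layered on top for visitors whose browsers can run it.
document.querySelectorAll<HTMLAnchorElement>("a.ajax-nav").forEach((link) => {
  link.addEventListener("click", async (event) => {
    // Crawlers and visitors without JavaScript simply follow the href,
    // so the same content is reachable at an ordinary, indexable URL.
    event.preventDefault();

    const response = await fetch(link.href, {
      headers: { "X-Requested-With": "XMLHttpRequest" },
    });
    const html = await response.text();

    const container = document.querySelector("#content");
    if (container) {
      container.innerHTML = html; // swap content in place for scripted visitors
      history.pushState(null, "", link.href); // keep the address bar on the real URL
    }
  });
});
```

The assumption here is that the server returns a full HTML page for a normal request to the same URL and just the content fragment when the X-Requested-With header is present; that server-side detail is not shown.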

about 7 years ago

Patricio Robles, Tech Reporter at Econsultancy

Vincent,

A big problem is that most designers/developers are not SEOs and many business owners who opt to use tools and technologies like AJAX and Flash are not going to invest in workarounds for SEO.

Maybe that's not smart, and in some cases it might even be stupid, but I think Google recognizes that most people aren't going out of their way to make content produced on the client side accessible to crawlers, and given the increasing amount of this content it's missing out on, it has to try a different approach.

about 7 years ago

kiren

So if Google crawls this AJAX, will all the AJAX entries become keywords for that page?

about 7 years ago

Vincent Roman

Patricio

Somehow, alas, I don't think that if they have the time or money to develop a site properly they will bother making sure the site's AJAX-ed content can be indexed properly.

I agree Google needs to make efforts to get content better indexed to improve its search result offering, but beyond that most web devs or clients won't pay much lip service to it if they have to jump through any hoops at all.

Call me a cynic, but I think many would agree!

about 7 years ago

SeoNext

Heh, hence the "I think". I don't know exactly what combination of characters the Google bot has problems with, but it does appear to have some ;)

Like I said, I always use a rewrite rule to get around this so the page URLs are SEO-friendly, except on things we don't want indexed, so it's not something I've looked into too heavily. There's no real reason not to do this to prevent any possible problem if you want your SEO to be good on a dynamically produced site.

about 7 years ago

lee

My site got indexed but only one page. I have now been waiting for weeks to update my website keywords and other pages. It takes forever for Google to do anything to your site if it has no PR. www.workonlineathome.co.uk

about 6 years ago
