AJAX Friendly SEO Techniques

AJAX (Asynchronous JavaScript and XML) is a technique used in almost 80% of modern websites. It combines a couple of older technologies. JavaScript is used to make asynchronous calls from the web page sitting in your browser back to the web server it was served from. "Asynchronous" means the call is made in the background, so the user can keep using the site while the requested data loads from the web server. The server transmits the data back to the client as XML, a format that structures a long string of text so it can be broken apart and read in a meaningful way.
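As a rough sketch of that pattern, here is a classic AJAX call in the browser (the URL and callback here are placeholders for illustration, not real endpoints): the request is opened asynchronously, and a callback fires when the XML response arrives.

```javascript
// Minimal classic AJAX sketch: fetch XML in the background, then hand the
// parsed response to a callback so only one section of the page is updated.
function loadSection(url, onXml) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true); // third argument true = asynchronous
  xhr.onreadystatechange = function () {
    // readyState 4 means the request is complete; 200 means success.
    if (xhr.readyState === 4 && xhr.status === 200) {
      // responseXML holds the XML document sent back by the server.
      onXml(xhr.responseXML);
    }
  };
  xhr.send();
}
```

Because the callback runs whenever the response comes back, the rest of the page stays interactive while the data is in flight.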

The web page then reads the XML and populates only the section of the page the data was requested for, instead of reloading the entire page. This makes AJAX great for the user experience, because it gets rid of the "click and wait" game the user usually has to play with web pages. It helps web pages behave more like desktop applications and provides a much richer user experience. Traditionally, though, AJAX has not been very SEO friendly. Below are some challenges AJAX presents for SEO and methods to work around them.

Probably the biggest problem AJAX presents for SEO is that most AJAX-enabled sites don't expose their content at multiple static URLs. Instead, AJAX lets a site load dynamic content into the same page without ever loading another page. This is bad for SEO because search engine bots, spiders, and crawlers have nothing to read: the text is never all loaded onto the page at once. Often the text is loaded from a database or an XML file that a search engine crawler can't read.

This can be avoided by adding all of the text to the page at once, if there isn't too much of it, in hidden containers, and then showing and hiding it at the right time with client-side JavaScript instead of AJAX calls. Save the AJAX calls for when the client actually needs to interact with the server. You can also build static pages containing the text that the AJAX loads dynamically. Even if users never reach these pages through the AJAX interface, a crawler can still index them if you list them in a sitemap that can also be navigated by browsers that don't support JavaScript. Each page will have its own URL the crawler can follow.
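A minimal sketch of the hidden-container approach (the class name and element IDs are made up for illustration): all of the text is already present in the HTML when the page loads, so crawlers can read every word, and plain client-side JavaScript just toggles which container is visible.

```javascript
// All content already sits in the page in <div class="section"> containers,
// so a crawler sees the full text. Clicking a nav link only toggles
// visibility; no AJAX round-trip is needed to show the content.
function showSection(id) {
  var sections = document.querySelectorAll('.section');
  for (var i = 0; i < sections.length; i++) {
    // Show only the container the user asked for; hide the rest.
    sections[i].style.display = (sections[i].id === id) ? 'block' : 'none';
  }
}
```

Wire each menu link's click handler to `showSection` and the page behaves like an AJAX site to the user, while the markup stays fully crawlable.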

Another way to maximize SEO on AJAX-enabled pages is to serve alternative content. If you have data or links that are written dynamically through AJAX calls, try to also provide them in alternative ways that crawlers can read. If you have an AJAX-enabled menu structure at the top of your page, try including a second menu at the bottom. Construct that footer menu in plain HTML, linking to static pages that contain the text loaded dynamically by the AJAX. This gives the spider URLs and text to crawl, still gives the user the rich experience AJAX can provide, and gives browsers that have JavaScript disabled or don't support it a way to navigate.
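A sketch of what that crawlable footer menu might look like (the page URLs are placeholders): plain anchor tags pointing at static pages, with no JavaScript required to follow them.

```html
<!-- Plain-HTML footer menu: every link is a static URL a crawler can
     follow, mirroring the sections the AJAX menu at the top of the
     page loads dynamically. -->
<div id="footer-menu">
  <a href="/products.html">Products</a>
  <a href="/services.html">Services</a>
  <a href="/about.html">About Us</a>
  <a href="/contact.html">Contact</a>
</div>
```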

The main thing to remember is that web crawlers need to be able to traverse the links in your site and read the text on your pages. If they can't do that, your pages won't be SEO friendly no matter what technology you're using. If you use (or are thinking about using) AJAX and want some good old-fashioned SEO advice, inquire about our SEO Audit Services at Volume Nine.