Google’s robots take on AJAX

Google has put a proposal to the Web community to improve the indexing of pages that use AJAX, which until now have been virtually invisible to its crawlers.

When it comes to indexing a website, Flash is not the only problem: Google’s robots are no fonder of JavaScript or of AJAX (Asynchronous JavaScript and XML).

A boon for some, as Jean-Michel De Sousa, an SEO expert at Sartepenso, explains: "Some spammers use this blind spot to hide certain parts of their sites" and thus pass a site off as something it is not.

But a headache for most, as Olivier Duffez, a consultant at Webrankexpert, points out: "I strongly discourage the use of AJAX for the parts of a site that generate content. It should be reserved for layers that enhance the user experience." He continues: "JavaScript code is taken into account by Google when it is fairly simple, especially when it contains links to Web pages. But it quickly becomes inaccessible to search engine spiders."

A change in the syntax of URLs

To make AJAX visible to robots, Google has made a technical proposal to the Web community on one of its official blogs: changing the syntax of certain URLs, in particular those using the "#" anchor. Webmasters traditionally use it to point to a specific subsection of a Web page, for example when building a table of contents. But it is also used to pass parameters to JavaScript functions, and search engine spiders have trouble interpreting them.

Google proposes adding the "!" sign after the "#" character in the URLs of Web pages that contain AJAX code. When crawling sites, Google would replace the "#!" in a URL with an _escaped_fragment_ parameter.
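As a rough sketch of the rewrite the article describes (the function name and sample URL are illustrative, and the percent-encoding of the fragment is our assumption, not something the article spells out):

```typescript
// Map an AJAX URL containing "#!" to the crawler-facing form, in which
// the fragment travels in an _escaped_fragment_ query parameter.
function toEscapedFragmentUrl(url: string): string {
  const i = url.indexOf("#!");
  if (i === -1) return url; // no "#!" marker: nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  // Append to an existing query string, or start a new one.
  const sep = base.includes("?") ? "&" : "?";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}

// "http://example.com/page#!state=2" becomes
// "http://example.com/page?_escaped_fragment_=state%3D2"
console.log(toEscapedFragmentUrl("http://example.com/page#!state=2"));
```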

This parameter would be a way of telling the Web server that the site is being queried by a spider and that it must, in that case, send HTML instead of JavaScript. For a request made by an ordinary user (which does not contain this parameter), the server would continue to return the original page, whose JavaScript is executed client-side by the user’s browser. The search engines’ own servers could take on the execution of JavaScript themselves, but Google rejects this option as too costly in crawling time and as putting smaller search engines at a disadvantage.
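On the server side, the behavior described here could look like the following minimal Node.js sketch (the snapshot helper and page contents are placeholders of ours, not part of Google’s proposal):

```typescript
import { createServer } from "http";

// Illustrative stand-ins: a pre-rendered HTML snapshot per application
// state, and the ordinary JavaScript-driven page served to visitors.
const APP_SHELL =
  "<html><body><script src='/app.js'></script></body></html>";
function renderSnapshot(fragment: string): string {
  return `<html><body><h1>Content for state: ${fragment}</h1></body></html>`;
}

const server = createServer((req, res) => {
  // Rebuild the full URL so the query string can be inspected.
  const url = new URL(req.url ?? "/", `http://${req.headers.host ?? "localhost"}`);
  const fragment = url.searchParams.get("_escaped_fragment_");

  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  if (fragment !== null) {
    // The request carries the crawler parameter: send plain HTML.
    res.end(renderSnapshot(fragment));
  } else {
    // Ordinary visitor: send the original page, whose JavaScript
    // runs client-side in the browser.
    res.end(APP_SHELL);
  }
});

server.listen(8080);
```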

If this proposal were adopted, webmasters would therefore have to add exclamation points to some of their sites’ URLs and make sure the server can interpret the _escaped_fragment_ parameter.

Indexing, a subtle art

"Conversely, it must also be wary of parts of sites which were hitherto invisible by default and would be indexed in the future, such as management of baskets on merchant sites. For example, using JavaScript in. Js and indicating in the robots.txt file does not crawl the files, "explains Oliver Duff.

For now, websites facing the problem must choose between generating duplicate HTML pages for indexing and reserving JavaScript for the parts of the site that do not need to be indexed. Elie Auvray, CEO of Jahia, confirms this: "Our publishing and management interfaces are based on AJAX. But when switching to visitor mode, Jahia sends the visitor’s browser entirely conventional HTML. Even when the page is generated by an interaction such as posting a comment."
