In March 2015, it was reported that Google would discontinue the AJAX crawling proposal it introduced in 2009. Now, Google has officially ended its support for the proposal.
“Historically, AJAX applications have been difficult for search engines to process because AJAX content is produced dynamically by the browser and thus not visible to crawlers. While there are existing methods for dealing with this problem, they involve regular manual maintenance to keep the content up-to-date.”
This means Google no longer recommends the proposal, but it will still be able to discover AJAX websites and render their pages. The change is also intended to discourage site owners from blocking Googlebot: Google can only discover and render AJAX content if the site does not block Googlebot from crawling its JavaScript and CSS files.
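For example, a robots.txt along these lines (the directory paths are purely illustrative) keeps script and stylesheet files crawlable so Googlebot can render the page; disallowing those paths would have the opposite effect:

```
# Hypothetical robots.txt: keep JavaScript and CSS crawlable so Googlebot can render the page.
# Disallowing these paths would prevent Google from seeing AJAX-generated content.
User-agent: Googlebot
Allow: /js/
Allow: /css/
```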
This could potentially hurt website providers that relied heavily on AJAX when building their sites.
Google’s new suggestion is to build sites that use the principles of progressive enhancement. “Progressive enhancement is a strategy for web design that emphasizes accessibility, semantic HTML markup, and external stylesheet and scripting technologies.”
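As a minimal sketch of that idea (not an example from Google; the file names and endpoint below are hypothetical), a progressively enhanced page serves meaningful HTML that works on its own, then layers JavaScript on top of it:

```html
<!-- Baseline: real content and a working link, no JavaScript required -->
<article id="news">
  <h1>Latest news</h1>
  <p>Summary of today's top story.</p>
  <a href="/news/archive.html" id="more-link">Read the full archive</a>
</article>

<script>
  // Enhancement: when JavaScript is available, load the archive in place
  // instead of navigating away. /news/archive.json is a hypothetical endpoint.
  document.getElementById('more-link').addEventListener('click', function (event) {
    event.preventDefault();
    fetch('/news/archive.json')
      .then(function (response) { return response.json(); })
      .then(function (items) {
        var html = items.map(function (item) {
          return '<p>' + item.title + '</p>';
        }).join('');
        document.getElementById('news').insertAdjacentHTML('beforeend', html);
      });
  });
</script>
```

Crawlers and users without JavaScript still get the baseline content, while capable browsers get the richer AJAX behavior.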
If the Google AJAX crawling proposal has already been implemented on a site, that site will still be indexed. Best practices improve over time, however, and Google suggests moving your website to ‘the new best practices over time’.
Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.
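For context, under the 2009 proposal a “#!” (hashbang) URL was paired with an “_escaped_fragment_” URL that crawlers requested in order to fetch an HTML snapshot of the page; the domain below is only an illustration:

```
Pretty URL seen by users:      http://www.example.com/#!page=about
URL fetched by the crawler:    http://www.example.com/?_escaped_fragment_=page=about
```

With the proposal deprecated, Googlebot simply crawls and renders the #! URL itself.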
Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.
Q: I use a JavaScript framework, and my Web server serves a pre-rendered page. Is that still ok?
A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.