Googlebot Changes You Should Know
A cause of much speculation among webmasters in recent weeks has been the likely reasoning behind a new addition to Google's Webmaster Tools: the Fetch and Render option, now available from the Fetch as Googlebot section.
Google has announced that this new feature was introduced to let webmasters see how their website actually looks to Googlebot, and to reveal where items may be blocked by the robots.txt file, resulting in Google getting an unclear picture of how your website is laid out.
Why is this important? Well, it suggests that Google now frowns upon certain assets being blocked from Googlebot, such as JavaScript, images and CSS files, presumably because blocked files could be used to mask content from the crawler, for example to cram in keywords or carry out other kinds of keyword manipulation.
It is likely that this move signals a potential future penalty for sites that block such content, so we suggest you ensure that ALL files called from any web page indexed by Google are allowed to be crawled by Googlebot, by editing your robots.txt file accordingly. More information on search engine crawlers can be found
here
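As a rough illustration of the advice above, a robots.txt edit might look like the sketch below. The directory names are assumptions for illustration only; substitute the paths your own site uses for scripts, stylesheets and images.

```
# Hypothetical robots.txt sketch: explicitly allow Googlebot to crawl
# the asset directories (these paths are assumed, not universal).
User-agent: Googlebot
Allow: /css/
Allow: /js/
Allow: /images/

# Continue to block only genuinely private areas, never page assets.
User-agent: *
Disallow: /admin/
```

After making a change like this, the Fetch and Render tool can be used again to confirm that Googlebot now sees the page as a visitor would.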
Published on November 07, 2014 20:46