Monday, 22 June 2015

New Robots.txt Tester to Identify Errors

Google has recently launched a robots.txt tester to check the robots.txt file for errors. This new testing tool can be found under the Crawl section in Webmaster Tools. It is especially helpful for testing new URLs to identify whether they are disallowed for crawling.
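As an illustration, the tester evaluates URLs against directives like the ones below. This is a hypothetical robots.txt, with placeholder paths, that blocks Googlebot from a /private/ section while allowing everything else:

User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /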



You can now test the directives of your robots.txt file and check whether they are working properly. Once you are done with the testing, you can upload the file to your server for the changes to take effect.
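If you want to reproduce the same allow/disallow check locally before uploading, Python's standard library ships a robots.txt parser. Here is a minimal sketch, assuming Python 3; the domain and path are placeholders for your own site:

# Check whether Googlebot may crawl a given URL, per the live robots.txt
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # hypothetical site
rp.read()  # fetches and parses the robots.txt file

# Prints False if the URL is disallowed for the Googlebot user agent
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))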

With this tool, you will also be able to review older versions of your robots.txt file and see any issues that restricted Googlebot from crawling the website.

This is a great addition to Google Webmaster Tools (GWT).

Also See:

Google's Take on Sitewide Links
List of Meta Tags Supported by Google
Google Human Quality Raters Do Not Influence A Website Ranking Directly
Google Now Cards
List of Meta Tags
Google Expands Knowledge Graph
Google Disavow Links Tool
Query Highlighting on Google
