Google is working to improve its robots.txt checking tool


Google employee Gary Illyes confirmed that he is continuing to work on improving the robots.txt checker, which is still available only in the old version of Search Console.

In particular, he wrote on Twitter: “We are currently in the process of integrating the tester with the production version of the parser”.

Illyes said this in response to a complaint about a possible bug in how the tool works, or in how Google reads the robots.txt file.

According to him, the problem most likely lies in the tool itself.


Recall that at the beginning of 2019, Google said that it did not plan to retire the robots.txt checker tool and would eventually move it to the new version of Search Console. However, this has not yet happened.

Judging by Illyes's tweet, work on this tool is not an urgent task.

As a reminder, in September Google open-sourced a number of robots.txt-related projects.
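For context on what such a tester checks: robots.txt rules can be evaluated programmatically. A minimal sketch using Python's standard-library `urllib.robotparser` (the example rules and URLs are hypothetical, not from Google's tool):

```python
from urllib.robotparser import RobotFileParser

# Parse a small, hypothetical robots.txt directly from a list of lines.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch() reports whether a given user agent may crawl a URL.
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/public/page"))   # True
```

Google's open-sourced parser is a separate C++ library with its own matching rules; this sketch only illustrates the general idea of checking a URL against Disallow directives.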


I wonder what else can be improved there. The robots.txt rules have been working since the days of the dinosaurs.