
How to Test and Fix robots.txt

robots.txt tells search engines which paths they may and may not request. A single wrong rule can block important pages from crawling or leave private areas exposed. Our robots.txt Tester shows exactly how a given URL is treated for a given User-agent.

Common causes

  • Disallow: / blocks the whole site instead of one path.
  • Conflicting Allow/Disallow rules (for Googlebot, the most specific, i.e. longest, matching path wins, not the rule listed first).
  • Typos in User-agent or path.
  • Missing Sitemap line.
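The causes above can be avoided with a file like the following. This is a minimal sketch with hypothetical paths and a hypothetical domain; substitute the directories and sitemap URL for your own site.

```
# Block only the private area, not the whole site
User-agent: *
Disallow: /admin/

# Point crawlers at the sitemap
Sitemap: https://yoursite.com/sitemap.xml
```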

How to fix

  • Enter your robots.txt URL and a test path in the tester.
  • Use specific paths (e.g. Disallow: /admin/) instead of Disallow: /.
  • Add Sitemap: https://yoursite.com/sitemap.xml.
  • Test with User-agent: Googlebot to match real crawling.
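You can also sanity-check rules locally before (or after) using the tester. The sketch below uses Python's standard-library `urllib.robotparser`; the rules and URLs are hypothetical examples, and in practice you would fetch the live file with `set_url(...)` plus `read()` instead of parsing an inline string.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; replace with your own rules,
# or fetch the live file via rp.set_url(...) and rp.read().
rules = """\
User-agent: Googlebot
Disallow: /admin/

User-agent: *
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Googlebot may fetch public pages but not anything under /admin/.
print(rp.can_fetch("Googlebot", "https://yoursite.com/products"))     # True
print(rp.can_fetch("Googlebot", "https://yoursite.com/admin/login"))  # False

# All other crawlers fall into the "*" group and are blocked entirely.
print(rp.can_fetch("SomeOtherBot", "https://yoursite.com/products"))  # False
```

Testing with the `Googlebot` user agent, as recommended above, matters here: the same URL can be allowed for one crawler and blocked for another.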