Robots.txt Checker

Validate your robots.txt file by URL or by pasting its contents. Checks user-agent groups, Disallow/Allow rules, and Sitemap directives.

Fetches /robots.txt from the given origin via a server-side proxy.
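For reference, a minimal robots.txt exercising everything the checker looks at (user-agent groups, Disallow/Allow rules, and a Sitemap directive) might look like the sketch below; the paths and sitemap URL are placeholders:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/

    User-agent: Googlebot
    Disallow: /tmp/

    Sitemap: https://example.com/sitemap.xml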

Common Use Cases

  1. Pre-launch robots.txt audit: Before going live, paste your robots.txt to confirm that your public pages are crawlable and private paths are correctly blocked.
  2. Debug accidental blocks: If Google Search Console shows crawl errors, fetch your live robots.txt by URL to check whether a Disallow rule is inadvertently blocking important pages.
  3. Validate staging vs. production: Paste the staging robots.txt to confirm it blocks all crawlers, then check the production URL to confirm public access is allowed; a minimal sketch of both follows this list.
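As a sketch for the third case: a staging robots.txt that blocks every crawler is just two lines, while the production file typically allows crawling and advertises a sitemap (example.com is a placeholder):

    # Staging: block all crawlers
    User-agent: *
    Disallow: /

    # Production: allow everything and point to the sitemap
    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml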

Pro plan — coming soon

Save your history, create reusable presets, and share outputs with a link. One plan, all tools.

See what's planned →

