Robots.txt Checker
Validate your robots.txt file by URL or paste. Checks user-agent groups, Disallow/Allow rules, and Sitemap directives.
Fetches /robots.txt from the given origin via a server-side proxy.
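For reference, here is a minimal robots.txt showing the three kinds of directives the checker validates; the paths, bot name, and sitemap URL are placeholders:

```
User-agent: *
Disallow: /admin/
Allow: /admin/help

User-agent: BadBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line opens a group, the `Disallow`/`Allow` rules below it apply to that group, and `Sitemap` lines apply file-wide.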
Common Use Cases
- Pre-launch robots.txt audit: Before going live, paste your robots.txt to confirm that your public pages are crawlable and private paths are correctly blocked.
- Debug accidental blocks: If Google Search Console shows crawl errors, fetch your live robots.txt by URL to check whether a Disallow rule is inadvertently blocking important pages.
- Validate staging vs production: Paste the staging robots.txt to confirm it blocks all crawlers, then check the production URL to confirm public access is allowed.
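You can run the same kind of check locally. This sketch uses Python's standard-library `urllib.robotparser`; the rules and paths are made up for illustration. Note that the more specific `Allow` line is listed before the broader `Disallow`, so the result is the same whether a crawler applies rules in file order (as Python's parser does) or by longest match:

```python
from urllib import robotparser

# Parse an inline robots.txt the way a crawler would (hypothetical rules).
rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Allow: /admin/help
Disallow: /admin/
""".splitlines())

print(rp.can_fetch("*", "/admin/settings"))  # False: blocked by Disallow: /admin/
print(rp.can_fetch("*", "/admin/help"))      # True: the Allow rule matches first
print(rp.can_fetch("*", "/blog/post"))       # True: no rule matches, crawling is allowed
```

To audit a live site instead, call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` before querying `can_fetch`.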
Pro plan — coming soon
Save your history, create reusable presets, and share outputs with a link. One plan, all tools.
See what's planned →
Related Tools
Open Graph / Social Card Preview
Extract Open Graph and Twitter Card meta tags from any URL or HTML snippet. See a visual preview of how your page appears when shared on social platforms.
Redirect Checker
Follow HTTP redirect chains for any URL. See every hop with status code, Location header, and timing. Debug redirect loops, broken chains, and final destinations.
UTM Builder
Build UTM-tagged URLs for campaign tracking. Fill in source, medium, and campaign to generate a live tracking link. Paste a pre-tagged URL to extract and edit its parameters.
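Both directions of that workflow, building a tagged link and extracting parameters from one, can be sketched with Python's standard-library `urllib.parse`; the landing URL and campaign values here are placeholders:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Build a UTM-tagged URL from campaign fields (hypothetical values).
base = "https://example.com/landing"
params = {
    "utm_source": "newsletter",
    "utm_medium": "email",
    "utm_campaign": "spring_sale",
}
tagged = f"{base}?{urlencode(params)}"

# Extract the parameters back out of a pre-tagged URL.
extracted = {k: v[0] for k, v in parse_qs(urlparse(tagged).query).items()}
print(tagged)
print(extracted)
```

`urlencode` also percent-escapes values, so campaign names containing spaces or special characters stay valid in the final link.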