Robots.txt + Sitemap Validator
Validate your robots.txt and sitemap.xml by URL or paste. Parses user-agent groups, Disallow/Allow rules, and sitemap URL counts.
Requests are proxied through a server-side fetch to bypass browser CORS restrictions; your input is not stored.
Fetches /robots.txt from the given origin via a server-side proxy.
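Per the Robots Exclusion Protocol, robots.txt always lives at the root of an origin, so the fetch URL can be derived from any page URL on the site. A minimal sketch of that derivation, assuming a helper named `robots_url` (illustrative only, not the tool's actual code):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(origin: str) -> str:
    """Build the canonical /robots.txt URL for a given origin.

    Any path, query, or fragment on the input URL is discarded,
    since robots.txt is only valid at the origin root.
    """
    parts = urlsplit(origin)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/some/page?q=1"))
# https://example.com/robots.txt
```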
Common Use Cases
- Validate robots.txt before deploying: paste your site URL or raw robots.txt to check for Disallow rules that might accidentally block search engines from indexing important pages.
- Audit a competitor's crawl rules: enter any public domain to fetch and parse its robots.txt, revealing which directories are blocked and which sitemaps are declared.
- Verify sitemap URL count: fetch a sitemap.xml to confirm the number of URLs declared and check for malformed entries before submitting to Google Search Console.
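The checks above can be sketched with Python's standard library; the domains, rules, and sitemap entries below are made-up examples, and this is not the parser the tool itself runs. Note that `urllib.robotparser` applies the first matching rule, so `Allow` is listed before `Disallow` here:

```python
import urllib.robotparser
import xml.etree.ElementTree as ET

# Hypothetical robots.txt content for illustration.
robots_txt = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "/admin/secret.html"))    # False: blocked by Disallow
print(parser.can_fetch("*", "/admin/public/a.html"))  # True: Allow matches first
print(parser.site_maps())                             # ['https://example.com/sitemap.xml']

# Hypothetical sitemap.xml snippet; count the <loc> entries it declares.
sitemap_xml = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>
"""
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
url_count = len(ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns))
print(url_count)  # 2
```

One caveat worth knowing: Google resolves Allow/Disallow conflicts by longest matching path, while Python's stdlib parser is first-match-wins, so ordering rules as above keeps the two behaviors in agreement.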
Pro plan — coming soon
Save your history, create reusable presets, and share outputs with a link. One plan, all tools.
See what's planned →
Frequently Asked Questions
Related Tools
Open Graph / Social Card Preview
Extract Open Graph and Twitter Card meta tags from any URL or HTML snippet. See a visual preview of how your page appears when shared on social platforms.
Redirect Checker
Follow HTTP redirect chains for any URL. See every hop with status code, Location header, and timing. Debug redirect loops, broken chains, and final destinations.
UTM Builder
Build UTM-tagged URLs for campaign tracking. Fill in source, medium, and campaign to generate a live tracking link. Paste a pre-tagged URL to extract and edit its parameters.