robots.txt Validator
Lint your robots.txt for syntax errors, missing user-agent groups and bad sitemap URLs.
Frequently asked questions
- Is the validator free?
- Yes — free with no signup.
- Does it match Google's parser?
- It follows RFC 9309, the Robots Exclusion Protocol that codifies Google's parser, for User-agent, Allow and Disallow, and it also recognizes the widely supported Sitemap directive and the non-standard Crawl-delay and Host extensions.
- Are uploads sent anywhere?
- No — every check runs in your browser.
- Why does it warn about Crawl-delay?
- Google ignores Crawl-delay; Bing and Yandex still honor it. The warning is informational.
- What about wildcards?
- Wildcards (*) and the end-of-URL anchor ($) are accepted in Allow/Disallow patterns, matching the syntax Google documents and RFC 9309 standardizes.
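The wildcard rules above can be sketched in a few lines: `*` matches any run of characters and a trailing `$` anchors the end of the URL, with everything matched from the start of the path. This is a minimal illustration of that matching logic, not the validator's actual implementation; the helper names are made up for the example.

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    # Escape the pattern, then restore the two robots.txt metacharacters:
    # '*' matches any sequence of characters, a trailing '$' anchors end-of-URL.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[: -2] + "$"
    return re.compile(regex)

def path_matches(pattern: str, path: str) -> bool:
    # robots.txt rules match from the beginning of the URL path.
    return pattern_to_regex(pattern).match(path) is not None

# '/private/*' matches anything under /private/
print(path_matches("/private/*", "/private/data.html"))  # True
# '/*.pdf$' matches only URLs that end in .pdf
print(path_matches("/*.pdf$", "/docs/report.pdf"))       # True
print(path_matches("/*.pdf$", "/docs/report.pdfx"))      # False
```

Note that without the `$` anchor a pattern like `/*.pdf` would also match `/docs/report.pdfx`, which is exactly why the anchor exists.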