robots.txt Generator
Build a valid robots.txt with disallow groups, sitemap, crawl-delay and AI-bot blocking.
Frequently asked questions
- How do I create a robots.txt?
- Add a user-agent group and the paths to disallow. The file builds live below; copy or download it as robots.txt.
- Where do I put robots.txt?
- At the root of your domain — `https://example.com/robots.txt`. Bots only check that exact location.
- What does User-agent: * mean?
- Rules apply to every crawler that doesn't have its own group. Specific groups (like `Googlebot`) override the wildcard.
- How do I block AI crawlers?
- Tick the AI crawler checkboxes — we add a `Disallow: /` group for each one. This is the standard polite signal for bots that respect robots.txt.
- Do all bots obey robots.txt?
- No — it's a request, not enforcement. Reputable bots (search engines, Anthropic, OpenAI) honour it; malicious scrapers ignore it.
- What's the order of Allow vs Disallow?
- Most modern crawlers use the most specific match. Putting Allow before Disallow is conventional but order doesn't change semantics for them.
- Should I add Crawl-delay?
- Only if you have evidence a bot is hammering your server. Most major crawlers ignore `Crawl-delay` (Googlebot among them); it's mainly honoured by Bing and Yandex.
- Do I need a Sitemap line?
- Optional but recommended — it helps crawlers discover URLs faster than crawling alone.
- Is the file sent to a server?
- No. Generation runs entirely in your browser.
- Is this robots.txt generator free?
- Yes — free, no signup, no limits.
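Putting the answers above together, a generated file might look like the sketch below. The paths, bot names, and sitemap URL are illustrative, not output from any particular site:

```
# Default group: applies to any crawler without its own group.
# Allow is listed before Disallow by convention; the more specific
# path wins either way under modern matching rules.
User-agent: *
Allow: /admin/public/
Disallow: /admin/

# Polite opt-out for AI crawlers that respect robots.txt
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Honoured mainly by Bing and Yandex; Googlebot ignores it
User-agent: Bingbot
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```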
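If you want to sanity-check a generated file before deploying it, Python's standard `urllib.robotparser` can answer "may this agent fetch this URL?" questions offline. Note it evaluates a group's rules in file order (first match wins) rather than by longest path, which is one practical reason to keep `Allow` lines before `Disallow`. A minimal sketch, with an illustrative domain and paths:

```python
from urllib import robotparser

# An example robots.txt, as the generator might produce it
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/

User-agent: GPTBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The specific Allow rule carves /admin/public/ out of the Disallow
print(rp.can_fetch("*", "https://example.com/admin/public/page.html"))  # True
print(rp.can_fetch("*", "https://example.com/admin/secret"))            # False

# GPTBot has its own group blocking everything
print(rp.can_fetch("GPTBot", "https://example.com/anything"))           # False
```

The same check works against a live site by calling `set_url("https://example.com/robots.txt")` followed by `read()` instead of `parse()`.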