How to Use
1. Use a preset (Allow All, Block All Bots, Block AI Bots, WordPress Default) or build from scratch.
2. Add one or more User-agent rule blocks.
3. Set Disallow and Allow paths for each agent.
4. Optionally add Crawl-delay, Sitemap URL, and Host directives.
5. Copy the generated robots.txt or download it as a file.
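The steps above produce a plain-text file. As a hypothetical example, a configuration with one user-agent block, an allowed path, a crawl delay, and a sitemap might generate:

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

The paths and URL here are placeholders; your own rules will reflect whatever you configure in the generator.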
About robots.txt Generator
The robots.txt Generator creates standards-compliant robots.txt files for any website. Configure crawling rules for all bots or specific user agents like Googlebot, GPTBot, or ChatGPT-User.
It includes presets to quickly block AI training bots (GPTBot, ChatGPT-User, Google-Extended, CCBot, anthropic-ai), set up a WordPress default configuration, or allow or block all crawlers with one click.
Frequently Asked Questions
What is robots.txt?
robots.txt is a file at the root of a website that tells search engine crawlers and other bots which pages or files they can or cannot access.
Where should I place robots.txt?
At the root of your domain, e.g. https://example.com/robots.txt. Most web servers serve it automatically from the public folder.
Does Disallow: / block all crawlers?
It instructs bots not to crawl any page, but only compliant bots honor it. Malicious scrapers may ignore robots.txt entirely, so it is not an access-control mechanism.
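You can check how a compliant crawler interprets `Disallow: /` with Python's standard-library `urllib.robotparser`. This is a minimal sketch; the rules string and URLs are illustrative:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that disallows everything for all agents.
rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler calls can_fetch() before requesting a URL;
# with Disallow: / every path is denied for every agent.
print(parser.can_fetch("Googlebot", "https://example.com/any/page"))  # False
print(parser.can_fetch("*", "https://example.com/"))                  # False
```

Note that `can_fetch` only models a well-behaved crawler's decision; it says nothing about bots that never consult robots.txt.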
Can I block AI training bots?
Yes. Use the "Block AI Bots" preset to add rules blocking GPTBot, ChatGPT-User, Google-Extended, CCBot, and anthropic-ai.
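Based on the bots the preset covers, its output is expected to look like the following, with one rule block per agent:

```
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: anthropic-ai
Disallow: /
```

Each block targets a single user agent, so you can remove or add agents individually without affecting the others.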