Robots.txt Generator — Create robots.txt Free, Block Bots & AI Crawlers

Generate a custom robots.txt file online. Block Googlebot and AI training bots (GPTBot, ClaudeBot), add Disallow rules and a sitemap URL. Download instantly, no sign-up.

Frequently Asked Questions

What is a robots.txt file?

A robots.txt file is a plain text file served at your domain root (e.g. example.com/robots.txt) that tells web crawlers which URLs they may or may not crawl. It follows the Robots Exclusion Protocol (RFC 9309). Compliance is voluntary, but all major search engines, including Google and Bing, honor it, as do many AI bots.
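For illustration, a minimal robots.txt might look like the following (the paths and sitemap URL are placeholders):

```text
# Served at https://example.com/robots.txt
User-agent: *          # applies to all crawlers
Disallow: /admin/      # do not crawl anything under /admin/
Allow: /admin/public/  # exception within the blocked path
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by User-agent; a crawler follows the most specific group that matches its name, falling back to the `*` group.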

Does Disallow in robots.txt remove pages from Google?

No. Disallow stops Googlebot from crawling a URL, but if the URL is already indexed or linked from elsewhere it can still appear in search results. To actually remove a page from Google's index, use a noindex directive, and leave the page crawlable so Googlebot can see it; a robots.txt block would prevent Googlebot from ever reading the noindex.
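A noindex directive can be set either in the page's HTML or as an HTTP response header, for example:

```html
<!-- In the page's <head>; the page must remain crawlable for this to be seen -->
<meta name="robots" content="noindex">
```

The equivalent for non-HTML resources (PDFs, images) is the `X-Robots-Tag: noindex` HTTP response header.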

Can I block AI training bots with robots.txt?

Yes. Add User-agent blocks for GPTBot (OpenAI), ClaudeBot (Anthropic), Google-Extended, and CCBot with Disallow: / to opt out of AI training data. These companies require their bots to respect robots.txt by policy.
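A robots.txt opt-out covering those crawlers could look like this (user-agent names as published by each vendor):

```text
# Opt out of AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

Note that Google-Extended only controls AI training use; it does not affect how Googlebot crawls or indexes your site for Search.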

Does Googlebot respect Crawl-delay?

No. Googlebot ignores the Crawl-delay directive in robots.txt, and Google retired the Search Console crawl-rate limiter in early 2024. To slow Googlebot today, Google's guidance is to temporarily return HTTP 500, 503, or 429 responses, or to file a crawl-rate request with Google. Other crawlers, such as Bingbot, do honor Crawl-delay.
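For crawlers that do support it, Crawl-delay is set per user-agent; a sketch for Bingbot:

```text
# Googlebot ignores Crawl-delay, but Bingbot honors it
User-agent: Bingbot
Crawl-delay: 10
```

Bing interprets the value as a pause between successive requests, so larger numbers mean slower crawling.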