SEO Tool
Build a valid robots.txt file with allow/disallow rules, crawl-delay settings, and sitemap references. Use preset buttons to quickly block AI crawlers or allow all bots. Copy-ready output.
User-agent: *
Disallow: /admin/
Disallow: /api/
Allow: /
Upload this file to your domain root: https://yourdomain.com/robots.txt
Add your sitemap URL so search engines can discover all your pages efficiently.
Configure rules: set user-agent (use * for all bots), choose Allow or Disallow, and enter the path.
Use preset buttons: 'Block AI Crawlers' adds rules for GPTBot, CCBot, ClaudeBot, and others.
Set an optional crawl-delay to throttle how fast crawlers access your site.
Review the live preview on the right — it updates in real time as you modify rules.
Copy the output and save it as 'robots.txt' in your website's root directory.
A robots.txt file is a plain text file placed at the root of your website (e.g. example.com/robots.txt) that tells search engine crawlers which pages or sections they can and cannot access. It uses the Robots Exclusion Protocol — a standard that all major search engines (Google, Bing, Yandex) follow. Note that robots.txt is advisory, not enforced — malicious crawlers may ignore it. For sensitive content, use proper authentication instead.
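A minimal robots.txt illustrating the format (the domain and paths below are placeholders):

```
# Rules for all crawlers
User-agent: *
Disallow: /private/
Allow: /

# Point crawlers at your sitemap so they can discover every page
Sitemap: https://example.com/sitemap.xml
```

Blank lines separate rule groups; each group starts with one or more User-agent lines followed by its Allow/Disallow rules.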
To block AI training crawlers, add specific User-agent rules for each AI bot: GPTBot (OpenAI), CCBot (Common Crawl), anthropic-ai and ClaudeBot (Anthropic), Google-Extended (Google AI training), and Bytespider (ByteDance/TikTok). Set 'Disallow: /' for each. Note that blocking Google-Extended only prevents AI training use — regular Google Search crawling (Googlebot) remains unaffected.
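Putting the rules above together, a file that blocks AI training bots while leaving normal search crawling open might look like this (each bot needs its own User-agent group):

```
# Block common AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

# Regular search engine crawling stays open
User-agent: *
Allow: /
```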
The Crawl-delay directive tells crawlers to wait a specified number of seconds between requests. For example, 'Crawl-delay: 10' means the crawler should wait 10 seconds between each page fetch. Google does not officially support Crawl-delay (use Google Search Console's crawl rate settings instead), but Bing, Yandex, and other crawlers do respect it. It's useful for protecting servers with limited resources.
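A sketch of how Crawl-delay fits into a rule group, targeting crawlers that honor the directive (bot names shown are the standard Bing and Yandex user-agents):

```
# Ask these crawlers to wait 10 seconds between page fetches
User-agent: Bingbot
Crawl-delay: 10

User-agent: Yandex
Crawl-delay: 10
```

Since Googlebot ignores Crawl-delay, scoping it to specific user-agents like this makes the intent explicit.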
Canvas Builder generates production-ready HTML with proper meta tags, semantic structure, and SEO best practices built in.
Try Canvas Builder