Robots.txt Generator
Free online robots.txt generator and robots.txt file maker. Create robots.txt files visually: build your file online with Allow, Disallow and Crawl-delay rules, validate it against Google's guidelines, and start from WordPress and Shopify presets. No login required.
Build Your robots.txt File Visually
Everything the Robots.txt Generator Does
How to Use the Robots.txt Generator
Robots.txt Generator: LazyTools vs Competitors
See how LazyTools compares to other popular tools. Our free robots.txt generator is the only option that combines all key features with no login required and complete browser-side privacy.
| Feature | LazyTools | Google Search Console | Yoast SEO Plugin | seoptimer.com |
|---|---|---|---|---|
| Visual rule builder | Yes | No (text only) | Basic | No |
| Validation warnings | Yes (real-time) | Basic | No | No |
| Platform presets | Yes (6 presets) | No | WordPress only | No |
| Block AI crawlers preset | Yes | No | No | No |
| Multiple user agents | Yes | Yes | Limited | No |
| Sitemap directive | Yes | Yes | Yes | Yes |
| Download file | Yes | No | Yes (via plugin) | Yes |
| No login required | Yes | Requires account | Requires WordPress | Yes |
Robots.txt Syntax and Directives Explained
| Directive | Syntax | Effect | Googlebot support |
|---|---|---|---|
| User-agent | User-agent: * | Specifies which crawler the following rules apply to. * means all crawlers. | Yes |
| Disallow | Disallow: /path/ | Tells the crawler not to crawl this path or URL. Empty value means allow all. | Yes |
| Allow | Allow: /path/ | Creates an exception to a Disallow rule. More specific path wins. | Yes |
| Sitemap | Sitemap: https://example.com/sitemap.xml | Tells crawlers the location of your XML sitemap. Can appear multiple times. | Yes |
| Crawl-delay | Crawl-delay: 10 | Requests crawlers wait N seconds between requests. Googlebot ignores this. | No (use Search Console) |
| # Comment | # This is a comment | Lines starting with # are ignored by crawlers. Useful for documentation. | Ignored |
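Putting the directives together, a minimal robots.txt might look like this (example.com is a placeholder domain):

```txt
# Apply the following rules to all crawlers
User-agent: *
# Do not crawl anything under /private/
Disallow: /private/
# Exception: /private/docs/ may still be crawled
Allow: /private/docs/

# Sitemap location (absolute URL; may appear multiple times)
Sitemap: https://example.com/sitemap.xml
```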
How Allow and Disallow interact
When an Allow and a Disallow rule conflict, the more specific rule wins (the one with the longer path). For example, if you have Disallow: /admin/ and Allow: /admin/public/, Googlebot will not crawl anything in /admin/ except /admin/public/. If two rules are the same length, Googlebot uses the Allow rule.
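The example above, written out as rules (paths are illustrative):

```txt
User-agent: Googlebot
# Block the admin area...
Disallow: /admin/
# ...but the longer, more specific Allow wins for this subfolder
Allow: /admin/public/
```

With these rules, /admin/settings would be blocked while /admin/public/help would be crawled, because /admin/public/ is the more specific match.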
What robots.txt cannot do
Robots.txt controls crawling, not indexing. A page blocked in robots.txt can still appear in search results if it is linked from other crawlable pages, because Google can infer the page exists without crawling it. To prevent a page from being indexed, use a noindex meta tag or X-Robots-Tag HTTP header instead. Additionally, robots.txt only applies to well-behaved crawlers that follow the protocol. Malicious bots ignore it entirely.
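To block indexing rather than crawling, use one of these standard signals instead (snippets for illustration):

```txt
# Option 1: meta tag in the page's <head>
<meta name="robots" content="noindex">

# Option 2: HTTP response header (useful for PDFs and other non-HTML files)
X-Robots-Tag: noindex
```

For either signal to work, the page must not be blocked in robots.txt, because Google has to crawl the page to see the noindex directive.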
Robots.txt Templates for WordPress and Shopify
WordPress robots.txt
This robots.txt generator for WordPress produces a recommended configuration. WordPress sites should block admin pages, search results and duplicate parameter URLs while allowing Google to access theme and plugin CSS and JavaScript files. Load the WordPress preset in the generator above to get a fully configured starting point.
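For reference, a typical WordPress configuration follows these lines (a sketch, not the exact preset output; example.com is a placeholder):

```txt
User-agent: *
# Block the admin area but keep admin-ajax.php crawlable,
# since themes and plugins use it on the front end
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Block internal search result pages
Disallow: /?s=
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```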
Shopify robots.txt
The Shopify preset creates a recommended configuration. Shopify automatically generates a robots.txt file for your store, but you can customise it by editing the robots.txt.liquid theme template in the Shopify admin. Key rules for Shopify block checkout pages, account pages and internal search results. Load the Shopify preset above to see the recommended configuration.
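As an illustration, Shopify's auto-generated file contains rules along these lines (a simplified sketch, not a verbatim copy of Shopify's output):

```txt
User-agent: *
# Keep transactional and account pages out of crawls
Disallow: /cart
Disallow: /checkout
Disallow: /account
# Block internal search results
Disallow: /search
```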
Blocking AI web crawlers
Use this robots.txt template to block Google-Extended, GPTBot and other AI training crawlers. AI companies use web crawlers to collect training data. GPTBot (OpenAI), Claude-Web (Anthropic), CCBot (Common Crawl) and other AI crawlers can be blocked using their User-agent names. The generator's Block AI Crawlers preset adds rules for the major AI training crawlers. Note that not all AI crawlers respect robots.txt.
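A minimal version of what such a preset produces might look like this (user-agent tokens as published by the respective vendors; the exact list in the preset may differ):

```txt
# OpenAI training crawler
User-agent: GPTBot
Disallow: /

# Google's opt-out token for AI training, separate from Googlebot
User-agent: Google-Extended
Disallow: /

# Common Crawl
User-agent: CCBot
Disallow: /

# Anthropic
User-agent: Claude-Web
Disallow: /
```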