Inputs
Presets are a starting point — you can edit the rules below.
- Allow: each line becomes an `Allow:` rule. Leave empty if you don't need allow overrides.
- Disallow: each line becomes a `Disallow:` rule. Use `/` to block the entire site.
- Sitemap: if provided, we'll add a `Sitemap:` line at the end.

Choose a preset or edit rules, then build your robots.txt.
Output
Tip: Save as robots.txt at your site root (e.g. https://example.com/robots.txt).

robots.txt Generator
A robots.txt file tells search engine crawlers which paths they’re allowed to crawl. It’s a crawl directive (not an access control mechanism).
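Because robots.txt is purely advisory, well-behaved clients are expected to check it themselves. A minimal sketch using Python's standard `urllib.robotparser` (the rules and URLs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block /admin for every crawler.
rules = [
    "User-agent: *",
    "Disallow: /admin",
]

rp = RobotFileParser()
rp.parse(rules)  # parse() takes an iterable of robots.txt lines

print(rp.can_fetch("*", "https://example.com/admin/users"))  # False: disallowed
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True: no rule matches
```

Note that nothing stops a client from fetching a disallowed URL anyway; the parser only reports what the file asks for.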
Common Examples
- Allow all crawlers: `User-agent: *` followed by an empty `Disallow:`
- Block everything: `Disallow: /`
- Block admin: `Disallow: /admin`
About “Block query params”
Some crawlers (notably Google) support pattern matching in robots.txt rules, such as `*` (wildcard) and `$` (end-of-URL anchor).
The preset uses Disallow: /*?* as a practical way to discourage crawling of URLs with query strings.
Always test in your target search engine’s robots.txt tester.
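Support for these patterns varies by consumer; Python's standard `urllib.robotparser`, for instance, does plain prefix matching rather than wildcard matching. The idea behind the extended syntax can be sketched by translating a pattern to a regex (`robots_pattern_to_regex` is a name invented here, not a real API, and this is an illustration rather than any search engine's algorithm):

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a Google-style robots.txt path pattern to a regex.

    '*' matches any run of characters; a trailing '$' anchors the end
    of the path. Like robots matching, the result is prefix-matched.
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if anchored:
        regex += "$"
    return re.compile(regex)

# The "Block query params" preset from above.
blocker = robots_pattern_to_regex("/*?*")
print(bool(blocker.match("/search?q=robots")))  # True: URL has a query string
print(bool(blocker.match("/search")))           # False: no query string
```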