SEO Strategy & Optimization
Robots.txt Generator
Configure crawl rules for search engines. Control which pages bots can access, set crawl delays, and specify sitemap locations.
Generated robots.txt (default output):
User-agent: *
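The default output above simply addresses all crawlers without restricting anything. As an illustration of the kind of file the generator can produce, a hypothetical site at example.com that blocks its admin area and API, sets a conservative crawl delay, and points to a sitemap might look like the sketch below; all paths and the sitemap URL are placeholders, not recommendations for any specific site.

```
User-agent: *
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
```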
Best Practices
- Use * for all bots, or specify individual bots like Googlebot
- Disallow sensitive paths such as /admin/ and /api/
- Use crawl-delay cautiously; most major bots ignore it
- Place your robots.txt file at the root: example.com/robots.txt
- Test your robots.txt with Google Search Console (a quick local check is sketched after this list)
- Add all sitemap URLs to help search engines discover your content
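Alongside a Google Search Console check, here is a minimal sketch, assuming Python 3 is available, that parses the illustrative rules above with the standard library's urllib.robotparser and asks whether particular URLs may be crawled. The user agents and URLs are examples only.

```python
from urllib.robotparser import RobotFileParser

# Rules matching the illustrative robots.txt above (placeholder paths, not a real site).
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse the rules directly instead of fetching them over HTTP

# can_fetch(user_agent, url) returns True if the rules allow that agent to crawl the URL.
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False: /admin/ is disallowed
print(parser.can_fetch("*", "https://example.com/blog/welcome-post"))       # True: no rule blocks this path
```

This only checks your rules locally; Search Console remains the authoritative view of how Google actually interprets the live file.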
Learn More in The Course
This tool is derived from Module 1 of "The Ultimate Growth Engineering Course." Learn the complete experimentation framework, how to design better tests, and how to interpret results like a pro.
Explore the Course