🤖 Robots.txt Generator
Generate a robots.txt file to control how search engines crawl your website. Configure rules for different user agents and manage crawling behavior.
How to Use
1. Upload: Place the generated robots.txt file in your website's root directory
2. Access: It should be accessible at https://yourdomain.com/robots.txt
3. Test: Use Google Search Console to test your robots.txt file
4. Monitor: Check crawl stats regularly to ensure proper functioning
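Before uploading, the rules can also be sanity-checked locally with Python's built-in urllib.robotparser module. A minimal sketch, assuming a rule set like the patterns below (yourdomain.com and MyCrawler are placeholder names):

```python
from urllib import robotparser

# Placeholder rules mirroring the common patterns in this guide
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())  # parse the rules locally, no network fetch

# Check which URLs a crawler named "MyCrawler" may access
print(rp.can_fetch("MyCrawler", "https://yourdomain.com/admin/login"))  # blocked
print(rp.can_fetch("MyCrawler", "https://yourdomain.com/blog/post"))    # allowed
print(rp.crawl_delay("MyCrawler"))  # the declared delay, in seconds
```

To test the live file instead, call rp.set_url("https://yourdomain.com/robots.txt") followed by rp.read(). Note that robotparser follows the original exclusion standard and does not implement Google's * and $ wildcard extensions.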
Common Patterns
🚫 Block specific directories
Disallow: /admin/
Disallow: /private/
📄 Block file types
Disallow: /*.pdf$
Disallow: /*.doc$
Note: the * and $ wildcards are extensions honored by major crawlers such as Googlebot and Bingbot; they are not part of the original robots exclusion standard.
🔍 Allow specific bots
User-agent: Googlebot
Allow: /
⏰ Set crawl delay
User-agent: *
Crawl-delay: 10
Note: Googlebot ignores the Crawl-delay directive; Bing and Yandex honor it.
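Put together, a complete robots.txt combining these patterns might look like the following sketch (the paths and sitemap URL are placeholders). Each User-agent line starts a new group, and directives apply only within their group:

```
User-agent: Googlebot
Allow: /

User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /private/
Disallow: /*.pdf$

Sitemap: https://yourdomain.com/sitemap.xml
```

The Sitemap directive is optional and group-independent; it may appear anywhere in the file.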