Generate robots.txt files to control search engine crawling behavior. Every Allow, Disallow, or Crawl-delay rule must sit inside a group that starts with a User-agent line, for example:
User-agent: Googlebot
Allow: /

User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /private/
Disallow: /*.pdf$
Disallow: /*.doc$
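The grouping above can be produced programmatically. The sketch below is a minimal, hand-rolled helper (the `generate_robots_txt` function and its tuple-based input format are assumptions for illustration, not a standard library API): each group is a user-agent name plus its rule lines, and groups are joined with a blank line as separators.

```python
def generate_robots_txt(groups):
    """Build robots.txt text from a list of (user_agent, rule_lines) tuples."""
    blocks = []
    for agent, rules in groups:
        # Each group must begin with its User-agent line.
        lines = [f"User-agent: {agent}"] + list(rules)
        blocks.append("\n".join(lines))
    # Blank line between groups; trailing newline ends the file cleanly.
    return "\n\n".join(blocks) + "\n"

robots = generate_robots_txt([
    ("Googlebot", ["Allow: /"]),
    ("*", [
        "Crawl-delay: 10",
        "Disallow: /admin/",
        "Disallow: /private/",
        "Disallow: /*.pdf$",
        "Disallow: /*.doc$",
    ]),
])
print(robots)
```

Note that the `*` wildcard and `$` end-of-URL anchor are extensions honored by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt convention, and Google ignores Crawl-delay entirely, so test rules against the crawlers you actually care about.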