# robots.txt generated at http://scanftree.in
User-agent: *
Disallow: /cgi-bin/
Sitemap: http://www.yoursite.com/sitemap.gz

Why do I want a robots.txt?
It saves your bandwidth by steering crawlers away from files they don't need.
It gives you a very basic level of protection, but only against crawlers that choose to honor it; it is not a security mechanism.
It cleans up your logs, since crawlers request robots.txt on every visit and a missing file shows up as a 404 error.
It can prevent spam and penalties associated with duplicate content.
It's good programming policy.
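To see how a well-behaved crawler interprets rules like the example above, you can parse them with Python's standard-library robotparser. This is a minimal sketch: the rules are parsed from a local string rather than fetched over the network, and the www.yoursite.com URLs are placeholders taken from the example.

```python
from urllib import robotparser

# Rules mirroring the example file above. A real crawler would call
# set_url("http://www.yoursite.com/robots.txt") and read() instead.
rules = """\
User-agent: *
Disallow: /cgi-bin/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Pages outside /cgi-bin/ are allowed; anything under it is blocked.
print(rp.can_fetch("*", "http://www.yoursite.com/index.html"))    # True
print(rp.can_fetch("*", "http://www.yoursite.com/cgi-bin/form"))  # False
```

Note that this check is purely advisory: a crawler that never consults robots.txt can still request /cgi-bin/ pages, which is why the file offers convention, not protection.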