Robots.txt Generator is a tool that creates robots.txt files. A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.

To use this free tool, enter your website URL. The tool then asks whether you want to allow or disallow robots to crawl your site, and lets you set a crawl delay (how long crawlers should wait between successive requests). Next, it asks for your sitemap URL (a sitemap is a list of the pages within a domain); leave this blank if you don't have one. The generator then shows a list of search engines and asks whether to allow or disallow each one's crawler. Finally, you can specify which directories on your site are restricted.
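As a rough illustration, a file produced from these choices might look like the sketch below. The exact output depends on the options you pick; the domain, directory paths, and the choice of Bingbot as the blocked crawler are placeholders, not output from any specific run of the tool:

    # Rules for all crawlers
    User-agent: *
    Crawl-delay: 10        # seconds between requests (Google ignores this directive)
    Disallow: /admin/      # restricted directories
    Disallow: /private/

    # Block one search engine's crawler entirely
    User-agent: Bingbot
    Disallow: /

    # Location of the sitemap
    Sitemap: https://www.example.com/sitemap.xml

Each User-agent line opens a group of rules for the named crawler (or * for all crawlers), Disallow and Allow set the restricted and permitted paths for that group, and the Sitemap line points crawlers to the sitemap URL you entered.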