Free Robots.txt Generator


Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: (the path is relative to root and must contain a trailing slash "/")



Now, create a file named 'robots.txt' in your site's root directory, copy the generated text above, and paste it into that file.
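For example, a generated file for a site that allows all robots, sets a 10-second crawl delay, declares a sitemap, and restricts one directory might look like the following (the domain and directory names are placeholders, not output of this tool):

User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Sitemap: https://www.example.com/sitemap.xml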


About Robots.txt Generator

Robots.txt Generator is a tool that generates robots.txt files. Robots.txt is a file that tells search engine crawlers which pages or files they can or can't request from your site. It is used mainly to keep your site from being overloaded with requests; it is not a way to keep a web page out of Google. To use this free tool, enter your website URL. The tool will ask whether you want to allow or disallow robots to crawl your site, and you can also set a crawl delay, which tells crawlers how long to wait between successive requests to your site. The tool will then ask you to type in your sitemap URL (a list of the pages of a website within a domain), or to leave it blank if you don't have one. After that, Robots.txt Generator will show a list of search engines and ask whether you want to allow or disallow each of their crawlers. Finally, you can specify which directories on your site are restricted.
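To illustrate how a crawler actually reads such a file, here is a minimal sketch using Python's standard urllib.robotparser module; the rules, user agent, and example.com URLs below are placeholders chosen for the example, not output of this tool:

from urllib import robotparser

# Placeholder rules in the same format this generator produces.
rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks each URL against the rules before requesting it.
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/login"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post-1"))  # True
print(parser.crawl_delay("Googlebot"))  # 10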