
Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

Restricted Directories: The path is relative to root and must contain a trailing slash "/".

Now create a 'robots.txt' file in your website's root directory, copy the generated text above, and paste it into that file.


About Robots.txt Generator

Robots.txt is a file placed in a website's root directory that communicates with web crawler robots. It defines which directories or sub-paths of the website should not be crawled or indexed. For example, if you don't want a subdirectory such as http://webpricecalculator.com/blog to be scanned by web robots, you can specify that in robots.txt.
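For instance, a minimal robots.txt that keeps all crawlers out of the /blog/ path mentioned above (the path is only an illustration) would look like this:

    User-agent: *
    Disallow: /blog/

Here the asterisk applies the rule to every robot, and the Disallow line tells them not to crawl anything under /blog/.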

 

How does the Robots.txt Generator tool work?

  • First, choose the default value for all search engine robots. The default value is one of two types:

    • Allowed: all links may be crawled with no restriction.

    • Refused: the restricted links you specify will not be scanned by search engine robots.

  • Then choose a crawl delay in seconds if you want the robots to pause between requests.

  • Enter your website's sitemap URL in the textbox. (This is optional; you can leave it blank.)

  • Then you can choose a separate option for each search engine robot.

  • Then enter the restricted directory paths; each path is relative to the root and must end with a trailing slash "/".

  • Verify the captcha shown in the image.

  • Click the green "Create Robot.txt" button, then sit back and copy the generated robots.txt content from the result; a sample of what the output can look like is shown after this list.
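As an illustration of the output, suppose you set the default to Refused, choose a crawl delay of 10 seconds, restrict the /admin/ and /cgi-bin/ directories, refuse the Baidu robot entirely, and provide a sitemap URL (all of these values are hypothetical examples). The generated content would look roughly like this; the exact layout produced by the tool may differ slightly:

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/
    Disallow: /cgi-bin/

    User-agent: Baiduspider
    Disallow: /

    Sitemap: http://example.com/sitemap.xml

The Disallow lines in the first block list the restricted directories for all robots, "Disallow: /" under Baiduspider refuses that robot completely, and the Sitemap line points crawlers to the sitemap URL you entered.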

