Martijn Koster proposed the robots.txt standard (also known as the Robots Exclusion Standard or Robots Exclusion Protocol) on the www-talk mailing list in 1994. The rules defined in a site's robots.txt file tell compliant crawling and indexing robots which parts of the website they may not access.
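As a sketch of how these rules work in practice, the fragment below blocks every crawler (`User-agent: *`) from one directory while leaving the rest of the site open. Python's standard `urllib.robotparser` module can evaluate such rules; the domain and paths here are illustrative, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules: all robots ("*") are barred from /private/
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks each URL against the rules before fetching it
print(parser.can_fetch("*", "https://example.com/index.html"))  # True
print(parser.can_fetch("*", "https://example.com/private/x"))   # False
```

Note that robots.txt is advisory: it restricts only robots that choose to honor the protocol, and does not technically prevent access.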