standard for robot exclusion (web) A proposal to prevent the havoc wreaked by many early web robots that retrieved documents too rapidly or retrieved documents with side effects (such as casting votes). The proposed standard addresses these problems with a file called "robots.txt", placed in the document root of a website, which tells robots which parts of the site they should not retrieve.
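A minimal robots.txt might look like the following (the paths shown are hypothetical examples, not part of the standard):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /vote/
```

Here "User-agent: *" applies the rules to all robots, and each "Disallow" line names a path prefix that compliant robots should not fetch.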
Not an official standard, but a convention widely observed by well-behaved robots. Last updated: 2006-10-17