
Robots Exclusion Protocol

The Robots Exclusion Protocol, or Robots Exclusion Standard, was created by consensus among members of the robots mailing list in 1994; no official RFC was ever published for it.

The rules specifying which parts of a site should not be accessed are placed in a file called robots.txt in the top-level directory of the website.
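A minimal robots.txt might look like the following; the path names are hypothetical examples, not part of the standard itself:

```
# Applies to all robots
User-agent: *
Disallow: /private/
Disallow: /tmp/

# Applies only to a robot that identifies itself as "ExampleBot"
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` line opens a group of rules, and the `Disallow` lines in that group list URL path prefixes the named robot should not fetch.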

The robot exclusion standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive websites, or by webmasters to proofread source code. The standard complements Sitemaps, a robot inclusion standard for websites.
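A cooperating robot checks robots.txt before fetching a URL. Python's standard library ships a parser for this, which the sketch below uses; the rules and URLs are hypothetical examples:

```python
# Minimal sketch: evaluating robots.txt rules with Python's standard
# library (urllib.robotparser). Rules and URLs here are made up.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() accepts the robots.txt body as a list of lines, so no
# network access is needed for this illustration.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A polite crawler calls can_fetch() before requesting each URL.
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/index.html"))  # True
```

In a real crawler one would call `rp.set_url(".../robots.txt")` followed by `rp.read()` to download the file, then consult `can_fetch()` with the crawler's own user-agent string.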



