standard for robot exclusion - Definition. What is standard for robot exclusion

What is standard for robot exclusion - definition

Standard used to advise web crawlers and scrapers not to index a web page or site.
Also known as: Robots exclusion standard; Robots.txt protocol; Robots exclusion file; Robots exclusion protocol; Standard for Robot Exclusion; Robot Exclusion Standard; Robot Exclusion Protocol; Robot.txt; Robots Exclusion Standard; Robots.tx; Robot exclusion standard; ROBOTS.TXT; Humans.txt; Killer-robots.txt

standard for robot exclusion         
<World-Wide Web> A proposal to try to prevent the havoc wreaked by many of the early World-Wide Web robots when they retrieved documents too rapidly or retrieved documents that had side effects (such as voting). The proposed standard for robot exclusion offers a solution to these problems in the form of a file called "robots.txt" placed in the {document root} of the web site. {W3C standard (http://w3.org/TR/html4/appendix/notes.html#h-B.4.1.1)}. (2006-10-17)
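For illustration only, a minimal robots.txt of the kind the proposal describes might look like this (the paths and the crawl-delay value are made-up examples, and Crawl-delay itself is a non-standard extension honoured by only some crawlers):

    # Placed in the document root, e.g. https://example.com/robots.txt (hypothetical site)
    User-agent: *          # the rules below apply to every robot
    Disallow: /cgi-bin/    # please do not retrieve URLs under /cgi-bin/
    Disallow: /vote/       # avoids requests with side effects such as voting
    Crawl-delay: 10        # non-standard: ask robots to wait 10 seconds between requests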
robots.txt; Robots exclusion standard
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots.

Wikipedia

Robots.txt

robots.txt is a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit.

The standard relies on voluntary compliance. Not all robots comply with it; email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website where they have been told to stay out.
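As a sketch of what voluntary compliance looks like on the crawler side, a well-behaved robot written in Python could use the standard-library urllib.robotparser module to check each URL against a site's robots.txt before fetching it (the site and user-agent name below are hypothetical):

    import urllib.robotparser

    # Hypothetical crawler identity and target site
    USER_AGENT = "ExampleBot"
    ROBOTS_URL = "https://example.com/robots.txt"

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(ROBOTS_URL)
    rp.read()  # download and parse the site's robots.txt

    # Consult the parsed rules before every request
    url = "https://example.com/private/report.html"
    if rp.can_fetch(USER_AGENT, url):
        print("robots.txt allows fetching", url)
    else:
        print("robots.txt asks crawlers to stay out of", url)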

The "robots.txt" file can be used in conjunction with sitemaps, another robot inclusion standard for websites.