robots.txt is a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit.
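A minimal sketch of what such a file can look like, placed at the root of a site as /robots.txt (the directory path and the crawler name below are illustrative, not taken from any real site):

    # Allow all crawlers everywhere except the /private/ directory
    User-agent: *
    Disallow: /private/

    # Block one hypothetical crawler from the entire site
    User-agent: BadBot
    Disallow: /

Each User-agent line opens a group of rules addressed to a particular robot (or * for all robots), and the Disallow lines list path prefixes that robot is asked not to visit.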
The standard relies on voluntary compliance. Not all robots comply: email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website that they have been told to stay out of.
The "robots.txt" file can be used in conjunction with sitemaps, another robot inclusion standard for websites.