User-agent Directive in Robots.txt
The `User-agent` directive in the robots.txt file specifies the search engine bot, or group of bots, to which the listed rules apply. The directive is case-insensitive, meaning `User-agent: Yandex` and `user-agent: Yandex` are treated the same.
*Figure: How the User-agent Directive Works in Yandex*
When a specific bot, such as Yandex, is named, its rules override the more general `User-agent: *` directives. This allows tailored instructions for specific bots while maintaining a fallback for all others. For instance, if `User-agent: Yandex` is defined, the rules under `User-agent: *` are ignored by Yandex bots.
If no `User-agent` directives are present, bots assume unrestricted access to the website. It is therefore crucial to define these directives clearly, so that crawling is managed effectively and sensitive or irrelevant sections of your site are appropriately restricted.
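The precedence rule described above can be observed with Python's standard-library `urllib.robotparser`, which selects a bot's own group before falling back to the `User-agent: *` group. The rules and paths below are hypothetical examples chosen for illustration:

```python
from urllib import robotparser

# Hypothetical rules: a specific group for YandexBot plus a fallback group.
rules = """\
User-agent: YandexBot
Disallow: /private

User-agent: *
Disallow: /cgi-bin
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# YandexBot matches its own group, so the fallback /cgi-bin rule is ignored.
print(rp.can_fetch("YandexBot", "/cgi-bin/app"))   # True
print(rp.can_fetch("YandexBot", "/private/page"))  # False

# Any other bot falls back to the User-agent: * group.
print(rp.can_fetch("Googlebot", "/cgi-bin/app"))   # False
```

Note that `urllib.robotparser` implements the basic standard only; it does not understand non-standard path wildcards such as `/*id=` used in the Yandex example later in this section.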
Example robots.txt File
The following shows robots.txt directives for specific user agents:
```
User-agent: YandexBot # will only be used by the main indexing robot
Disallow: /*id=

User-agent: Yandex # will be used by all Yandex robots
Disallow: /*sid= # except the main indexer

User-agent: * # will not be used by Yandex robots
Disallow: /cgi-bin
```
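The group-selection logic behind this example can be sketched in a few lines of Python. This is a simplified illustration, not a full robots.txt parser: it only reads `Disallow` values, matches agent tokens by case-insensitive substring, and prefers the longest (most specific) matching token, which reproduces the "specific group wins over `*`" behavior described above:

```python
def applicable_disallows(robots_txt: str, bot_name: str) -> list[str]:
    """Return the Disallow rules of the most specific matching group."""
    groups: dict[str, list[str]] = {}   # agent token -> its Disallow rules
    members: list[str] = []             # agent tokens of the group being read
    reading_agents = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not reading_agents:      # a new group starts here
                members = []
            members.append(value.lower())
            reading_agents = True
        else:
            reading_agents = False
            if field == "disallow" and value:
                for agent in members:
                    groups.setdefault(agent, []).append(value)
    bot = bot_name.lower()
    # Prefer the longest (most specific) agent token that matches the bot.
    matches = [a for a in groups if a != "*" and a in bot]
    if matches:
        return groups[max(matches, key=len)]
    return groups.get("*", [])          # fall back to the catch-all group

example = """\
User-agent: YandexBot # will only be used by the main indexing robot
Disallow: /*id=

User-agent: Yandex # will be used by all Yandex robots
Disallow: /*sid= # except the main indexer

User-agent: * # will not be used by Yandex robots
Disallow: /cgi-bin
"""

print(applicable_disallows(example, "YandexBot"))     # ['/*id=']
print(applicable_disallows(example, "YandexImages"))  # ['/*sid=']
print(applicable_disallows(example, "Googlebot"))     # ['/cgi-bin']
```

Running this against the example file shows each bot picking exactly one group: the main indexer gets its dedicated rules, other Yandex robots use the `Yandex` group, and everyone else falls back to `*`.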