I've placed the following header in my vhost config:

Header set X-Robots-Tag "noindex, nofollow"

The goal is simply to stop search engines from indexing my testing environment. The site is WordPress, and a plugin is installed to manage the meta robots settings per page. For example:
<meta name="robots" content="index, follow" />
So my question is: which directive will take precedence over the other, since both are being set on every page?
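For reference, a minimal sketch of where such a directive might live in the vhost (the hostname is a placeholder; this assumes mod_headers is enabled):

```apache
<VirtualHost *:80>
    # Placeholder hostname for the testing environment
    ServerName test.example.com

    # Ask crawlers that honor X-Robots-Tag not to index or follow
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
```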
I am not sure a definitive answer can be given, as the behavior may be implementation-dependent (on the robot side). However, I think there is reasonable evidence that X-Robots-Tag will take precedence over <meta name="robots" ...>. See:
One significant difference between X-Robots-Tag and the robots meta directive is:

- X-Robots-Tag is part of the HTTP response headers.
- <meta name="robots" ...> is part of the HTML document head.

Therefore X-Robots-Tag belongs to the HTTP protocol layer, while <meta name="robots" ...> belongs to the HTML layer.
Since they belong to different protocol layers, they are not parsed simultaneously by the (robot) client fetching the page: the HTTP headers are parsed first, and the HTML in a later step.
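To make that order concrete, here is a minimal sketch (purely illustrative; the function and class names are mine, and real crawlers are far more involved) of how a client that supports both mechanisms might resolve the directives: the HTTP header is consulted first, and the meta tag only as a fallback.

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "")


def effective_robots_directives(headers, html):
    """Hypothetical resolution: the X-Robots-Tag HTTP header, seen
    first, wins; the HTML meta tag is consulted only as a fallback."""
    header_value = headers.get("X-Robots-Tag")
    if header_value is not None:
        return {d.strip().lower() for d in header_value.split(",")}
    parser = RobotsMetaParser()
    parser.feed(html)
    if parser.robots is not None:
        return {d.strip().lower() for d in parser.robots.split(",")}
    return set()  # no directives at all: crawler defaults apply


# The testing-environment scenario from the question:
headers = {"X-Robots-Tag": "noindex, nofollow"}
html = '<html><head><meta name="robots" content="index, follow" /></head></html>'
print(effective_robots_directives(headers, html))  # header wins
print(effective_robots_directives({}, html))       # no header: meta applies
```

Under this (assumed) model, the conflicting meta tag set by the WordPress plugin never influences the outcome as long as the header is present.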
(Also, it should be noted that X-Robots-Tag and <meta name="robots" ...> are not supported by all robots. Google and Yahoo/Bing support both, but according to this, some robots support only <meta name="robots" ...>, and others support neither.)
Summary:

- If the robot honors X-Robots-Tag, it will be processed first; its restrictions (noindex, nofollow) apply, and <meta name="robots" ...> is ignored.
- Otherwise, the <meta name="robots" ...> directive applies.