I am creating an ASP.NET website which forces users to accept a disclaimer. When they accept the disclaimer, a cookie and a session variable are set. On each page request a check is fired to see if the session or cookie is present.
We want to allow Google and other search bots to index/crawl all the pages without accepting the disclaimer.
What is the best way to do this? The only thing I can think of is a check on Request.ServerVariables, but I am not sure which values I should look for.
You need to bypass the session/cookie check when Googlebot is passing by.
So in this check you indeed need to look at the ServerVariables. For Google this would be something like:
HttpContext.Current.Request.ServerVariables["HTTP_USER_AGENT"].Contains("Googlebot")
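A minimal sketch of how this could be wired into the per-page check, assuming it lives in a common base page. The class name DisclaimerPage, the bot list, the "DisclaimerAccepted" keys and the Disclaimer.aspx URL are my own illustration, not from your code:

// Illustrative base page: bots and users who already accepted skip the redirect.
using System;
using System.Web;
using System.Web.UI;

public class DisclaimerPage : Page
{
    // User-Agent substrings of crawlers you want to let through (example list).
    private static readonly string[] BotSignatures =
        { "Googlebot", "Bingbot", "Slurp", "DuckDuckBot" };

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);

        if (IsSearchBot(Request) || DisclaimerAccepted())
            return; // allow the request through

        Response.Redirect("~/Disclaimer.aspx");
    }

    private static bool IsSearchBot(HttpRequest request)
    {
        string userAgent = request.ServerVariables["HTTP_USER_AGENT"] ?? string.Empty;
        foreach (string bot in BotSignatures)
        {
            if (userAgent.IndexOf(bot, StringComparison.OrdinalIgnoreCase) >= 0)
                return true;
        }
        return false;
    }

    private bool DisclaimerAccepted()
    {
        // Hypothetical key names; use whatever your accept handler actually sets.
        return Session["DisclaimerAccepted"] != null
            || Request.Cookies["DisclaimerAccepted"] != null;
    }
}

Pages that require the disclaimer would then inherit from DisclaimerPage instead of Page.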
Of course, anyone pretending to be Googlebot will now also be able to skip the disclaimer.
I wouldn't call this cloaking, but no guarantees as to whether Google likes it or not.