I want to prevent users from accessing my robots.txt file, but I still want search engines to read it. Is that possible? If so, how do I do it? I believe that adding the following to .htaccess would work, but I am afraid it would also block search engines from accessing the file.
Order deny,allow
Deny from all
Thanks
A standard robots.txt is served from the root of your domain, so unless you can somehow reliably distinguish search engines from ordinary users, I don't think what you are asking for is possible.
You could try filtering by user agent, or possibly by IP range, but neither is reliable: user-agent strings are trivially spoofed, and crawler IP ranges change.
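As a rough sketch of the user-agent approach, something like the following could go in your .htaccess (the bot names here are examples, and this uses the same Apache 2.2-style Order/Deny/Allow directives as your snippet; remember that any client can fake its User-Agent header):

```apache
# Hypothetical sketch: allow requests for robots.txt only when the
# User-Agent looks like a known crawler; deny everyone else.
# NOTE: User-Agent strings are trivially spoofed, so this is not secure.
<Files "robots.txt">
    SetEnvIfNoCase User-Agent "Googlebot|Bingbot|Slurp" allow_bot
    Order deny,allow
    Deny from all
    Allow from env=allow_bot
</Files>
```

A determined user can still read the file by sending a crawler's User-Agent string, so treat this as obfuscation at best.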
Is there a reason you don't want your users to see what is in your robots.txt file? After all, everything in that file is public by design.