I have an ASP.NET download page that sends a file to the client, but I want to stop robots from downloading this file because it is large; from my logs I can see that a bot has downloaded it about 20 times. This is slowing down the server and consuming bandwidth.
I coded this page to count downloads and to detect whether the client has the .NET Framework installed, so I can serve a setup file that either bundles the .NET Framework or not.
I need some way to deny Google and other bots access to this page.
My download link is like download.aspx?pack=msp
Yes, add a robots.txt file to your site. It should contain a list of rules (suggestions, really) describing how spiders should behave.
Check out this article for more info. Also, for kicks, here is the robots.txt file used by Google.
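For your specific case, a minimal robots.txt (placed at the root of the site) might look like this. Note that `Disallow` matches by URL prefix, so blocking `/download.aspx` also covers variants like `download.aspx?pack=msp`:

```
User-agent: *
Disallow: /download.aspx
```

Keep in mind robots.txt is purely advisory: well-behaved crawlers like Googlebot will honor it, but misbehaving bots will ignore it, so it won't stop all automated downloads.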