I have unwanted traffic on my website. This kind of traffic doesn't give me any benefit; it only consumes bandwidth. I want to know how to detect this kind of traffic on my website and stop it.
If I learn the IP address or referrer URL, I can block it from .htaccess, but then I have to add each new IP or referrer URL by hand. I want this to be automatic: something that detects or identifies the spam hits and blocks them on its own.
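For context, this is the sort of manual rule I add today (a sketch for Apache 2.4 with mod_rewrite enabled; the IP address and referrer domain are made-up placeholders):

```
# Block one specific IP address (placeholder, not a real offender).
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
</RequireAll>

RewriteEngine On
# Refuse requests whose Referer matches a known spam domain (hypothetical example).
RewriteCond %{HTTP_REFERER} spam-example\.com [NC]
RewriteRule .* - [F,L]
```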
Add a robots.txt file to your web site root to prevent most well-behaved automatic crawlers from accessing your site.
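A minimal robots.txt that disallows all crawling (compliant bots will honour it; abusive bots may simply ignore it):

```
# Served from the site root, e.g. https://example.com/robots.txt (example.com is a placeholder).
# Disallow all compliant crawlers from every path on the site.
User-agent: *
Disallow: /
```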
You could also add a rewrite rule that sends the request to an error page whenever the Referer header is set (i.e. the client arrived through a link, a search engine, or similar). That effectively allows only direct access to the web site.
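A rough sketch of such a rule in .htaccess (assuming mod_rewrite is available; /error.html is a hypothetical error page on your own site):

```
RewriteEngine On
# Skip the error page itself so the rule cannot redirect in a loop.
RewriteCond %{REQUEST_URI} !^/error\.html$
# A non-empty Referer means the visitor followed a link from somewhere else.
RewriteCond %{HTTP_REFERER} !^$
# Send those requests to the error page instead of serving the real content.
RewriteRule ^ /error.html [R=302,L]
```

Note that this also turns away legitimate visitors coming from search results or links, which is exactly the "only direct access" trade-off described above.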
The real answer: do not bother