I've been getting a lot of error e-mails caused by web crawlers hitting parts of my site without any request data, and I was wondering: what is the best way to handle web crawlers in Django? Should I issue a redirect when I come across an empty QueryDict?
You could consider adding a robots.txt file to disallow crawlers from the areas of your site that are intended for humans only, such as forms.
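A minimal robots.txt along those lines might look like this (the paths here are placeholders; substitute whatever human-only sections your site actually has):

```
User-agent: *
Disallow: /forms/
Disallow: /search/
```

Keep in mind that robots.txt is purely advisory: well-behaved crawlers (Googlebot, Bingbot, etc.) will honor it, but misbehaving bots won't. So it's still worth making your views tolerant of missing parameters, e.g. using `request.GET.get('key', default)` rather than `request.GET['key']`, so an empty QueryDict returns a sensible response instead of raising an exception that gets e-mailed to you.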