I am going through our code and doing some much-needed house cleaning. Many of our errors deal with variables that are not correctly scoped. While going through the error logs/emails, I noticed that the large majority of our errors are caused by bots/spiders. I don't think bots/spiders use session variables (I honestly haven't looked it up), but either way, is there anything about bots/spiders that would cause an application to produce more errors?
I have fixed some cases where spiders/bots were passing in bad/null URL variables. I get that, but even on normal page processing I see, as a whole, more errors from bots than from humans. I am not blaming them (I know it's our code), but I am curious why this would be.
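For illustration, here is the kind of defensive fix I mean, written as a minimal Python/Flask sketch (our actual stack aside; the route and the `id` parameter are made up):

```python
from flask import Flask, request, abort

app = Flask(__name__)

@app.route("/product")
def product():
    # Bots often omit or mangle query parameters, so never assume one exists.
    raw_id = request.args.get("id")  # returns None instead of raising when missing
    if raw_id is None or not raw_id.isdigit():
        abort(400)  # fail fast with a client error instead of a server-side crash
    return f"product {int(raw_id)}"
```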
My assumption is that our errors involving non-scoped variables are possibly due to threading issues with those variables. But a bot shouldn't be able to affect that. Or should it?
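To show what I mean by a threading problem, here is a contrived Python sketch (not our actual code): a module-level variable shared across request threads, so one request's value bleeds into another's.

```python
import threading
import time

current_user = None  # shared by every thread, like an unscoped/shared variable

def handle_request(user):
    global current_user
    current_user = user        # this request writes its value...
    time.sleep(0.01)           # ...then yields, letting another request interleave
    # By the time we read it back, a concurrent request may have overwritten it.
    print(f"expected {user}, got {current_user}")

threads = [threading.Thread(target=handle_request, args=(u,))
           for u in ("alice", "bot-crawler")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

My suspicion is that a crawler hitting many pages in parallel simply raises the odds of that interleaving actually happening.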
You've nailed most of it already. Bots do things the way bots want to do them: they access pages directly, arrive from unexpected links, and some even submit forms with junk data to see what's on the other side. They may access a page differently than you would expect, so variables or objects may not exist the way they would for a human visitor.
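The classic case is a page that assumes a login step already populated the session; a direct hit from a crawler skips that step entirely. A hedged Python/Flask sketch (the route names and the `user` session key are made up; your stack will differ):

```python
from flask import Flask, session, redirect, url_for

app = Flask(__name__)
app.secret_key = "dev-only"  # placeholder; Flask needs one to use sessions

@app.route("/account")
def account():
    # A bot that deep-links here never passed through /login, so the session
    # key this page relies on may simply not exist.
    user = session.get("user")  # .get() avoids a KeyError for sessionless visitors
    if user is None:
        return redirect(url_for("login"))
    return f"hello {user}"

@app.route("/login")
def login():
    return "login page"
```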
Imagine entering your house through the bedroom window (an unsecured entry point behind the login screen).
Great, you're in, and you can walk around the house (navigate the website).
Until the alarm goes off, because it was never disabled at the front door; you skipped that step (you didn't log in).
Now the alarm is going off (the admin is getting an email).
You can't leave through the front door, because the deadbolt requires a key on the inside which you don't have (you can't log out because you don't have a session).
So you try to leave through the back door, but the guard dog is out and there's a 6' fence to deal with (a random exit strategy that ends up causing even more errors).
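The practical takeaway is that every entry and exit has to survive a visitor who never came in the front door. One way to make the "exit" safe, again as a Flask-style sketch (the `user` session key is hypothetical):

```python
from flask import Flask, session, redirect

app = Flask(__name__)
app.secret_key = "dev-only"  # placeholder; Flask needs one to use sessions

@app.route("/logout")
def logout():
    # pop() with a default means a sessionless bot hitting /logout directly
    # doesn't raise; it just gets sent to the home page like everyone else.
    session.pop("user", None)
    return redirect("/")
```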