I was reading about ASP.NET Script Exploits, and one of the suggestions is:
(emphasis is mine; the suggestion is #3 in the section "Guarding Against Scripting Exploits" on the web page)
If you want your application to accept some HTML (for example, some formatting instructions from users), you should encode the HTML at the client before it is submitted to the server. For more information, see How to: Protect Against Script Exploits in a Web Application by Applying HTML Encoding to Strings.
Isn't that really bad advice? I mean, an attacker could send the HTML via curl or something similar, and the HTML would then arrive at the server unencoded, which can't be good(?)
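For example (hypothetical URL and field name), a request like this would deliver raw markup to the server without any client-side encoding ever running:

    curl -d "comment=<b>raw html</b><script>alert(1)</script>" http://example.com/comments.aspx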
Am I missing something here, or am I misinterpreting the statement?
Microsoft's statement is not wrong, but it is far from complete, and taken on its own it is dangerous.
Since validateRequest == true by default, you do indeed have to encode special HTML characters on the client just to get them past validateRequest and into the server in the first place.
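As a rough sketch of what that encoding transformation is (shown here server-side with System.Web's HttpUtility; the client-side script would have to produce the same entities before submitting):

    using System;
    using System.Web; // reference System.Web.dll

    class EncodingDemo
    {
        static void Main()
        {
            // Raw markup like this in a posted form field trips ASP.NET
            // request validation; the encoded form below passes through.
            string userHtml = "<b>hello</b>";

            string encoded = HttpUtility.HtmlEncode(userHtml);
            Console.WriteLine(encoded);                         // &lt;b&gt;hello&lt;/b&gt;
            Console.WriteLine(HttpUtility.HtmlDecode(encoded)); // <b>hello</b>
        }
    }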
But they should have emphasized that client-side encoding is certainly not a replacement for server-side filtering and validation.
Specifically, if you must accept HTML, the strongest advice is to use white-listing rather than black-listing (i.e. allow a very specific set of HTML tags and strip all the others). The Microsoft AntiXSS library is highly recommended for strong user-input filtering; it's far better than re-inventing the wheel yourself. A sketch of that approach follows.
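A minimal sketch, assuming the AntiXSS 4.x Sanitizer API (namespace Microsoft.Security.Application) — check the version you install, since the API has moved around between releases:

    using System;
    using Microsoft.Security.Application; // AntiXSS / HtmlSanitizationLibrary

    class SanitizeDemo
    {
        static void Main()
        {
            string untrusted = "Hi <b>there</b><script>alert('xss')</script>";

            // GetSafeHtmlFragment white-lists benign formatting tags
            // and strips everything else, including the <script> element.
            string safe = Sanitizer.GetSafeHtmlFragment(untrusted);
            Console.WriteLine(safe);
        }
    }

Run the sanitizer on the server after decoding the submitted input, never only on the client, since (as you noted) the client can be bypassed entirely.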