I'm fairly new to C/C++, and I'm trying to follow some guidelines which suggest using the `stdint.h`-defined types where possible (`uint8_t`, etc., instead of `unsigned char`).
However, it seems like when you're calling an API which expects a `char*` buffer (such as `recv`), you have to use whatever type that API dictates.
What I don't understand is why, if you're reading bytes, you wouldn't want them to be unsigned, so you get values between 0 and 255 instead of between -128 and 127.
I don't know if this is implementation-specific, but I've only ever seen `char` default to signed.
Are you expected to cast the results of calls like this to an `unsigned char*` or `uint8_t*` to allow you to interpret the higher positive values?
On Windows, `recv` is part of the Winsock API, which originated as a clone of the 4.2BSD sockets API. 4.2BSD predates ANSI C and the `void` keyword, so it uses `char *` because that was the closest thing to a generic pointer at the time.
Later, after `void` was introduced, BSD and other Unix systems updated the definition of `recv`, and now they all use `void *`. The motivation for this change is clear: `char *` isn't really a generic pointer, and using it as one makes your code uglier, with more casts. The `void *` version of `recv` is mandated by POSIX too, so the only OS vendor that doesn't have it is the one that cares the least about POSIX compliance and code aesthetics... Microsoft.