Because of the way conversions and operations are defined in C, it often seems not to matter whether you use a signed or an unsigned variable:
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t u; int8_t i;
    u = -3;  i = -3;   // u wraps to 253
    u *= 2;  i *= 2;   // u = 250, i = -6
    u += 15; i += 15;  // u = 9,   i = 9
    u >>= 2; i >>= 2;  // u = 2,   i = 2
    printf("%u\n", u); // -> 2
    printf("%u\n", i); // -> 2
    return 0;
}
So, is there a set of rules that tells under which conditions the signedness of a variable really makes a difference?
It matters in these contexts (each case is demonstrated in the sketch after this list):

- Division and modulo: -2/2 == -1, but -2u/2 == UINT_MAX/2; likewise, -3%4 == -3, but -3u%4 == 1.
- Shifts: for negative values of signed types, the results of >> and << are implementation-defined and undefined, respectively. For unsigned values, they are always defined.
- Relational operators: -2 < 0, but -2u > 0, because -2u converts to the unsigned value UINT_MAX - 1.
- Overflow: x + 1 > x may be assumed by the compiler to be always true if and only if x has a signed type, because signed overflow is undefined behavior while unsigned arithmetic wraps around.
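
Here is a minimal sketch of all four contexts, assuming 32-bit int and unsigned int (so UINT_MAX == 4294967295); the commented values are what it prints:

#include <stdio.h>
#include <limits.h>

int main(void) {
    // assumes 32-bit int/unsigned, i.e. UINT_MAX == 4294967295

    // division and modulo: truncation toward zero vs. modular arithmetic
    printf("-2/2    = %d\n", -2 / 2);    // -1
    printf("-2u/2   = %u\n", -2u / 2);   // 2147483647 == UINT_MAX/2
    printf("-3%%4    = %d\n", -3 % 4);    // -3
    printf("-3u%%4   = %u\n", -3u % 4);   // 1

    // relational operators: 0 converts to unsigned; -2u is UINT_MAX - 1
    printf("-2 < 0  : %d\n", -2 < 0);    // 1
    printf("-2u > 0 : %d\n", -2u > 0);   // 1

    // shifts: always defined for unsigned operands; by contrast,
    // (-2) >> 1 is implementation-defined and (-2) << 1 is undefined behavior
    unsigned v = -2u;
    printf("v >> 1  = %u\n", v >> 1);    // 2147483647 == UINT_MAX/2

    // overflow: unsigned arithmetic wraps, so x+1 > x is false at UINT_MAX;
    // for signed x, overflow in x+1 is undefined behavior, which is why
    // the compiler may assume x+1 > x is always true for signed x
    unsigned x = UINT_MAX;
    printf("x+1 > x : %d\n", x + 1 > x); // 0, because x+1 wraps to 0

    return 0;
}

The sketch deliberately shifts only unsigned operands, so its output is the same on every conforming implementation with 32-bit unsigned int.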