Given the following code snippet:
signed char x = 150;
unsigned char y = 150;
printf("%d %d\n", x, y);
The output is:
-106 150
However, I'm using the same format specifier for variables that are represented in memory the same way. How does printf know whether the value is signed or unsigned?
Memory representation in both cases is:
10010110
How does printf know if the variable passed is signed or unsigned?
The printf function doesn't "know".
You effectively tell it by using either a signed conversion specifier (d or i) or an unsigned conversion specifier (o, u, x or X).
And if you print a signed integer as unsigned or vice versa, printf just does what you told it to do.
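Here is a minimal sketch of that point (the values are my own, and the 4294967295 output assumes a 32-bit int): the conversion specifier, not the argument's declared type, decides how the bits are interpreted.

#include <stdio.h>

int main(void) {
    int n = -1;                   /* all bits set in two's complement */
    printf("%d\n", n);            /* read as signed: prints -1 */
    printf("%u\n", (unsigned)n);  /* same bits read as unsigned: prints 4294967295 with a 32-bit int */
    return 0;
}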
I used the same specifier "%d", and it printed different values (the positive one and the negative one).
In your example, you are printing signed and unsigned char values.
signed char x = 150;
The value in x is -106 (8 bits, signed) because 150 is greater than the largest value for signed char. (The signed char type's range is -128 to +127 with any hardware / C compiler that you are likely to encounter.)
unsigned char y = 150;
The value in y is 150 (8 bits, unsigned) as expected.
At the call site, the signed char value -106 is sign extended to a larger integer type (int, by the default argument promotions). The unsigned char value 150 is converted without sign extension.
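You can see the two promotions side by side in this sketch (it assumes a 32-bit int, and the casts to unsigned are only there so the promoted bit patterns can be printed with %x):

#include <stdio.h>

int main(void) {
    signed char x = -106;  /* bit pattern 10010110 */
    unsigned char y = 150; /* same bit pattern 10010110 */
    /* Both arguments are promoted to int before printf ever sees them. */
    printf("%d %d\n", x, y);                      /* -106 150 */
    printf("%x %x\n", (unsigned)x, (unsigned)y);  /* ffffff96 96: sign extension vs zero extension */
    return 0;
}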
By the time printf is called, the values that have been passed to it have different representations.