
Signed and unsigned integers in C


I wrote this program as an exercise to understand how signed and unsigned integers work in C. This code should simply print -9, the result of -4 + -5 stored in the variable c:

#include <stdio.h>

int main(void) {
    unsigned int a = -4;
    unsigned int b = -5;

    unsigned int c = a + b;
    printf("result is %u\n", c);

    return 0;
}

When this code runs, it gives me an unexpected result: 4294967287. I also tried casting c from unsigned to signed int, printf("result is %u\n", (int)c);, but that doesn't work either.

Could someone please explain why the program doesn't give the expected result?


Solution

  • This result is exactly what you should expect on a machine with 32-bit ints.

    unsigned int a = -4;
    

    sets a to the bit pattern 0xFFFFFFFC, which, interpreted as unsigned, is 4294967292 (2^32 - 4). Likewise, b is set to 2^32 - 5. When you add the two, you get 0x1FFFFFFF7 (8589934583), which is wider than 32 bits, so the extra bits are dropped, leaving 4294967287, which, as it happens, is 2^32 - 9. So if you had done this calculation on signed ints, you would have gotten exactly the same bit patterns, but printf would have rendered the answer as -9.
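    Your cast didn't change anything because %u still tells printf to interpret the value as unsigned; to see -9 you need to print with %d (or store the result in a signed int). A minimal sketch of both views of the same bits, assuming the usual 32-bit two's-complement int (the conversion of an out-of-range value to int is implementation-defined, but yields -9 on typical systems):

    #include <stdio.h>

    int main(void) {
        unsigned int a = -4;       /* wraps to 2^32 - 4 = 4294967292 */
        unsigned int b = -5;       /* wraps to 2^32 - 5 = 4294967291 */
        unsigned int c = a + b;    /* sum reduced modulo 2^32: 4294967287 */

        printf("as unsigned: %u\n", c);        /* prints 4294967287 */
        printf("as signed:   %d\n", (int)c);   /* prints -9 on two's-complement machines */

        return 0;
    }

    Equivalently, you could declare a, b, and c as plain int and print with %d throughout; the bit patterns are identical, only the interpretation changes.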