
gcc implicit signedness of constants


I've encountered some interesting behavior in gcc's interpretation of the signedness of constants. I have a piece of code which (greatly simplified) looks like this:

#include <stdint.h>

#define SPECIFIC_VALUE 0xFFFFFFFF

//...

int32_t value = SOMETHING;
if (value == SPECIFIC_VALUE) {
    // Do something
}

When I compile the above, I get the warning: comparison between signed and unsigned integer expressions [-Wsign-compare]

All well and good -- it seems that gcc interprets the hex constant as unsigned and doesn't like the comparison to a signed integer. However, if I change the define to something like #define SPECIFIC_VALUE 0x7FFFFFFF, the warning goes away. Again, I'm not particularly surprised -- the sign bit being zero would make gcc happier about interpreting the constant as a signed value.

What really surprises me is that if I change the definition to #define SPECIFIC_VALUE INT32_C(0xFFFFFFFF), I STILL get the warning. I would have expected that explicitly telling the compiler to interpret my constant as a signed value would silence the warning.
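For reference, here is a minimal file reproducing all three cases (the macro names are illustrative; compile with gcc -Wsign-compare):

#include <stdint.h>

#define VALUE_PLAIN   0xFFFFFFFF           // typed unsigned int -> warns
#define VALUE_NO_MSB  0x7FFFFFFF           // typed int          -> no warning
#define VALUE_INT32C  INT32_C(0xFFFFFFFF)  // still warns

int check(int32_t value) {
    return value == VALUE_PLAIN      // -Wsign-compare fires here
        || value == VALUE_NO_MSB     // fine: int == int
        || value == VALUE_INT32C;    // -Wsign-compare fires here too
}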


Solution

  • Read C11 § 6.3.1.1 about the conversions applied to integers; § 6.4.4.1 ¶5 specifies the type an integer constant gets. gcc sticks to these rules.

    The hex constant is indeed unsigned int by the standard (presuming 32-bit int): an unsuffixed hexadecimal constant gets the first of the types int, unsigned int, long int, unsigned long int, ... that can represent its value, and 0xFFFFFFFF does not fit in a 32-bit signed int. So gcc conforms, and this is not by chance!
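    You can watch that rule in action with C11's _Generic (a quick sketch, assuming 32-bit int; TYPE_NAME is just an illustrative helper):

    #include <stdio.h>

    #define TYPE_NAME(x) _Generic((x),   \
        int: "int",                      \
        unsigned int: "unsigned int",    \
        long: "long",                    \
        unsigned long: "unsigned long",  \
        default: "something else")

    int main(void) {
        puts(TYPE_NAME(0xFFFFFFFF));  // "unsigned int": value does not fit in int
        puts(TYPE_NAME(0x7FFFFFFF));  // "int": value fits, so int is chosen
        return 0;
    }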

    If you clear the MSbit (0x7FFFFFFF), the constant can be represented as a (signed) int, so both sides of the comparison are signed and the warning goes away. Still standard.

    For the third message, the question does not show the definition of INT32_C, so I cannot say for certain. But I think you can solve that yourself now; just keep in mind that the type cannot be determined inside the #define, but only after the macro has been expanded.
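    For what it's worth, glibc's <stdint.h> defines these constant macros along these lines (a paraphrase, not the verbatim header):

    /* int is at least 32 bits, so no suffix is needed and
       the macro leaves the constant untouched: */
    #define INT32_C(c) c
    /* only the 64-bit variant appends a suffix: */
    #define INT64_C(c) c ## L    /* c ## LL on 32-bit targets */

    If your libc does the same, INT32_C(0xFFFFFFFF) expands to plain 0xFFFFFFFF, which is still typed unsigned int by the rule above -- hence the warning persists.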

    The general rule is to append U to the constant (yes, for hex constants too) if you really want unsigned. Or cast the constant:

    #define UVALUE ((uint32_t)0x7FFFFFFF)
    

    That would be even better here, as it does not rely on the actual size of int.
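    Applied to the question's example, casting to the signed type being compared against also silences the warning (a sketch; note that converting 0xFFFFFFFF to int32_t is implementation-defined, though it yields -1 on two's complement machines):

    #include <stdint.h>

    #define SPECIFIC_VALUE ((int32_t)0xFFFFFFFF)  /* -1 on two's complement */

    void check(int32_t value) {
        if (value == SPECIFIC_VALUE) {  /* both operands int32_t: no warning */
            /* Do something */
        }
    }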