
How to get SIZE_MAX in C89


I'm trying to get SIZE_MAX in C89.

I thought of the following way to find SIZE_MAX:

const size_t SIZE_MAX = -1;

Since the standard (§6.2.1.2 ANSI C) says:

When a signed integer is converted to an unsigned integer with equal or greater size, if the value of the signed integer is nonnegative, its value is unchanged. Otherwise: if the unsigned integer has greater size, the signed integer is first promoted to the signed integer corresponding to the unsigned integer; the value is converted to unsigned by adding to it one greater than the largest number that can be represented in the unsigned integer type.[28]

With footnote 28:

In a two's-complement representation, there is no actual change in the bit pattern except filling the high-order bits with copies of the sign bit if the unsigned integer has greater size.

This seems to have defined behavior, but I'm not quite sure I understand the wording of that paragraph correctly.

Note that this question is explicitly about C89, so the linked answer doesn't apply here, because that standard uses different wording.

If that doesn't work, the other way I came up with is:

size_t get_size_max(void) {
    static size_t max = 0;
    if (max == 0) {
        max -= 1U; /* wraps around to the largest value of size_t */
    }

    return max;
}

But I couldn't find anything about unsigned integer underflow in the standard, so I'm poking in the dark here.


Solution

  • You could use:

    #ifndef SIZE_MAX
    #define SIZE_MAX ((size_t)(-1))
    #endif
    

    The behaviour of converting -1 to an unsigned integer type is defined under C11 section 6.3.1.3 "Conversions - Signed and unsigned integers". C89 had an equivalent definition, numbered 3.2.1.2. In fact, you quoted the ISO C90 wording (§6.2.1.2) in your question; the relevant difference between ANSI C89 and ISO C90 is only that the sections are numbered differently.

    I would not recommend using a const variable, since it cannot be used in constant expressions.


    Note: This macro can't be used in C90 preprocessor arithmetic (#if), which only works on integer constant expressions containing no casts (and in which any remaining identifiers are replaced by 0), so sizeof tricks are ruled out as well. If you need SIZE_MAX in a preprocessor condition, you might need a system-specific definition; there's no standard way for the preprocessor to detect a typedef.
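    Putting the pieces together, a minimal usage sketch of the guarded macro (assuming a hosted C89 implementation):

    ```c
    #include <stddef.h>
    #include <stdio.h>

    /* Define SIZE_MAX only if the implementation doesn't already provide it. */
    #ifndef SIZE_MAX
    #define SIZE_MAX ((size_t)(-1))
    #endif

    int main(void)
    {
        size_t n = SIZE_MAX;
        /* Usable as an initializer and in runtime expressions, but not in
           #if, since the cast isn't valid preprocessor arithmetic. */
        printf("SIZE_MAX = %lu\n", (unsigned long)n);
        return 0;
    }
    ```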