I know that the number of bits in a char is given by CHAR_BIT from <limits.h>, and that sizeof(char) is always 1. But what about the other basic data types: are their sizes defined relative to CHAR_BIT?
For example, Wikipedia gives the minimum size of an int in C as 16 bits, while other sites such as GeeksforGeeks give it as 2 bytes. Which definition is correct, given that a byte is not necessarily 8 bits?
The minimum required range of values for each type is defined numerically, not in terms of bits or bytes. 5.2.4.2.1 Sizes of integer types <limits.h> has entries like:
- minimum value for an object of type short int
  SHRT_MIN    -32767    // -(2^15 - 1)
- maximum value for an object of type short int
  SHRT_MAX    +32767    // 2^15 - 1
All of these limits happen to match what you would get by implementing the types with two's complement or sign-magnitude representation and bytes that are a multiple of 8 bits. But since they are only minimums, the standard doesn't require such representations.
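
As a rough illustration (a minimal sketch, assuming a hosted implementation where <stdio.h> is available), you can print what your own implementation chose. Only the ranges exposed by the <limits.h> macros are guaranteed; CHAR_BIT and the sizeof results are implementation-specific:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT is the number of bits in a byte; it is at least 8. */
    printf("CHAR_BIT           = %d\n", CHAR_BIT);

    /* sizeof is measured in bytes (chars), so these are byte counts,
       not bit counts. */
    printf("sizeof(short)      = %zu\n", sizeof(short));
    printf("sizeof(int)        = %zu\n", sizeof(int));
    printf("sizeof(long)       = %zu\n", sizeof(long));

    /* The guaranteed part is the range, not the width. */
    printf("SHRT_MIN..SHRT_MAX = %d..%d\n", SHRT_MIN, SHRT_MAX);
    printf("INT_MIN..INT_MAX   = %d..%d\n", INT_MIN, INT_MAX);

    /* Bits in the object representation of int on this implementation
       (may include padding bits). */
    printf("bits in int        = %zu\n", sizeof(int) * CHAR_BIT);

    return 0;
}
```

On a typical desktop platform this reports CHAR_BIT as 8 and sizeof(int) as 4, but an implementation with, say, 16-bit chars and sizeof(int) == 1 would still be conforming as long as the required ranges are met.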