What is the mathematical formula to calculate the range of signed, unsigned, short and long data types in ANSI C?
Unsigned types have a range from 0 to 2^N - 1, where N is the effective number of bits used by the type.
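As a quick sanity check, here is a small C sketch that prints the unsigned range computed this way. It assumes unsigned int has no padding bits (so N equals sizeof(unsigned int) * CHAR_BIT, which holds on most platforms) and compares the result against UINT_MAX from <limits.h>:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* Storage bits in an unsigned int. ANSI C permits padding bits,
       so the effective number of value bits N may be smaller; this
       sketch assumes no padding, which holds on most platforms. */
    unsigned int n = (unsigned int)(sizeof(unsigned int) * CHAR_BIT);

    /* 2^N - 1 computed without overflow: ~0u sets every value bit. */
    unsigned int max = ~0u;

    printf("unsigned int: N = %u, range 0 to %u\n", n, max);
    printf("UINT_MAX from <limits.h>: %u\n", UINT_MAX);
    return 0;
}
```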
Signed types have an implementation-defined minimum:
two's complement: -(2^(N - 1))
all others: -(2^(N - 1) - 1)
The maximum for signed types is 2^(N - 1) - 1 in either case (see the sketch below).
(^ denotes exponentiation here, not XOR.)
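For signed types the same idea works. The sketch below again assumes int has no padding bits; it computes the maximum from the formula and lets <limits.h> report whichever implementation-defined minimum applies:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    int n = (int)(sizeof(int) * CHAR_BIT);   /* assumes no padding bits */

    /* 2^(N - 1) - 1 computed without overflow: set every bit below
       the sign bit. */
    int max = (int)(~0u >> 1);

    /* INT_MIN is -(2^(N - 1)) on two's complement implementations and
       -(2^(N - 1) - 1) on the others; <limits.h> reports the right one. */
    printf("int: N = %d, minimum %d, maximum %d\n", n, INT_MIN, INT_MAX);
    printf("computed maximum: %d\n", max);
    return 0;
}
```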