I'm reading the C programming textbook¹, and in section 1.6 they say the conversion between a char containing a digit and an int can be done like this:
char character = '7';
int integerChar = character - '0';
I'm having trouble understanding what is happening here, and why the integer value is equal to the character minus the character '0'.
¹ Brian W. Kernighan and Dennis M. Ritchie, The C Programming Language, 2nd edn., 1988.
From the C Standard (5.2.1 Character sets)
- ...In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits shall be one greater than the value of the previous.
So it means that the Standard guarantees that, for example, the difference '1' - '0' is equal to 1, the difference '2' - '0' is equal to 2, and so on.
So, independent of the internal representation of characters (for example ASCII or EBCDIC), you can get the integer digit represented by a character c as c - '0'.
For example, in the ASCII character table the characters '0' through '9' have codes 48 through 57; in the EBCDIC character table they have codes 240 through 249.
Here is a demonstration program.
#include <stdio.h>

int main( void )
{
    for (char c = '0'; c <= '9'; c++)
    {
        printf( "'%c': %d\n", c, c - '0' );
    }
}
The program output is
'0': 0
'1': 1
'2': 2
'3': 3
'4': 4
'5': 5
'6': 6
'7': 7
'8': 8
'9': 9