I am trying to understand how putchar('0' + r); works. Below, the function takes an integer and transforms it to binary:
#include <stdio.h>

void to_binary(unsigned long n)
{
    int r;

    r = n % 2;
    if (n >= 2)
        to_binary(n / 2);
    putchar('0' + r);
}
I googled the definition of putchar but didn't find an explanation of this. To test it, I added a printf to see the value of r:
#include <stdio.h>

void to_binary(unsigned long n)
{
    int r;

    r = n % 2;
    if (n >= 2)
        to_binary(n / 2);
    printf("r = %d and putchar printed ", r);
    putchar('0' + r);
    printf("\n");
}
Then I ran it (typed 5) and got this output:
r = 1 and putchar printed 1
r = 0 and putchar printed 0
r = 1 and putchar printed 1
So I suppose that putchar('0' + r); prints 0 if r == 0 and 1 if r == 1, or does something else happen?
In C, '0' + digit
is a cheap way of converting a single-digit integer into its character representation in whatever character encoding is in use, such as ASCII or EBCDIC. For example, if you use ASCII, think of it as adding 0x30 (the code for '0') to the digit.
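A quick check makes this concrete (the printed codes below assume ASCII; on EBCDIC the numbers differ but the resulting characters are the same):

#include <stdio.h>

int main(void)
{
    int r = 5;

    printf("'0' has code %d (0x%x)\n", '0', '0');    /* 48 (0x30) in ASCII */
    printf("'0' + %d has code %d, i.e. '%c'\n",
           r, '0' + r, '0' + r);                     /* 53, i.e. '5' */
    return 0;
}

In your to_binary, r is always 0 or 1, so '0' + r is always the character '0' or the character '1'; the recursion just makes sure the high-order bits are printed first.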
The one assumption is that the character encoding stores the decimal digits contiguously, which holds for both ASCII and EBCDIC.
As pointed out in the comments, this property is required by both the C and C++ standards. The C standard says (5.2.1, paragraph 3):

In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits shall be one greater than the value of the previous.
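Because of that guarantee you can walk the digits in a loop, and the inverse conversion (a digit character minus '0') works too. A minimal sketch to illustrate both directions:

#include <stdio.h>

int main(void)
{
    /* '0' + d is the character for digit d; this works because the
       digits are contiguous in the execution character set */
    for (int d = 0; d <= 9; d++)
        putchar('0' + d);          /* prints 0123456789 */
    putchar('\n');

    /* the inverse: subtracting '0' from a digit character gives its value */
    char c = '7';
    printf("%d\n", c - '0');       /* prints 7 */
    return 0;
}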