I'm really getting frustrated here. I'm trying to implement the CRC-CCITT algorithm, and I found a very nice example on a website. There is one line whose output I completely don't understand:
unsigned short update_crc_ccitt(unsigned short crc, char c) {
    [...]
    short_c = 0x00ff & (unsigned short) c;   /* keep only the low 8 bits of c */
    [...]
}
I want to calculate the CRC of the "test" string "123456789". So in the first run the char c is 1. From my understanding, short_c from the first run should be equal to 1 as well, but when I print it to the console, I get short_c = 49 for c = 1. How?
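For reference, here is a minimal standalone program reproducing what I see (the main wrapper is mine, not part of the original example):

#include <stdio.h>

int main(void)
{
    /* first character handed to update_crc_ccitt for "123456789" */
    char c = "123456789"[0];

    /* the line from the example */
    unsigned short short_c = 0x00ff & (unsigned short) c;

    printf("short_c = %hu\n", short_c);   /* prints: short_c = 49 */
    return 0;
}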
0x00ff in binary is:        1 1 1 1 1 1 1 1
char 1 in binary is:        0 0 0 0 0 0 0 1
the bitwise AND should be:  0 0 0 0 0 0 0 1
Where is my mistake?
The character '1' has ASCII code 0x31 = 49. This is different from the character with ASCII code 1 (which is the control character ^A). Since your test string is "123456789", the first c passed to the function is the character '1', so short_c = 49 is correct.
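A quick standalone check (my own sketch, not part of the CRC example) makes the difference visible:

#include <stdio.h>

int main(void)
{
    char digit = '1'; /* the character '1', ASCII 0x31 = 49 */
    char soh   = 1;   /* the character with code 1 (^A, SOH) */

    printf("'1' as a number: %d\n", digit); /* prints 49 */
    printf(" 1  as a number: %d\n", soh);   /* prints 1  */

    /* the mask from the CRC example applied to '1' gives 49 */
    unsigned short short_c = 0x00ff & (unsigned short) '1';
    printf("short_c = %hu\n", short_c);     /* prints 49 */
    return 0;
}

Note that for the CRC itself, 49 is exactly the byte value you want fed in; the algorithm works on raw bytes, so the code is behaving correctly.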