I'm having trouble understanding how multibyte characters are represented in the ASCII table, in decimal format and then in hexadecimal.
For instance:
char *c = "é";
printf("%d\n%d", c[0], c[1]);
It displays:
-61
-87
In the ASCII table, "é" is 130 in decimal and 82 in hex. I understand that 82 is the hexadecimal value of 130, but how can we obtain 130 from -61 and -87?
Thanks in advance and sorry for my spelling
According to the UTF-8 charset (used, among others, by many GNU/Linux distributions), the character 'é' is encoded as the two-byte sequence 0xC3 0xA9, which is 11000011 10101001 in binary. With that, we can understand the results, assuming `char` is a signed 8-bit type with two's complement representation: the first byte 11000011 is 0xC3 = 195, which read as a signed char is 195 − 256 = −61; the second byte 10101001 is 0xA9 = 169, which read as a signed char is 169 − 256 = −87.

As for 130 (0x82): that value comes from a different, single-byte encoding — the old IBM "extended ASCII" code page 437, where 'é' is byte 0x82. It is a separate encoding of the same character, so there is no arithmetic that derives 130 from the two UTF-8 bytes.