While playing around with pointers I came across something interesting.
I initialized a 16-bit unsigned integer variable to the number 32771 and then assigned the address of that variable to an 8-bit unsigned integer pointer.
Now 32771, in unsigned 16-bit form, has the binary representation 1000000000000011. So when dereferencing the 8-bit pointer the first time, I expected it to print the most significant byte, 10000000, which is 128, and after incrementing the pointer and dereferencing it again, I expected it to print the least significant byte, 00000011, which is 3.
In actuality, the first dereference printed 3 and the second printed 128, so the two bytes came out in the opposite order to the one I expected.
#include <stdio.h>

int main(){
    __uint16_t a = 32771;
    __uint8_t *p = (__uint8_t *)&a;
    printf("%d\n", *p);  // Expected 128, but actual output 3.
    ++p;
    printf("%d\n", *p);  // Expected 3, but actual output 128.
    return 0;
}
I know that a binary number is written with its most significant bit on the left, yet here the two bytes of the value appear in memory in the opposite order to what I expected. Why?
32771 is 32768 plus 3. In binary, it's 1000000000000011.
If you split it into the most significant byte and the least significant byte, you get 128 and 3, because 128 * 256 + 3 = 32771.
So the bytes are 3 and 128. There's no particular reason one should come before the other: your CPU can store the bytes that make up a multi-byte number in whatever order it wants. Apparently, yours stores the least significant byte at a lower address than the most significant byte, which is known as little-endian byte order.
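Here is a minimal sketch of the same idea, assuming a typical little-endian machine such as x86 and using the standard <stdint.h> type names rather than the __uint16_t internals from the question. It first splits the value arithmetically, which does not depend on byte order, and then reads the bytes through a pointer, which exposes the order they are actually stored in:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint16_t a = 32771;               /* binary 1000000000000011 */
    uint8_t *p = (uint8_t *)&a;

    /* Splitting the value arithmetically is independent of byte order. */
    printf("high byte: %d\n", a >> 8);    /* 128 */
    printf("low  byte: %d\n", a & 0xFF);  /* 3   */

    /* Reading the bytes through a byte pointer shows the order in memory.
       On a little-endian CPU (e.g. x86) this prints 3 then 128;
       on a big-endian CPU it would print 128 then 3. */
    printf("in memory: %d %d\n", p[0], p[1]);
    return 0;
}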