I have a CFBitVector that looks like '1000000000000000' (16 bits). I pass a two-byte array to CFBitVectorGetBits, which fills it with the contents of the CFBitVector. After this call, bytes looks like:

bytes[0] == 0x80
bytes[1] == 0x00

This is exactly what I would expect. However, when I memcpy the contents of bytes into an unsigned int bytesValue, the value is 128 when it should be 32768. The decimal value 128 is represented by the hex value 0x0080, so it seems the byte order is reversed while performing the memcpy. What is going on here? Is this just an issue with endianness?

Thanks
CFMutableBitVectorRef bitVector = CFBitVectorCreateMutable(kCFAllocatorDefault, 16);
CFBitVectorSetCount(bitVector, 16);
CFBitVectorSetBitAtIndex(bitVector, 0, 1);   // vector is now 1000000000000000

CFRange range = CFRangeMake(0, 16);
Byte bytes[2] = {0, 0};
unsigned int bytesValue = 0;

CFBitVectorGetBits(bitVector, range, bytes); // bytes[0] == 0x80, bytes[1] == 0x00
memcpy(&bytesValue, bytes, sizeof(bytes));   // writes only the low 2 of bytesValue's 4 bytes
CFRelease(bitVector);                        // balance the Create
return bytesValue;
What is going on here? Is this just an issue with endianness?

Yes. Your computer is little endian. On a little-endian machine, the 16-bit value 32768 (0x8000) is represented in memory as:

00 80

What you have is the opposite:

80 00

which, read as a little-endian integer, is 128, exactly what you're seeing.