I am watching Jerry Cain's Programming Paradigms Lecture 3 video, where he demonstrates the effect of an element assignment after casting between an int array and a short array. Essentially, the argument is that if you assign an int array element arr[3] = 128, then temporarily cast the int array to a short * and assign arr[6] = 2, then arr[3] should become 128 + 512 = 640, because the 2 would end up sitting at the 2^9 position. Code to demonstrate:
#include <stdio.h>

int main() {
    printf("sizeof(int) is %lu\n", sizeof(int));
    printf("sizeof(short) is %lu\n", sizeof(short));

    int arr[5];
    arr[3] = 128;
    ((short *)arr)[6] = 2;
    printf("arr[3] is equal to %d\n", arr[3]); // expect 640, get 2 instead
    return 0;
}
When I run this code though, I get the following output:
sizeof(int) is 4
sizeof(short) is 2
arr[3] is equal to 2
I expect arr[3] to be equal to 640, but instead it is simply equal to 2. I am admittedly a C noob - can anyone explain?
Big-endian vs little-endian, I think.
The code is inherently platform-specific (officially, almost certainly undefined behaviour). I'm not sure it should be taught at all, but that, I guess, is an issue for another time.
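(As an aside: if you do want to poke at the bytes of an int without the type-punning problem, the well-defined route is unsigned char access or memcpy. Here is a minimal sketch of my own, not from the lecture, assuming a 2-byte short and 4-byte int; on a little-endian machine it prints 131200, i.e. 0x00020080.)

#include <stdio.h>
#include <string.h>

int main(void)
{
    int arr[5];
    short two = 2;

    arr[3] = 128;
    /* Overwrite the two highest-addressed bytes of arr[3] (its upper half
       on a little-endian machine) without any pointer-type punning. */
    memcpy((char *)&arr[3] + sizeof(short), &two, sizeof(two));
    printf("arr[3] is equal to %d (0x%08X)\n", arr[3], arr[3]);
    return 0;
}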
The 2 is assigned to two of the four bytes of arr[3]. If you assigned to ((short *)arr)[7] instead, you might see the expected result.
What machine are you testing on (what type of CPU)?
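(If you are not sure, a quick way to check the byte order is to look at the first byte of a known int value; this is a throwaway sketch of mine, assuming a 4-byte int.)

#include <stdio.h>

int main(void)
{
    int probe = 1;
    /* On a little-endian machine the least significant byte comes first,
       so the first byte of the int 1 is 1; on a big-endian machine it is 0. */
    if (((unsigned char *)&probe)[0] == 1)
        printf("little-endian\n");
    else
        printf("big-endian\n");
    return 0;
}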
On second thoughts - although part of the issue is perhaps big-endian vs little-endian, the other problem is short vs char. Here's some more code that shows various pathways to the solution:
#include <stdio.h>

int main(void)
{
    printf("sizeof(int) is %lu\n", sizeof(int));
    printf("sizeof(short) is %lu\n", sizeof(short));

    int arr[5];

    /* Original experiment: write a short into the lower-addressed half of arr[3]. */
    arr[3] = 128;
    ((short *)arr)[6] = 2;
    printf("arr[3] is equal to %8d (0x%08X)\n", arr[3], arr[3]);

    /* Write the short into the other (higher-addressed) half of arr[3]. */
    arr[3] = 128;
    ((short *)arr)[7] = 2;
    printf("arr[3] is equal to %8d (0x%08X)\n", arr[3], arr[3]);

    /* Write a single byte into each of the four bytes of arr[3] in turn
       (bytes 12..15 of the array). */
    for (int i = 12; i < 16; i++)
    {
        arr[3] = 128;
        ((char *)arr)[i] = 2;
        printf("arr[3] is equal to %8d (0x%08X) i = %d\n", arr[3], arr[3], i);
    }
    return 0;
}
The output of this revised code is:
sizeof(int) is 4
sizeof(short) is 2
arr[3] is equal to 2 (0x00000002)
arr[3] is equal to 131200 (0x00020080)
arr[3] is equal to 2 (0x00000002) i = 12
arr[3] is equal to 640 (0x00000280) i = 13
arr[3] is equal to 131200 (0x00020080) i = 14
arr[3] is equal to 33554560 (0x02000080) i = 15
Tested on Mac OS X 10.7.2 with GCC 4.2.1 from Xcode 4.2 (LLVM).
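For what it's worth, dumping the bytes of arr[3] one at a time makes the results above easier to read. This is a small sketch of my own (not part of the original test): on a little-endian machine like this one, the least significant byte (0x80) sits at the lowest address, which is why poking byte 13 is what produces the 640 the questioner expected.

#include <stdio.h>

int main(void)
{
    int arr[5];
    arr[3] = 640; /* the value the questioner expected: 0x00000280 */

    /* Print each byte of arr[3] from the lowest address to the highest.
       Little-endian output: 80 02 00 00; big-endian output: 00 00 02 80. */
    unsigned char *p = (unsigned char *)&arr[3];
    for (size_t i = 0; i < sizeof(arr[3]); i++)
        printf("byte %zu: 0x%02X\n", i, p[i]);
    return 0;
}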