This one is simple.
printf("%lu\n", (unsigned long)(320 * 200));
That line of code prints "4294965760", which is definitely NOT equal to 320 * 200. What's wrong with it?
I am using Digital Mars C compiler in 16 bit medium memory model.
On a 16-bit system, if sizeof(int) == 2, then 320 * 200 (which equals 64000) is too big for a signed int, whose range is at least ±32,767 (usually -32,768 as well, though in theory that varies by platform). So you have arithmetic overflow: in practice, 64000 wraps around to -1536 in a 16-bit int, and casting that negative value to unsigned long sign-extends it, giving 4294967296 - 1536 = 4294965760 where long is 32 bits. The cast doesn't affect the multiplication; it only affects what happens to the result of the multiplication.
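If you want to see the wrap-around on a modern machine, here is a minimal sketch using int16_t as a stand-in for the 16-bit int of the original target (an assumption for this demo; the narrowing conversion is implementation-defined before C23, but two's-complement machines wrap):

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* 64000 doesn't fit in 16 bits; it wraps to -1536 */
    int16_t product = (int16_t)(320 * 200);

    /* converting the negative value to a 32-bit unsigned type
       adds 2^32: 4294967296 - 1536 = 4294965760 */
    uint32_t widened = (uint32_t)product;

    printf("%" PRId16 "\n", product);  /* prints -1536 */
    printf("%" PRIu32 "\n", widened);  /* prints 4294965760 */
    return 0;
}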
You'd do better with:
printf("%lu\n", 320UL * 200UL);
That forces the constants to unsigned long, so the multiplication itself is done in unsigned long arithmetic and 64000 fits comfortably in the result.
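Suffixing or casting just one operand also works, since the usual arithmetic conversions then promote the other operand to unsigned long before the multiplication:

printf("%lu\n", (unsigned long)320 * 200);  /* prints 64000 */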