Execution of:
#define HIGH32(V64) ((uint32_t)(((V64) >> 32) & 0xffFFffFF))
#define LOW32(V64) ((uint32_t)((V64) & 0xffFFffFF))
uint32_t a = 0xffFFffFF;
uint32_t b = 0xffFFffFF;
uint64_t res = a * b;
printf("res = %08X %08X\n", HIGH32(res), LOW32(res));
Gives:
"res = 00000000 00000001"
But I expected fffffffe00000001. What did I do wrong? A plain 64-bit assignment works as expected:
res = 0x0123456789ABCDEF;
printf("res = %08X %08X\n", HIGH32(res), LOW32(res));
Gives:
res = 01234567 89ABCDEF
Environment:
$ gcc --version
gcc (GCC) 4.8.3
Copyright (C) 2013 Free Software Foundation, Inc.
$ gcc -v
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/i686-pc-cygwin/4.8.3/lto-wrapper.exe
Target: i686-pc-cygwin
$ file a.exe
a.exe: PE32 executable (console) Intel 80386, for MS Windows
What you currently have is effectively:
uint64_t res = (uint32_t)a * (uint32_t)b;
Because both operands are 32-bit, the multiplication a * b is performed in uint32_t arithmetic and wraps modulo 2^32; only the already-truncated 32-bit result (1, in this case) is then widened to 64 bits. You need to promote at least one operand to a 64-bit type before the multiplication:
uint64_t res = (uint64_t) a * b;