I have tried to multiply two numbers, namely 100000 and 100000 + 1, in a C program, but I am not getting the correct output.
printf("%lld",(100000)*(100001));
I have tried the above code on different compilers, but I always get 1410165408 instead of 10000100000.
Well, let's multiply
int64_t a = 100000; /* int64_t requires <stdint.h> */
int64_t b = 100001;
int64_t c = a * b;  /* both operands are 64-bit, so the product is computed in 64 bits */
And we'll get (binary)
1001010100000011010110101010100000 /* 10000100000 decimal */
but if you convert it to int32_t
int32_t d = (int32_t) c;
you'll get the last 32 bits only, throwing away the leading 10 (the top two binary digits):
01010100000011010110101010100000 /* 1410165408 decimal */
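You can watch the truncation happen directly. Here is a minimal sketch; note that on a typical two's-complement platform the int64_t-to-int32_t conversion keeps the low 32 bits, although strictly speaking the result of such an out-of-range conversion is implementation-defined in standard C:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int64_t c = (int64_t)100000 * 100001; /* product computed in 64 bits */
    int32_t d = (int32_t)c;               /* keeps only the low 32 bits */
    printf("%lld\n", (long long)c);       /* prints 10000100000 */
    printf("%d\n", (int)d);               /* prints 1410165408 */
    return 0;
}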
The simplest way out, probably, is to declare both constants as 64-bit values (the LL suffix stands for long long):
printf("%lld",(100000LL)*(100001LL));