
Multiplying very large hex numbers and printing them in C


I want to multiply two very big hex numbers and print the result, for example:

28B2D48D74212E4F x 6734B42C025D5CF7 = 1068547cd3052bbe5688de35695b1239

Since I expected the result to be a very big number, I used the unsigned long long int type:

unsigned long long int x = 0x28B2D48D74212E4F;   
unsigned long long int y = 0x6734B42C025D5CF7; 

and print the product like this:

fprintf(stdout, "%llx\n",  x*y);

What I get is exactly the lower half of the expected result:

5688de35695b1239

Why does it get truncated to exactly half? Is there anything bigger than unsigned long long?


Solution

  • The result you're looking for won't fit in a 64-bit unsigned long long, which is the usual size on a 64-bit platform; the multiplication wraps modulo 2^64, so the high bits are simply dropped.

    Newer versions of GCC do support 128-bit integers on 64-bit machines with __int128 (and unsigned __int128), and this works:

    unsigned long long int x = 0x28B2D48D74212E4FULL;
    unsigned long long int y = 0x6734B42C025D5CF7ULL;
    unsigned __int128 xy = x * (unsigned __int128)y;
    

    Note that you have to cast one of x or y to the wider type so the multiplication is done in 128 bits; otherwise the widening to 128 bits doesn't happen until after the (truncated) 64-bit multiply.
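
    To make the difference concrete, here is a minimal contrast (the variable names truncated and full are just for illustration):

    unsigned __int128 truncated = x * y;                    /* multiply is done in 64 bits; high bits already lost */
    unsigned __int128 full      = (unsigned __int128)x * y; /* operand widened first, so the multiply is 128-bit   */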

    The problem is that, as far as I can tell, printf() has no conversion specifier for a 128-bit integer, so you're going to have to roll your own a bit.

    Some reasonable discussion here: how to print __uint128_t number using gcc?
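
    As a rough sketch of what "rolling your own" could look like for decimal output (the helper name print_u128_dec is hypothetical, not a standard function):

    #include <stdio.h>

    /* Print an unsigned __int128 in decimal by peeling off digits
       with repeated division by 10. */
    static void print_u128_dec(unsigned __int128 v)
    {
        char buf[40];                 /* 2^128 - 1 has 39 decimal digits */
        int i = sizeof buf;
        buf[--i] = '\0';
        do {
            buf[--i] = (char)('0' + (int)(v % 10));
            v /= 10;
        } while (v != 0);
        fputs(&buf[i], stdout);
    }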

    But this worked for me on:

    gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-39)

    #include <stdio.h>

    int main(void)
    {
        unsigned long long int x = 0x28B2D48D74212E4F;
        unsigned long long int y = 0x6734B42C025D5CF7;

        /* widen one operand so the multiplication is done in 128 bits */
        unsigned __int128 xy = x * (unsigned __int128)y;

        /* print the high and low 64-bit halves back to back */
        printf("Result = %016llx%016llx\n",
               (unsigned long long)(xy >> 64),
               (unsigned long long)(xy & 0xFFFFFFFFFFFFFFFFULL));

        return 0;
    }

    The casts inside the printf are important: without them the shifted/masked values are still 128 bits wide, and all 128 bits are passed for each argument, but each %llx expects only 64 bits.

    Note that this is all entirely dependent on the underlying platform and is not portable; there's surely a way to use various #ifdefs and sizeofs to make it more general, but there's probably no super awesome way to make this work everywhere.
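
    As a starting point for such guards, GCC and Clang predefine __SIZEOF_INT128__ whenever __int128 is available, so a hedged check could look like this:

    #if defined(__SIZEOF_INT128__)
        unsigned __int128 xy = (unsigned __int128)x * y;   /* 128-bit path */
    #else
        #error "no 128-bit integer support on this compiler/target"
    #endif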