
Faster outputs with a different format specifier


Consider the sample below:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        int t, n, x, i;
        long int num;
        scanf("%d", &t);
        while (t--) {
            scanf("%d", &n);
            num = 0;
            for (i = 1; i <= n; i++) {
                scanf("%d", &x);
                num = num + (x * pow(10, (n - i)));
            }
            printf("%ld\n", num);
        }
        return 0;
    }

This program takes 0.03 s for a given set of inputs. When I changed the format specifier in printf from %ld to %d, it took 0.02 s for the same inputs.

In both cases, num is of type long int and is evaluated as such. Why does this happen, even though the result is the same size in both cases?


Solution

  • How exact are your timing results? If you used the time command, it only gives you two decimals, so the difference between 0.03 and 0.02 is not significant (for example, the actual times might have been 0.02500001 and 0.02499999 before rounding). Use a longer input, repeat the test several times, average the results, and look at the standard deviation to see whether the difference is statistically significant.
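    A fairer comparison along those lines is a repeated micro-benchmark. The sketch below (my own illustration, not from the question) formats the same value many times with each specifier and reports the time per run; snprintf stands in for printf so terminal I/O does not dominate the measurement, and the iteration and run counts are arbitrary choices:

    ```c
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        char buf[32];
        long num = 123456789L;          /* fits in int on common platforms */
        enum { ITERATIONS = 1000000, RUNS = 5 };

        for (int run = 0; run < RUNS; run++) {
            /* Time ITERATIONS formattings with %ld ... */
            clock_t start = clock();
            for (int i = 0; i < ITERATIONS; i++)
                snprintf(buf, sizeof buf, "%ld", num);
            double ld_time = (double)(clock() - start) / CLOCKS_PER_SEC;

            /* ... and the same number with %d. */
            start = clock();
            for (int i = 0; i < ITERATIONS; i++)
                snprintf(buf, sizeof buf, "%d", (int)num);
            double d_time = (double)(clock() - start) / CLOCKS_PER_SEC;

            printf("run %d: %%ld %.3fs, %%d %.3fs\n", run, ld_time, d_time);
        }
        return 0;
    }
    ```

    Run this a few times: if the per-run spread is as large as the gap between the two specifiers, the 0.01 s difference you saw tells you nothing.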

    That said, depending on your platform, int and long int may well have the same size, making the two specifiers equivalent. If they differ in size, passing a long int to printf with %d is undefined behavior in any case. As for the library internals, it depends on whether the C library on your platform actually uses a different routine to format long ints than plain ints. I would expect that it internally always uses the same routine, the one for the widest supported integer type, and simply converts the argument of the requested type.
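    You can check the first point directly. This quick sketch (my own, for illustration) prints the two sizes on your platform and confirms that both specifiers produce identical text for a value that fits in an int:

    ```c
    #include <stdio.h>
    #include <string.h>
    #include <assert.h>

    int main(void) {
        /* If these are equal, %d and %ld are interchangeable here
           (on e.g. 64-bit Linux they are typically 4 and 8). */
        printf("sizeof(int) = %zu, sizeof(long int) = %zu\n",
               sizeof(int), sizeof(long int));

        long num = 123456789L;          /* value within int range */
        char as_long[32], as_int[32];
        snprintf(as_long, sizeof as_long, "%ld", num);
        snprintf(as_int, sizeof as_int, "%d", (int)num);

        /* Same digits either way, so the observable output cannot differ. */
        assert(strcmp(as_long, as_int) == 0);
        puts("identical output for values in int range");
        return 0;
    }
    ```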