Tags: c, variables, undefined-behavior, unsigned-long-long-int

unsigned long long behaving oddly in C


I stumbled across a very odd issue when running a program compiled for MIPS. The following snippet of code gets the time since the epoch and stores it with microsecond precision in an unsigned long long variable.

The variable is capable of storing 8 bytes, which I checked with sizeof(unsigned long long).

This code prints out weirdly:

unsigned long long microtime=0;
struct timeval time_camera = { .tv_sec=0, .tv_usec=0 };
gettimeofday(&time_camera,NULL);
microtime = time_camera.tv_sec * 1000000 + time_camera.tv_usec;
printf("Times is now %llu , time since epoch in seconds is: %lu\n", microtime, time_camera.tv_sec);

It gives me the following output:

>> Times is now 484305845 , time since epoch in seconds is: 1357751315

However, when I break the calculation into separate lines, it works:

unsigned long long microtime=0;
struct timeval time_camera = { .tv_sec=0, .tv_usec=0 };
gettimeofday(&time_camera,NULL);
microtime = time_camera.tv_sec;
microtime = microtime * 1000000;
microtime = microtime + time_camera.tv_usec;
printf("Times is now %llu , time since epoch in seconds is: %lu\n", microtime, time_camera.tv_sec);

Output is:

Times is now 1357751437422143 , time since epoch in seconds is: 1357751437

Now, is it just a coincidence that this works? E.g. have I corrupted memory, or actually exceeded it somewhere? Perhaps it's the MIPS compiler? Any help is appreciated!


Solution

  • microtime = time_camera.tv_sec * 1000000 + time_camera.tv_usec;
    

    tv_sec has a smaller integer type (time_t, probably a 32-bit int or long on this MIPS target), so

    time_camera.tv_sec * 1000000
    

    overflows (and signed integer overflow is undefined behavior in C). Use a suffix to give the constant the appropriate type:

    time_camera.tv_sec * 1000000ULL
    

    In

    microtime = time_camera.tv_sec;
    microtime = microtime * 1000000;
    

    the multiplication is performed in unsigned long long arithmetic, since one operand (microtime) already has that type, so the other operand is converted to match before the multiply.
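
    The overflow can be reproduced without gettimeofday by pinning the operands to 32 bits. A minimal sketch, assuming a 32-bit time_t as on the asker's MIPS target (the seconds and microseconds values here are taken from the question's second output, not measured):

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Simulate a 32-bit time_t, as on the asker's MIPS target. */
        int32_t tv_sec  = 1357751437;  /* seconds since the epoch */
        int32_t tv_usec = 422143;      /* microseconds */

        /* 32-bit arithmetic: the product exceeds 2^32 and wraps.
           (Done here with unsigned types so the wrap is well-defined;
           with signed operands it is undefined behavior.) */
        uint32_t wrapped = (uint32_t)tv_sec * 1000000u + (uint32_t)tv_usec;
        unsigned long long wrong = wrapped;

        /* ULL suffix: tv_sec is converted to unsigned long long
           before the multiplication, so nothing overflows. */
        unsigned long long right = tv_sec * 1000000ULL + tv_usec;

        printf("wrong: %llu\n", wrong);  /* small, wrapped value */
        printf("right: %llu\n", right);  /* full microsecond timestamp */
        return 0;
    }
    ```

    The same effect explains the asker's two snippets: assigning tv_sec to microtime first is just another way of forcing the conversion to unsigned long long before the multiply.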