Tags: c++11, visual-c++, cross-platform, c++-chrono

Different behaviour of system_clock on Windows and Linux


For the given program I'm getting different results on Windows (VS 2017) compared to a Linux machine (gcc 4.8).

#include "CrossDevelopment.h"
using namespace std;

int main()
{

    for (auto i = 0; i < 3; i++)
    {
//chrono::high_resolution_clock::time_point start_time = chrono::high_resolution_clock::now();
chrono::system_clock::time_point start_time = chrono::system_clock::now();

        for (auto i = 0; i < 50; i++) {
        int a = 10;
        int b = 5;
        int c = a + b;
                c += 10;
                c *= a;
                a *= b;

    }
//chrono::high_resolution_clock::time_point end_time = chrono::high_resolution_clock::now();
chrono::system_clock::time_point end_time = chrono::system_clock::now();
    auto elapsed_time = chrono::duration<double, micro>(end_time - start_time);
    

cout << "Difference of time " << elapsed_time.count() << " " << (end_time - start_time).count() 
<< " " << (chrono::duration_cast<chrono::nanoseconds>(end_time - start_time)).count() << endl;
}

getchar();
return 0;
}

Output on the Windows machine

Difference of time 1 10 1000

Difference of time 0.7 7 700

Difference of time 0.7 7 700

On the Linux machine

Difference of time 0.806 806 806

Difference of time 0.6 600 600

Difference of time 0.542 542 542


If you look at the last column you will observe the difference, which is not the case with high_resolution_clock.


Solution

  • The precision of system_clock::time_point is not portable across platforms. But one can easily inspect it, and/or convert it to a known precision, as you have done in your question.
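
    For instance, a minimal sketch of converting the difference to the known precision of nanoseconds with duration_cast (the same conversion the question's code already performs):

    #include <chrono>
    #include <iostream>
    
    int
    main()
    {
        using namespace std::chrono;
        auto start_time = system_clock::now();
        auto end_time = system_clock::now();
        // duration_cast converts the platform-specific tick count to a
        // known unit (nanoseconds here) on every platform.
        std::cout << duration_cast<nanoseconds>(end_time - start_time).count()
                  << "ns\n";
    }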

    The easiest way to inspect it is to use my date.h header:

    #include "date/date.h"
    #include <iostream>
    
    int
    main()
    {
        using namespace std;
        using namespace std::chrono;
        using date::operator<<;
        auto start_time = system_clock::now();
        auto end_time = system_clock::now();
        cout << end_time - start_time << '\n';
    }
    

    On gcc this is going to output something like:

    1730ns
    

    On Windows:

    17[1/10000000]s
    

    On macOS:

    1µs
    

    Explanation:

    On gcc, system_clock::time_point has nanosecond precision; on Windows, it has a precision of 1/10'000'000 of a second (100ns); and on macOS, microsecond precision.

    You can inspect the precision without the date.h header by looking at system_clock::duration::period::num and system_clock::duration::period::den, the numerator and denominator of the fraction of a second that each tick represents (1 and 10'000'000 on Windows, i.e. 100ns per tick).
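
    For example, a short standalone check that needs no extra headers:

    #include <chrono>
    #include <iostream>
    
    int
    main()
    {
        using clock = std::chrono::system_clock;
        // Each tick of system_clock lasts num/den seconds,
        // e.g. 1/1'000'000'000 on gcc and 1/10'000'000 on Windows.
        std::cout << clock::duration::period::num << '/'
                  << clock::duration::period::den << '\n';
    }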

    The ability to print out durations with their units (as date.h allows) is now part of the C++20 specification.
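
    With a C++20 standard library, the first example above works without date.h (a sketch, assuming your toolchain ships the C++20 chrono stream operators):

    #include <chrono>
    #include <iostream>
    
    int
    main()
    {
        using namespace std::chrono;
        auto start_time = system_clock::now();
        auto end_time = system_clock::now();
        // C++20 provides operator<< for durations, printing the value
        // together with its unit, e.g. "1730ns".
        std::cout << end_time - start_time << '\n';
    }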