std::chrono advertises that it can report results down to the nanosecond level. On a typical x86_64 Linux or Windows machine, how accurate would one expect this to be? What would be the error bars for a measurement of 10 ns, 10 µs, 10 ms, and 10 s, for example?
It's most likely hardware and OS dependent. For example, when I ask Windows for the clock frequency using QueryPerformanceFrequency(), I get 3903987; taking the inverse of that gives a clock period (resolution) of about 256 nanoseconds. That is the value my operating system reports.
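For reference, here is a minimal sketch of the check I'm doing (Windows-only; the ~256 ns figure in the comment assumes the 3903987 value above):

#include <windows.h>
#include <iostream>

int main() {
    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);                        // ticks per second of the performance counter
    double period_ns = 1e9 / static_cast<double>(freq.QuadPart);
    std::cout << "QPC frequency: " << freq.QuadPart << " Hz\n"
              << "QPC period:    " << period_ns << " ns\n"; // ~256 ns for a 3903987 Hz counter
}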
With std::chrono, according to the docs, the tick period (the minimum representable duration) is high_resolution_clock::period::num / high_resolution_clock::period::den seconds, where num and den are the numerator and denominator of that ratio. On my machine, std::chrono::high_resolution_clock reports a numerator of 1 and a denominator of 1 billion, supposedly corresponding to 1 nanosecond:
std::cout << (double)std::chrono::high_resolution_clock::period::num /
                     std::chrono::high_resolution_clock::period::den; // Prints 1e-09, i.e. one nanosecond.
So according to std::chrono I have one-nanosecond resolution, but I don't believe it, because the native OS system call is presumably reporting the more accurate frequency/period.
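One way I could imagine sanity-checking this, rather than trusting either reported figure, is to sample the clock back to back and record the smallest nonzero step it ever produces; a rough sketch:

#include <chrono>
#include <iostream>

int main() {
    using clock = std::chrono::high_resolution_clock;
    auto min_delta = clock::duration::max();
    // Take many back-to-back readings and keep the smallest nonzero difference,
    // which approximates the clock's actual tick size on this machine.
    for (int i = 0; i < 1000000; ++i) {
        auto t1 = clock::now();
        auto t2 = clock::now();
        if (t2 > t1 && (t2 - t1) < min_delta)
            min_delta = t2 - t1;
    }
    std::cout << "Smallest observed nonzero step: "
              << std::chrono::duration_cast<std::chrono::nanoseconds>(min_delta).count()
              << " ns\n";
}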