A common way to measure elapsed time is:
#include <time.h>

const clock_t START = clock();
// ...
const clock_t END = clock();
double T_ELAPSED = (double)(END - START) / CLOCKS_PER_SEC;
I know this is not the best way to measure real time, but I wonder whether it works on a system with a variable-frequency CPU. Is it just wrong?
There are system architectures that change the CPU frequency but use a separate, constant-frequency source to drive the system clock. One would expect clock() to return a time that is independent of the CPU frequency, but this would have to be verified on every system the code is intended to run on.
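One way to check this on a particular system is to time the same CPU-bound loop with both clock() and a wall-clock source such as clock_gettime(CLOCK_MONOTONIC). This is a minimal sketch, assuming a POSIX system; busy_work and the iteration count are placeholders, not part of the original question. If clock() were tied to the variable core frequency, repeating the run while the frequency governor changes the clock speed would show the ratio of CPU time to wall time drifting.

#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

/* Placeholder CPU-bound work; any busy loop the compiler cannot remove will do. */
static volatile unsigned long sink;
static void busy_work(unsigned long iterations)
{
    for (unsigned long i = 0; i < iterations; ++i)
        sink += i;
}

int main(void)
{
    struct timespec wall_start, wall_end;
    clock_t cpu_start, cpu_end;

    /* Take both timestamps around the same workload. */
    clock_gettime(CLOCK_MONOTONIC, &wall_start);
    cpu_start = clock();

    busy_work(100000000UL);

    cpu_end = clock();
    clock_gettime(CLOCK_MONOTONIC, &wall_end);

    /* clock() reports CPU time in CLOCKS_PER_SEC units; CLOCK_MONOTONIC reports wall time. */
    double cpu_s  = (double)(cpu_end - cpu_start) / CLOCKS_PER_SEC;
    double wall_s = (wall_end.tv_sec - wall_start.tv_sec)
                  + (wall_end.tv_nsec - wall_start.tv_nsec) / 1e9;

    printf("cpu time:  %f s\n", cpu_s);
    printf("wall time: %f s\n", wall_s);
    printf("ratio:     %f\n", cpu_s / wall_s);
    return 0;
}

Comparing the two numbers across runs at different frequency settings gives a concrete answer for the system at hand, rather than relying on what the documentation implies.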