For timing an algorithm (approximately in ms), which of these two approaches is better:
clock_t start = clock();
algorithm();
clock_t end = clock();
double time = (double)(end - start) / CLOCKS_PER_SEC * 1000.0;
Or,
time_t start = time(0);
algorithm();
time_t end = time(0);
double time = difftime(end, start) * 1000.0;
Also, from some discussion in the C++ channel on Freenode, I understand clock() has quite coarse resolution, so the measured time will read as zero for a (relatively) fast algorithm. But which has better resolution, time() or clock()? Or are they the same?
It depends on what you want: time() measures wall-clock (real) time, while clock() measures the processor time used by the current process. If your process sleeps for any appreciable amount of time, or the system is busy with other processes, the two will be very different.
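As a rough sketch of that difference (assuming a C++11 compiler; the dummy algorithm() that just sleeps is mine, not from the question), the program below prints almost zero CPU time from clock() but roughly a full second of wall time from time():

#include &lt;cstdio&gt;
#include &lt;ctime&gt;
#include &lt;thread&gt;
#include &lt;chrono&gt;

// Stand-in "algorithm" that spends its time idle rather than computing.
static void algorithm()
{
    std::this_thread::sleep_for(std::chrono::seconds(1));
}

int main()
{
    clock_t c0 = clock();   // CPU time used by this process
    time_t  t0 = time(0);   // wall-clock time

    algorithm();

    clock_t c1 = clock();
    time_t  t1 = time(0);

    double cpu_ms  = (double)(c1 - c0) / CLOCKS_PER_SEC * 1000.0;
    double wall_ms = difftime(t1, t0) * 1000.0;

    printf("clock(): %.1f ms of CPU time\n", cpu_ms);   // close to 0
    printf("time():  %.1f ms of wall time\n", wall_ms); // roughly 1000
    return 0;
}

So for an algorithm that sleeps or waits on I/O, clock() will make it look almost free, which is usually not what you want when benchmarking wall-clock performance.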