Tags: performance, optimization, language-agnostic, computer-science

Is there a way to predict (or measure) how long it will take to execute a single line of code?


My question is this:

Given a single line of isolated code (i.e., one that does not parse anything, read from or write to a file on disk, call a subroutine, etc.), is it possible to predict or measure, with any degree of accuracy and/or consistency, how long that code will take to execute on a given system?

For example, say I have the following code (not in any specific language, just a generalization):

1 |  If x = 1 then
2 |    x = x + 1
3 |  End If

how long would it take to execute line 2?

I am not looking for numbers here; I am just wondering if such a thing is possible or practical.

Thanks!

Update:

Now I am looking for some numbers... If I were to execute a simple For loop that does nothing but sleep for 1 second per iteration, how far off would its elapsed time be after 60 (actual) minutes? In other words, can the time taken to evaluate a line of isolated code be considered negligible (assuming no errors or interrupts)?
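For reference, here is the kind of measurement I have in mind: a minimal sketch in C (assuming a POSIX system where sleep and clock_gettime are available) that times such a loop against the wall clock and reports the drift:

    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    int main(void) {
        struct timespec start, end;
        clock_gettime(CLOCK_MONOTONIC, &start);

        for (int i = 0; i < 3600; i++) {  /* 60 minutes of 1-second sleeps */
            sleep(1);
        }

        clock_gettime(CLOCK_MONOTONIC, &end);
        double elapsed = (end.tv_sec - start.tv_sec)
                       + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("expected 3600 s, measured %.3f s, drift %.3f s\n",
               elapsed, elapsed - 3600.0);
        return 0;
    }

(Note that sleep(1) is only guaranteed to sleep for at least one second, so any drift measured this way includes the sleep call's own slop as well as the loop overhead.)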


Solution

  • If you look at documentation such as this, you will find how many clock cycles the fundamental operations take on a CPU and how long those clock cycles are. Translating those numbers into the time taken to perform x = x + 1 in your program is an imprecise science, but you will get some kind of clue by examining the assembly listing that your compiler produces, or by timing the statement over many iterations, as in the sketch at the end of this answer.

    The science becomes less and less precise as you move from simple arithmetic statements towards large programs, because you start hitting all sorts of issues arising from the complexity of modern CPUs, modern operating systems, memory hierarchies, and all the rest.
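
    A minimal sketch of such a measurement, again assuming a POSIX C environment where clock_gettime is available: repeat the statement many times and divide by the iteration count, and compile the same file with gcc -S if you want the assembly listing mentioned above.

      #include <stdio.h>
      #include <time.h>

      int main(void) {
          volatile int x = 1;                 /* volatile keeps the compiler from optimising the statement away */
          const long iterations = 100000000L; /* 100 million repetitions */

          struct timespec start, end;
          clock_gettime(CLOCK_MONOTONIC, &start);
          for (long i = 0; i < iterations; i++) {
              if (x == 1) {
                  x = x + 1;                  /* the "line 2" being measured */
              }
              x = 1;                          /* reset so the branch is taken every time */
          }
          clock_gettime(CLOCK_MONOTONIC, &end);

          double elapsed = (end.tv_sec - start.tv_sec)
                         + (end.tv_nsec - start.tv_nsec) / 1e9;
          printf("about %.2f ns per iteration (loop overhead included)\n",
                 elapsed * 1e9 / iterations);
          return 0;
      }

    Even this only yields an average over millions of executions; a single execution of the statement cannot be isolated from the surrounding loop, the branch predictor, and the caches.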