Tags: c#, .net, optimization, jit

Would the C# compiler or the jitter optimize these kinds of arithmetic operations?


Suppose I have something like this:

for (int i = 0; i < 1001; i++)
{
    double step = i / 1000.0;

    // do some math here
}

Basically turning:

double step = i / 1000.0;

into this:

double step = i * 0.001;

I am not sure whether this kind of change can be made without changing the result of the program, but I was wondering whether the C# compiler or the jitter does something like this. If not, why not? I assume either it's not worth it or they haven't added this optimization yet.


Solution

  • Let's break it down into several questions:

    Can the jitter legally change d / 1000.0 into d * 0.001?

    No, because those two computations can give different results. Remember, floating-point numbers are binary fractions, not decimal fractions; 0.001 as a double is not exactly equal to 1/1000, any more than 0.333333333 as a double is exactly equal to 1/3. The double 0.001 is merely the closest fraction to 1/1000 that can be expressed in the 52 fraction bits of a double, and therefore there are values x such that x / 1000.0 does not equal x * 0.001.
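
    You can verify this yourself by searching the question's loop range for a counterexample. The sketch below is mine, not from the original post; if the answer above is right, it should report at least one mismatch:

    using System;

    class DivideVersusMultiply
    {
        static void Main()
        {
            // 0.001 as a double is not exactly 1/1000, so the two
            // expressions can round to different doubles.
            int mismatches = 0;
            for (int i = 0; i < 1001; i++)
            {
                double byDivision = i / 1000.0;
                double byMultiplication = i * 0.001;
                if (byDivision != byMultiplication)
                {
                    if (mismatches < 5)
                        Console.WriteLine($"i = {i}: {byDivision:R} vs {byMultiplication:R}");
                    mismatches++;
                }
            }
            Console.WriteLine($"{mismatches} of 1001 values differ.");
        }
    }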

    Can the jitter legally change d / 2.0 into d * 0.5?

    Yes. In that case the values can be represented exactly in binary because 1/2 has a small power of two on the bottom.
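
    Since 0.5 is exactly representable, the two forms round identically for every input; a quick randomized check (my own sketch) should never find a mismatch:

    using System;

    class HalfCheck
    {
        static void Main()
        {
            // Dividing by 2.0 and multiplying by 0.5 compute the same exact
            // real value, so IEEE rounding makes the results bitwise equal.
            var rng = new Random(42);
            for (int i = 0; i < 1_000_000; i++)
            {
                double d = (rng.NextDouble() - 0.5) * Math.Pow(2, rng.Next(-1000, 1000));
                if (d / 2.0 != d * 0.5)
                {
                    Console.WriteLine($"Mismatch at {d:R}");
                    return;
                }
            }
            Console.WriteLine("No mismatches found.");
        }
    }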

    The jitter can also turn integer division or multiplication by a power of two, such as x / 2 or x * 2, into a shift such as x >> 1 or x << 1. Note that for signed integers a right shift alone is only equivalent for non-negative values: C# division truncates toward zero, while an arithmetic shift rounds toward negative infinity, so the generated code needs an adjustment for negative inputs.
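
    The signed rounding difference is easy to see directly (this snippet is mine, just to illustrate the caveat):

    using System;

    class ShiftVersusDivide
    {
        static void Main()
        {
            // Division truncates toward zero; an arithmetic shift floors.
            Console.WriteLine(-3 / 2);   // prints -1
            Console.WriteLine(-3 >> 1);  // prints -2
            Console.WriteLine(3 / 2);    // prints 1
            Console.WriteLine(3 >> 1);   // prints 1
        }
    }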

    Does the jitter actually do so when it is legal?

    I don't know. Try it!

    What you'll want to do is compile the program in its release ("retail") configuration, start it up outside the debugger, and run it until you know the code in question has been jitted. Then attach the debugger and examine the jitted code. The jitter will generate worse code if it knows that a debugger is attached, because it is trying to generate code that is easier to debug.
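
    A minimal harness for that workflow might look like the sketch below; the method name and iteration counts are my own choices. Run a Release build outside the debugger, wait for the prompt, then attach and disassemble Step:

    using System;
    using System.Runtime.CompilerServices;

    class JitInspection
    {
        [MethodImpl(MethodImplOptions.NoInlining)]
        static double Step(int i) => i / 1000.0;

        static void Main()
        {
            // Call the method enough times that it has certainly been jitted
            // (and, on modern runtimes, promoted by tiered compilation).
            double sum = 0;
            for (int iter = 0; iter < 100_000; iter++)
                for (int i = 0; i < 1001; i++)
                    sum += Step(i);
            Console.WriteLine(sum); // observable result, so the work is not optimized away

            Console.WriteLine("Jitted; attach the debugger now, then press Enter.");
            Console.ReadLine();
            Console.WriteLine(Step(7)); // step into this call to land in the jitted code
        }
    }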

    I assume either it's not worth it or they haven't added this optimization yet.

    For the division-to-multiplication case you are assuming that multiplication is faster than division. Modern chips are pretty darn good at both; though division typically takes more cycles than multiplication, the difference may well be negligible in practice.
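
    If you'd rather measure than guess, a rough Stopwatch micro-benchmark is easy to write. This is my own sketch with arbitrary counts; a proper comparison would use a benchmarking harness such as BenchmarkDotNet:

    using System;
    using System.Diagnostics;

    class DivMulTiming
    {
        const int N = 100_000_000;

        static void Main()
        {
            double sink = 0;
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < N; i++)
                sink += i / 1000.0;
            sw.Stop();
            Console.WriteLine($"division:       {sw.ElapsedMilliseconds} ms");

            sink = 0;
            sw.Restart();
            for (int i = 0; i < N; i++)
                sink += i * 0.001;
            sw.Stop();
            Console.WriteLine($"multiplication: {sw.ElapsedMilliseconds} ms");

            Console.WriteLine(sink); // defeat dead-code elimination
        }
    }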