c# · algorithm · performance · benchmarking · timing

C# performance fluctuation in algorithm


I'm relatively new to algorithms and performance benchmarking, but I have a few questions.

I've been writing an algorithm that I want to process in small iterations at a time, to limit the disruption to an already demanding process loop. My goal is to process for under 1 ms per iteration, so fairly consistent performance is quite important.

Unfortunately, after applying a process weight to average the algorithm out to 0.5 ms, I was occasionally seeing times of over 20 ms. I did notice that the data collection caused some issues (I assume it's moving in memory?), which I've since solved, but I still get some performance fluctuation.

So I've drawn up a blank method here. Even with this I get these occasional high times.

    public void DoTask()
    {
        for (int i = 0; i < 100000; i++)
        {
            //do nothing
        }
    }

    // Fields and helper referenced below (not shown in the original post)
    private double LongestTime, TotalTime;
    private int Count;

    // Stopwatch ticks -> milliseconds
    private double ToMs(double ticks) => ticks * 1000.0 / Stopwatch.Frequency;

    private void button1_Click(object sender, EventArgs e)
    {
        DoTask(); //burn

        var watch = new Stopwatch();
        watch.Start();

        double time;
        for (int j = 0; j < 20; j++)
        {
            for (int i = 0; i < 1000; i++)
            {                   
                time = watch.ElapsedTicks;

                DoTask();

                time = watch.ElapsedTicks - time;

                LongestTime = Math.Max(LongestTime, time);

                TotalTime += time;

                Count++;
            }
            double avgTime = TotalTime / Count;

            MessageBox.Show($"Longest time: {ToMs(LongestTime):#.##} Avg time: {ToMs(avgTime):#.##}");
            TotalTime = 0;
            Count = 0;
            LongestTime = 0;
        }
    }

Longest time: 13.04 Avg time: .01

What is causing this to happen? Is it something that is beyond my control?

Thanks.


Solution

  • It is actually more of a Windows problem (or a problem with any operating system that supports multitasking but is not a real-time OS, which covers all consumer OSes). Every regular multitasking operating system gives each thread a period of time during which that thread runs without interruption, and then potentially switches to another thread. If execution of your step gets suspended in the middle in favor of some other thread, the time measured by Stopwatch will essentially include the length of time your thread spent waiting to continue execution (which on Windows is ~15 ms by default).

    Measuring time correctly is a hard task and is generally better left to specialized tools: profilers.

    If you want to get slightly more consistent measurements with just in-process timers, raise the priority of the thread that runs your code. That way the thread has a lower chance of being suspended in favor of other threads.

    Notes:

    • Real-time operating systems generally require some cooperation from the executing code to guarantee uninterrupted execution time. Also, I don't believe there is such an OS that supports C#.
    • Even on a non-multitasking OS like MS-DOS you can see similar random fluctuations in time measured this way, because the OS has to handle interrupts from devices (disk, keyboard, timers), and as a result execution of the main single-threaded code can be suspended for the duration of the interrupt-handling code (usually a small amount of time, but the issue is still there).
    • .NET code adds some additional concerns, as it has to perform garbage collection, which may happen at any point in time (e.g. due to an allocation on another thread). More info: Fundamentals of Garbage Collection.
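
    The thread-priority suggestion above can be sketched as follows. This is a minimal console program, not the original WinForms code; the empty-loop workload and iteration counts are assumptions chosen for illustration, and raising priority only reduces, never eliminates, scheduler-induced spikes.

        using System;
        using System.Diagnostics;
        using System.Threading;

        class PrioritySketch
        {
            static void DoTask()
            {
                for (int i = 0; i < 100_000; i++) { /* do nothing */ }
            }

            static void Main()
            {
                // Raising thread priority lowers the chance that the scheduler
                // suspends this thread mid-measurement in favor of another one.
                Thread.CurrentThread.Priority = ThreadPriority.Highest;

                var watch = Stopwatch.StartNew();
                double longest = 0, total = 0;
                const int iterations = 1000;

                for (int i = 0; i < iterations; i++)
                {
                    double start = watch.ElapsedTicks;
                    DoTask();
                    double elapsed = watch.ElapsedTicks - start;
                    longest = Math.Max(longest, elapsed);
                    total += elapsed;
                }

                // Stopwatch.Frequency is ticks per second; convert to milliseconds.
                double toMs = 1000.0 / Stopwatch.Frequency;
                Console.WriteLine(
                    $"Longest: {longest * toMs:0.##} ms  Avg: {total / iterations * toMs:0.##} ms");

                // Restore the default priority once measurement is done.
                Thread.CurrentThread.Priority = ThreadPriority.Normal;
            }
        }

    Even with this, expect occasional outliers: priority does not stop hardware interrupts or GC pauses, so reporting a percentile alongside the maximum gives a fairer picture than the maximum alone.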