Tags: c#, polling

How come adding even a tiny delay during polling reduces CPU usage?


I need to create software that polls a hardware device.

using System;
using System.Threading.Tasks;

class Program
{
    static void Main(string[] args)
    {
        while (true)
        {
            DoSomething();
            Task.Delay(1).Wait(); // block for at least 1 ms, yielding the CPU
        }
    }

    // Stand-in for the actual hardware poll.
    static void DoSomething() { }
}

I noticed that if I don't add even the smallest delay, CPU usage goes to 30-40%, but with just a 1 ms delay it stays at around a couple of percent.

My development environment is .NET/C#.

From my perspective and the business's, the 1 ms delay doesn't feel necessary, but it seems to make a world of difference.

It feels like it is a good idea to add even a tiny delay. Why?

EDIT:

Given the piece of code above, with an empty DoSomething(), why does adding Task.Delay(1).Wait(); reduce CPU usage so much? It seems like such a trivial thing. How come it has such an impact?


Solution

  • Let's say DoSomething takes 0.1 ms. Then, without the delay, it executes 10,000 times per second. With the delay, each iteration takes at least 1.1 ms (0.1 ms of work plus 1 ms of delay), so it executes at most about 900 times per second. As noted in the comments, the actual delay depends on the OS timer resolution and can be as long as 15.6 ms, so you may in fact see only ~60 calls per second.

    So the amount of work the CPU has to do differs dramatically. While the thread is blocked in the delay, it is taken off the CPU entirely, so the core can idle or run other work instead of busy-spinning.
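
    You can check this yourself with a rough sketch like the one below (my own illustration, not part of the original answer; the exact numbers depend on your OS timer resolution and hardware):

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    class PollRateDemo
    {
        // Stand-in for the hardware poll, empty as in the question.
        static void DoSomething() { }

        // Counts how many loop iterations complete in roughly one second.
        static long CountIterations(bool withDelay)
        {
            long count = 0;
            var sw = Stopwatch.StartNew();
            while (sw.ElapsedMilliseconds < 1000)
            {
                DoSomething();
                if (withDelay)
                    Task.Delay(1).Wait();
                count++;
            }
            return count;
        }

        static void Main()
        {
            Console.WriteLine($"Without delay: {CountIterations(false):N0} iterations/s");
            Console.WriteLine($"With 1 ms delay: {CountIterations(true):N0} iterations/s");
        }
    }

    On Windows the delayed loop often lands near 64 iterations per second, matching the default ~15.6 ms timer resolution, while the undelayed loop can spin millions of times per second and pin a core doing it.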