language-agnostic, while-loop, delay

Is it a good practice to implement artificial delays in programs?


I've noticed that every time a program relies on a while loop to keep itself running, it uses almost 100% of a CPU core. Adding a 20 ms delay brings that number down to 0%.

The following code would max the CPU:

while(executing){
    // Do some things
    if(Quit) executing = 0;
}

But the following one wouldn't:

while(executing){
    // Do some things        
    if(Quit) executing = 0;
    delayFunction(20); //20ms delay
}
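
(delayFunction here is just a placeholder for whatever millisecond sleep the platform provides; on POSIX, for example, it might be implemented with nanosleep, roughly like this:)

#include <time.h>

/* Hypothetical stand-in for delayFunction on POSIX: sleeps for ms
   milliseconds via nanosleep, yielding the CPU while waiting. */
void delayFunction(long ms)
{
    struct timespec ts;
    ts.tv_sec  = ms / 1000;               /* whole seconds */
    ts.tv_nsec = (ms % 1000) * 1000000L;  /* remaining nanoseconds */
    nanosleep(&ts, NULL);
}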

Is this the right way to do it? If so, what would be considered an adequate delay?

Are there better ways to avoid overloading the CPU?


Solution

  • A performance analyst's number-one answer is "It depends." There are lots of factors to consider; some make a voluntary yield() or delay() look good, others make them look bad.

    • What kind of machine and task are we talking about? (Loops using the radio can cause phones to get very hot.)
    • How important is the task at hand? (Cardiac monitors must capture and display samples on schedule.)
    • What else is the machine doing? (VM hosts run lots of guest machines, and CPU-bound loops starve the other processes.)
    • Will supervisory code preempt the task? (Most OSes will let higher-priority tasks take over mid-loop.)
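
One common alternative, when "Do some things" only needs to run in response to events, is to block instead of polling, so the thread consumes essentially no CPU until there is actual work. A minimal sketch, assuming POSIX threads (the names work_available, quit, lock, and wake are all illustrative, not from the question):

#include <pthread.h>
#include <stdbool.h>

/* Hypothetical shared state; another thread sets work_available or quit
   and calls pthread_cond_signal(&wake) to wake the worker. */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  wake = PTHREAD_COND_INITIALIZER;
static bool work_available = false;
static bool quit = false;

void worker(void)
{
    pthread_mutex_lock(&lock);
    while (!quit) {
        /* Blocks (0% CPU) until signaled; no delay constant to tune. */
        while (!work_available && !quit)
            pthread_cond_wait(&wake, &lock);
        if (work_available) {
            work_available = false;
            pthread_mutex_unlock(&lock);
            /* Do some things */
            pthread_mutex_lock(&lock);
        }
    }
    pthread_mutex_unlock(&lock);
}

Unlike a fixed delay, this wakes the loop immediately when work arrives and never spins while idle; the factors in the list above (task importance, what else the machine is doing) still decide whether the extra structure is worth it.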