
What causes the virtual run time to go slow when using setitimer() and ITIMER_VIRTUAL?


In order to study the differences between ITIMER_REAL and ITIMER_VIRTUAL I've put together the following program.

#include <stdlib.h>   // exit(), EXIT_FAILURE, EXIT_SUCCESS
#include <signal.h>   // sigaction()
#include <stdio.h>    // printf(), fprintf(), stdout, stderr, perror(), _IOLBF
#include <string.h>   // memset()
#include <sys/time.h> // ITIMER_REAL, ITIMER_VIRTUAL, ITIMER_PROF, struct itimerval, setitimer()
#include <stdbool.h>  // true, false 
#include <limits.h>   // INT_MAX

#define TIMEOUT    50             // ms 
#define TIMER_TYPE ITIMER_VIRTUAL // Type of timer.


/* The three types of timers cause different signals.

   type: type of timer, one of ITIMER_REAL, ITIMER_VIRTUAL, or ITIMER_PROF.

   return value: the signal generated by the timer.

 */
int timer_signal(int timer_type) {
  int sig;

  switch (timer_type) {
    case ITIMER_REAL:
      sig = SIGALRM;
      break;
    case ITIMER_VIRTUAL:
      sig = SIGVTALRM;
      break;
    case ITIMER_PROF:
      sig = SIGPROF;
      break;
    default:
      fprintf(stderr, "ERROR: unknown timer type %d!\n", timer_type);
      exit(EXIT_FAILURE);
  }

  return sig;
}


/* Set a timer and a handler for the timer.

   Arguments

   type: type of timer, one of ITIMER_REAL, ITIMER_VIRTUAL, or ITIMER_PROF.

   handler: timer signal handler.

   ms: time in ms for the timer. 

 */
void set_timer(int type, void (*handler) (int), int ms) {
  struct itimerval timer;
  struct sigaction sa;

  /* Install signal handler for the timer. */
  memset (&sa, 0, sizeof (sa));
  sa.sa_handler =  handler;
  sigaction (timer_signal(type), &sa, NULL);

  /* Configure the timer to expire after ms msec... */
  timer.it_value.tv_sec = 0;
  timer.it_value.tv_usec = ms*1000;
  timer.it_interval.tv_sec = 0;
  timer.it_interval.tv_usec = 0;

  if (setitimer (type, &timer, NULL) < 0) {
    perror("Setting timer");
    exit(EXIT_FAILURE);
  };
}

/* Timer signal handler. */
void timer_handler (int signum) {
  static int count = 0;
  fprintf (stderr, "======> timer (%03d)\n", count++);
  set_timer(TIMER_TYPE, timer_handler, TIMEOUT);
}


/* Calculate the nth Fibonacci number using recursion. */
int fib(int n) {
  switch (n) {
    case 0:
      return 0;
    case 1:
      return 1;
    default:
      return fib(n-1) + fib(n-2);
  }
}

/* Print the Fibonacci number sequence over and over again.

   This is a deliberately slow and CPU-intensive implementation, where each
   number in the sequence is calculated recursively from scratch.
*/

void fibonacci_slow() {
  int n = 0;
  while (true) {
    printf(" fib(%d) = %d\n", n, fib(n));
    n = (n + 1) % INT_MAX;
  }
}

/* Print the Fibonacci number sequence over and over again.

   This implementation is much faster than fibonacci_slow(). 
*/
void fibonacci_fast() {
  int a = 0;
  int b = 1;
  int n = 0;
  int next = a + b;

  while(true) {
    printf(" fib(%d) = %d\n", n, a);
    next = a + b;
    a = b;
    b = next;
    n++;
    if (next < 0) {
      a = 0;
      b = 1;
      n = 0;
    }
  }
}

int main () {
  /* Flush each printf() as it happens. */
  setvbuf(stdout, 0, _IOLBF, 0);
  setvbuf(stderr, 0, _IOLBF, 0);

  set_timer(TIMER_TYPE, timer_handler, TIMEOUT);

  // Call fibonacci_fast() or fibonacci_slow()

  fibonacci_fast();
  // fibonacci_slow();

}

From main() I call either fibonacci_fast() or fibonacci_slow().

As expected, when using ITIMER_REAL there is no difference in the wall-clock time between timer ticks whether main() calls fibonacci_fast() or fibonacci_slow().

When setting a timer using ITIMER_VIRTUAL, the wall-clock time between timer ticks is very long when main() calls fibonacci_fast(), but much shorter when main() calls fibonacci_slow().

I would like to understand why fibonacci_fast() makes the virtual runtime go much slower than fibonacci_slow(). Is the CPU scheduler giving the process much less CPU time when using fibonacci_fast() compared to fibonacci_slow()? If yes, why? If no, what else can explain the difference?


Solution

  • ITIMER_VIRTUAL only counts down while the process is executing in user mode. Both of your Fibonacci routines call printf(), which involves a write() system call that writes the data synchronously, and time spent in that system call is not counted against the virtual timer. fibonacci_fast() actually spends most of its time in printf() and the underlying system call, so the virtual timer appears to run much more slowly.