Tags: c++, opengl, timer, delay, 2d-games

C++ Snake clone: timer function ignores given stop time and stops at its own fixed time


I'm trying to make a Snake clone using C++ and OpenGL/GLUT. However, I've been having trouble programming the short time intervals allowed for input between movements. I've tried a few timing methods, and I ended up making a class for it (as you'll see below). This seems to be the best way to program the input delays (rather than glutTimerFunc() or sleep()), because the timer runs separately from the game loop, instead of putting the whole program on hold. This is important because I want the player to be able to pause at any time. Unfortunately, I'm having issues with this method now too. My timer class seems to ignore the double I give it for the time limit (simply represented as double "limit").

To test the class, I've set up a simple, looping console program that displays directional input from the user at the point when the timer reaches the time limit. It's supposed to display input every 0.33 seconds. Instead, it displays input at fixed intervals that seem to be around 0.8 seconds apart, regardless of what value has been given for the time limit. Why won't it display input at the given time intervals, and why has it made its own time limit?

This also happens to be my first major C++/OpenGL project without a tutorial, so any comments or advice on my code/methods is appreciated!

#include <iostream>
#include "timer.h"
// Include all files necessary for OpenGL/GLUT here (plus <windows.h> for GetAsyncKeyState).

using namespace std;

Timer timer;

// Insert necessary OpenGL/GLUT code for display/looping here.

void update(int value)
{
    if (timer.checkTime())
    {
        if (GetAsyncKeyState(VK_LEFT))
            cout << "You pressed LEFT!" << endl;
        else if (GetAsyncKeyState(VK_RIGHT))
            cout << "You pressed RIGHT!" << endl;
        else if (GetAsyncKeyState(VK_UP))
            cout << "You pressed UP!" << endl;
        else if (GetAsyncKeyState(VK_DOWN))
            cout << "You pressed DOWN!" << endl;
    }

    glutTimerFunc(1000/60, update, 0);
    glutPostRedisplay();
}

timer.h

#pragma once
#include <time.h>

class Timer
{
public:
    Timer();
    bool    checkTime(double limit = 0.33);
private:
    double  getElapsed();
    time_t  start;
    time_t  now;
    double  elapsed;
    bool    running;
};

timer.cpp

#include "timer.h"

Timer::Timer()
{
    running = false;
}

bool Timer::checkTime(double limit)
{
    elapsed = getElapsed();

    if (elapsed < limit)
    {
        return false;
    }
    else
    {
        running = false;
        return true;
    }
}

double Timer::getElapsed()
{
    if (! running)
    {
        time(&start);
        running = true;
        return 0.00;
    }
    else
    {
        time(&now);
        return difftime(now, start);
    }
}

Solution

  • Every 1000/60 milliseconds the glutTimer fires and you call Timer::checkTime, which calls getElapsed. All of your timing code is written in terms of double, but the underlying type is time_t, which has a resolution of only 1 second.

    Therefore, you get something that looks like this (simulated numbers)

    start time: 1234.5 seconds
    glutTimer: 1234.516 seconds  (getElapsed = 0) // checkTime returns false
    glutTimer: 1234.532 seconds  (getElapsed = 0)
    ...
    glutTimer: 1234.596 seconds (getElapsed = 0)
    glutTimer: 1235.012 seconds (getElapsed = 1)  // condition finally returns true
    ...
    

    So, the actual delay depends on when you set your start time relative to the actual start of a second from the Epoch used by time().

    I suspect it averages close to 0.5 seconds if you statistically measure it.

    Answering question about "resolution":

    Different functions for returning the current time return a different level of accuracy. For example, right now, the clock in the bottom right of my screen reads "12:14 PM". Is that exactly 12:14 and no seconds, or 12:14 and 59 seconds? I can't tell because the "resolution" of the clock display is one minute. Similarly, I might say it's a quarter past 12, when it is really 14 minutes past the hour if I report the time in a resolution of "quarter of an hour". As humans we do this all the time without thinking about it. In software you have to be conscious of these details for any function you call.
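    The quantization described above can be reproduced without any clock at all. Here is a minimal sketch (coarseElapsed and checkTime are illustrative names modeling the asker's code, not taken from it) of how whole-second truncation swallows a 0.33 s limit:

```cpp
#include <cmath>

// Model of the asker's checkTime() at time_t resolution: time() truncates
// to whole seconds, so a poll only ever sees floor(now) - floor(start).
// A 0.33 s limit therefore cannot expire until a second boundary passes.
double coarseElapsed(double startReal, double nowReal)
{
    return std::floor(nowReal) - std::floor(startReal);
}

bool checkTime(double startReal, double nowReal, double limit = 0.33)
{
    return coarseElapsed(startReal, nowReal) >= limit;
}
```

    Plugging in the simulated numbers from above: checkTime(1234.5, 1234.596) is false because the coarse elapsed time is still 0, while checkTime(1234.5, 1235.012) finally returns true because a second boundary has been crossed.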

    If you are on Windows, there is a high-resolution timer available through the QueryPerformanceCounter APIs. On most platforms, the Performance Counter is hardware based and has a resolution in the micro-second range.

    Here's an example of calling it: http://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx#examples_for_acquiring_time_stamps

    LARGE_INTEGER StartingTime, EndingTime, ElapsedMicroseconds;
    LARGE_INTEGER Frequency;
    
    QueryPerformanceFrequency(&Frequency); // get number of ticks per second
    QueryPerformanceCounter(&StartingTime); // get starting # of ticks
    
    // Activity to be timed
    
    QueryPerformanceCounter(&EndingTime);  // get ending # of ticks
    ElapsedMicroseconds.QuadPart = EndingTime.QuadPart - StartingTime.QuadPart;
    
    
    //
    // We now have the elapsed number of ticks, along with the
    // number of ticks-per-second. We use these values
    // to convert to the number of elapsed microseconds.
    // To guard against loss-of-precision, we convert
    // to microseconds *before* dividing by ticks-per-second.
    //
    
    ElapsedMicroseconds.QuadPart *= 1000000;
    ElapsedMicroseconds.QuadPart /= Frequency.QuadPart;
    

    There may be a similar facility on Linux, but I am not familiar with it.
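    For reference, one such facility is clock_gettime with CLOCK_MONOTONIC, whose resolution is typically in the nanosecond range. A minimal sketch of the same ticks-to-microseconds arithmetic (elapsedMicroseconds is an illustrative helper, not a system call):

```cpp
#include <ctime>

// Elapsed microseconds between two readings of the POSIX monotonic clock
// (readings come from clock_gettime(CLOCK_MONOTONIC, &ts)). This plays the
// same role as the QueryPerformanceCounter arithmetic above.
long long elapsedMicroseconds(const timespec& start, const timespec& end)
{
    long long sec  = static_cast<long long>(end.tv_sec) - start.tv_sec;
    long long nsec = static_cast<long long>(end.tv_nsec) - start.tv_nsec;
    return sec * 1000000LL + nsec / 1000LL;
}
```

    Usage: fill two timespec values with clock_gettime(CLOCK_MONOTONIC, ...) before and after the timed activity, then subtract as above.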

    Try this:

    void update(int value)
    {
        if (timer.hasTicked())
        {
            if (GetAsyncKeyState(VK_LEFT))
                cout << "You pressed LEFT!" << endl;
            else if (GetAsyncKeyState(VK_RIGHT))
                cout << "You pressed RIGHT!" << endl;
            else if (GetAsyncKeyState(VK_UP))
                cout << "You pressed UP!" << endl;
            else if (GetAsyncKeyState(VK_DOWN))
                cout << "You pressed DOWN!" << endl;
        }
        else if (!timer.isRunning())
        {
           timer.start();
        }
    
        glutTimerFunc(1000/60, update, 0);
        glutPostRedisplay();
    }
    

    timer.h

    #pragma once
    #include <windows.h> // for LARGE_INTEGER and the QueryPerformanceCounter APIs
    
    // this class provides a timer that can be polled and will allow the user to tell if a period has elapsed.
    // note that this timer does NOT throw any events for timeout.
    class PollTimer
    {
    public:
        PollTimer();
    
        // assuming the time limit is a number of milliseconds that fits within a
        // normal int rather than the 64-bit variant (that would be a LONG LONG delay).
        void    setTimeout(int msDelay);
    
        // Timers generally expose start/stop; it’s not a good idea to make a single
        // function that overloads complex combinations of behaviors, as checkTime()
        // did by combining the start and elapsed operations. Admittedly this is a
        // simple case, but in general it’s a design pattern that leads to “evil”.
        void    start();
        void    stop();
        bool    isRunning();
    
        // Has the timer expired since the last poll
        bool    hasTicked();
    
    private:
        LARGE_INTEGER startTime;
        LARGE_INTEGER frequency; // per second
        int     delay; // in milliseconds
        bool    running;
    };
    

    timer.cpp

    #include "timer.h"
    
    PollTimer::PollTimer()
    {
        // omitting error check for hardware that doesn’t support this.
        QueryPerformanceFrequency(& frequency); // get number of ticks per second
        running = false;
    }
    
    void PollTimer::setTimeout(int msDelay)
    {
        delay = msDelay;
    }
    
    void PollTimer::start()
    {
        QueryPerformanceCounter(&startTime);
        running = true;
    }
    
    void PollTimer::stop()
    {
        running = false;
    }
    
    bool PollTimer::isRunning()
    {
        return running;
    }
    
    bool PollTimer::hasTicked()
    {
        if (!running)
            return false;
    
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
    
        LARGE_INTEGER ElapsedMilliseconds;
        ElapsedMilliseconds.QuadPart = now.QuadPart - startTime.QuadPart;
    
        ElapsedMilliseconds.QuadPart *= 1000000;
        ElapsedMilliseconds.QuadPart /= frequency.QuadPart; // now microseconds
        ElapsedMilliseconds.QuadPart /= 1000; // milliseconds
    
        bool fExpired = (ElapsedMilliseconds.QuadPart >= delay);
        if (fExpired)
        {
            // reset start time
            start(); // don’t copy/paste code you can call.
        }
        return fExpired;
    }
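
    A portable alternative to the Windows-only PollTimer is C++11's std::chrono::steady_clock, which is monotonic and typically has sub-microsecond resolution. A sketch keeping the same interface (ChronoPollTimer is my name for this variant, not code from the post):

```cpp
#include <chrono>

// Same polling interface as PollTimer above, but built on the portable
// monotonic clock from <chrono> instead of QueryPerformanceCounter.
class ChronoPollTimer
{
public:
    void setTimeout(int msDelay) { delay = std::chrono::milliseconds(msDelay); }

    void start()
    {
        startTime = std::chrono::steady_clock::now();
        running = true;
    }

    void stop() { running = false; }
    bool isRunning() const { return running; }

    // Has the period elapsed since start() / the last expiry?
    bool hasTicked()
    {
        if (!running)
            return false;

        auto now = std::chrono::steady_clock::now();
        if (now - startTime >= delay)
        {
            startTime = now; // restart the period, as PollTimer::hasTicked does
            return true;
        }
        return false;
    }

private:
    std::chrono::steady_clock::time_point startTime;
    std::chrono::milliseconds delay{0};
    bool running = false;
};
```

    The update() loop above would use it the same way: setTimeout(330) once, start() when the game begins, and poll hasTicked() every frame.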