Tags: c++, math, timing, game-loop, jitter

C++ Jittery game loop - how to make it as smooth as possible?


This is my game loop code:

        while (shouldUpdate)
        {
            timeSinceLastUpdate = 0;
            startTime = clock();

            while (timeAccumulator >= timeDelta)
            {
                listener.handle();
                em.update();
                timeAccumulator -= timeDelta;
                timeSinceLastUpdate += timeDelta;
            }

            rm.beginRender();
            _x->draw();
            rm.endRender();

            timeAccumulator += clock() - startTime;
        }

It runs almost perfectly, but it has some jitter to it: a few times a second, instead of _x (a test entity whose update does nothing but x++) moving 1 pixel to the right, it actually moves 2 pixels to the right, which produces a noticeable lag/jitter effect. I'm guessing clock() isn't accurate enough. So what could I do to improve this game loop?

If it matters, I use SDL and SDL_image.

EDIT: Nothing changed after switching to something more accurate than clock(). BUT, what I have figured out is that it all comes down to timeDelta. This is how timeDelta was defined when I made this post:

    double timeDelta = 1000/60;

but when I defined it as something else while messing around...

    double timeDelta = 16.666666;

I noticed that for the first few seconds after the game started, it was smooth as butter. But a few seconds later, the game stuttered hugely and then went back to being smooth, repeating like that over and over. The more 6's (or anything after the decimal point, really) I added, the longer the game was initially smooth and the harder the lag hit when it did come. It seems floating-point errors are attacking. So what can I do then?
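
A quick check of why the two definitions behave differently (in C++, 1000/60 is integer division, so the first version is exactly 16.0, not 16.666...):

    #include <iostream>

    int main(){
        double a = 1000 / 60;   // integer division happens first: 1000/60 == 16
        double b = 16.666666;   // the literal from the second definition
        std::cout << a << " vs " << b << "\n"; // prints: 16 vs 16.6667
    }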

EDIT2: I've tried so much stuff now it's not even funny... can someone help me with the math part of the loop? That seems to be what's causing this...

EDIT3: I sent a test program to some people; some said it was perfectly smooth, and others said it was jittery like I described. For anyone who would be willing to test it, here it is (src): https://www.mediafire.com/?vfpy4phkdj97q9j

EDIT4: I changed the link to the source code.


Solution

  • It is almost certainly because of the accuracy of clock(). Its resolution is often too coarse for frame timing (and on many platforms it measures processor time rather than wall-clock time).

    Use either std::chrono or SDL_GetTicks() to measure elapsed time instead.
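
    For reference, a minimal SDL_GetTicks() sketch (assuming SDL2 is initialized; the variable names here are illustrative):

    #include <SDL.h>

    int main(int, char**){
        SDL_Init(SDL_INIT_VIDEO);

        // SDL_GetTicks() returns the milliseconds elapsed since SDL_Init() as a Uint32
        Uint32 last = SDL_GetTicks();
        bool running = true;
        while(running){
            Uint32 now = SDL_GetTicks();
            Uint32 frameMs = now - last; // elapsed milliseconds since the previous frame
            last = now;

            // use frameMs as this frame's time delta
        }

        SDL_Quit();
        return 0;
    }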

    I would recommend using std::chrono just because I prefer the C++ APIs. Here's an example:

    #include <chrono>
    #include <cstdint>

    int main(){
        using clock = std::chrono::high_resolution_clock;
        using milliseconds = std::chrono::milliseconds;
        using std::chrono::duration_cast;

        auto start = clock::now(), end = clock::now();
        uint64_t diff;
        bool running = true;

        while(running){
            // milliseconds spent in the previous iteration
            diff = duration_cast<milliseconds>(end - start).count();
            start = clock::now();

            // do time difference related things

            end = clock::now();
        }
    }
    

    To only update after a specified delta, you'd do your loop like this:

    #include <chrono>
    #include <cstdint>

    int main(){
        using clock = std::chrono::high_resolution_clock;
        using milliseconds = std::chrono::milliseconds;
        using nanoseconds = std::chrono::nanoseconds;
        using std::chrono::duration_cast;

        auto start = clock::now(), end = clock::now();
        uint64_t diff = duration_cast<milliseconds>(end - start).count();
        bool running = true;

        auto accum_start = clock::now();
        while(running){
            start = clock::now();
            // milliseconds between the previous iteration's end and this one's start
            diff = duration_cast<milliseconds>(start - end).count();

            // 16666666 ns is one 60th of a second
            if(duration_cast<nanoseconds>(clock::now() - accum_start).count() >= 16666666){
                // do render updates every 60th of a second
                accum_start = clock::now();
            }

            end = clock::now();
        }
    }
    

    start and end will both be of type std::chrono::time_point<clock>, where clock was previously defined by us as std::chrono::high_resolution_clock.

    The difference between two time_points will be a std::chrono::duration, which can be in nanoseconds, milliseconds, or any other unit you like. We cast it to milliseconds and then call count() to get a value to assign to our uint64_t. You could use some other integer type if you liked.

    My diff is how you should be calculating your timeDelta. You should not be setting it as a constant: even if you are certain it will be correct, you are wrong. Every frame will have a different delta, even if only by the smallest fraction of a second.

    If you want a constant frame difference, use SDL_GL_SetSwapInterval to enable vertical sync instead.
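
    A minimal sketch of that call (assuming an SDL2 window with an OpenGL context has already been created):

    // SDL_GL_SetSwapInterval(1) requests vsync; it returns 0 on success
    if(SDL_GL_SetSwapInterval(1) != 0){
        // vsync is unavailable here; fall back to manual frame timing
        SDL_Log("Warning: unable to set vsync: %s", SDL_GetError());
    }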

    EDIT

    Just for you, I created this example git repo. Notice in main.cpp where I multiply by diff to get the adjusted difference per frame. This adjustment (or the lack of it) is where your jitteriness comes from.
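
    The multiplication in question looks something like this (a sketch of the idea rather than the repo's exact code; speedPixelsPerMs is an illustrative name, and diff is in milliseconds as above):

    double speedPixelsPerMs = 0.06; // illustrative: 60 pixels per second
    double x = 0.0;

    // inside the game loop, after computing diff for the frame:
    x += speedPixelsPerMs * diff; // distance moved scales with frame time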