Tags: pseudocode, lag, latency

How do you tell the difference between lag and normal latency?


In pseudocode, how would you go about distinguishing a normal client's ping (for example, in a video game) from a client who is lagging?

Is there an algorithm to find a threshold that would differentiate between normal latency and lag between a client and a server (in an FPS/PvP video game, for example)?

At what point is latency considered "laggy"?

To repeat, I'm looking for an answer in pseudocode, not a particular language.


Solution

  • In a multi-player FPS, the game is often called "laggy" if the reality according to the server (server game state) is significantly different than the reality on the various client computers involved (client game state).

    You need to examine the pipeline:

    1. User gives input in client (moves, uses a power, etc.)
    2. Input is transmitted from client to server
    3. Server processes input from relevant, connected clients (multi-player)
    4. Server returns results to each connected client (new character position, damage to enemy, damage to self, new cooldown for power, etc.)
    5. Client updates local game state to match server game state.

    Since the round-trip time between client and server far exceeds the time it takes to render a single frame, game clients will typically continue simulating while a reply from the server is pending. The alternative would be to render each frame only after the server has replied to all actions (waaaay too slow).
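    Since the question asks for pseudo-code, here is a minimal sketch of that flow, written in Python purely as executable pseudo-code. The `FakeNetwork` class, the constant "move forward" input, and the one-dimensional position math are placeholders standing in for a real game's transport and simulation, not any particular engine's API.

    ```python
    import time
    from collections import deque

    class FakeNetwork:
        """Stand-in for the game's real transport; it just buffers messages."""
        def __init__(self):
            self.to_server = deque()   # inputs waiting to be processed (step 3)
            self.to_client = deque()   # authoritative replies (step 4)

        def send(self, msg):
            self.to_server.append(msg)

        def poll(self):
            while self.to_client:
                yield self.to_client.popleft()

    def client_frame(position, speed, network):
        # 1. Read the user's input for this frame (stubbed as "move forward").
        user_input = 1.0

        # 2. Transmit the input, stamped with the time it originated.
        network.send({"input": user_input, "sent_at": time.monotonic()})

        # Steps 3 and 4 happen on the server; the client does NOT block
        # waiting for the reply. It predicts the result locally so
        # rendering never stalls.
        position += speed * user_input

        # 5. Fold in any authoritative replies that have already arrived,
        # overriding the local prediction wherever the server disagrees.
        for reply in network.poll():
            position = reply["authoritative_position"]

        return position
    ```

    The important point is that steps 3 and 4 never block the loop: the client predicts and renders every frame, and only folds in the server's answer when it happens to arrive.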

    If the client and server remain largely in agreement about the game state (character position, damage taken, power cooldown, etc.) the user will experience the game as being "smooth".

    If server replies cause the game client to significantly alter the internal state that it had projected on its own, the game will feel "laggy". In some games this manifests as rubber-banding (the game client puts the user at a certain position, but shifts that position significantly once a reply is received from the server... causing the character to move back and forth as if on a rubber band).
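    Step 5 is where rubber-banding shows up. The sketch below illustrates one common way to reconcile: blend small disagreements away over several frames, and snap to the server's position when the disagreement is large. The `blend` factor and the 0.5-unit snap threshold are made-up illustrative numbers, and positions are one-dimensional for brevity.

    ```python
    def reconcile(predicted_pos, server_pos, blend=0.2, snap_threshold=0.5):
        """Pull the client's predicted position toward the server's answer."""
        error = server_pos - predicted_pos
        if abs(error) > snap_threshold:
            # Large disagreement: jump straight to the server's reality.
            # Repeated jumps back and forth are the rubber-band effect.
            return server_pos
        # Small disagreement: correct a fraction per frame so the player
        # never notices the adjustment.
        return predicted_pos + blend * error
    ```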

    I'm not aware of any specific threshold for processing the entire pipeline (from user input to authoritative game state update from the server) that constitutes the transition between "smooth" and "laggy". I would hazard a guess that if the server and client game states are never out of sync for more than 100-200 ms, the perception will be quite smooth. If the game states are out of sync for a second or more (in a PvP or action-intense PvE game), players will perceive the desynchronization as "lag".

    The pseudo-code would be: record the timestamp when the input originates and the timestamp when the authoritative server game state based on that input comes back. Take the difference and calculate a moving average. That moving average represents the time gap between the server state and the client state. If it crosses a threshold (probably best determined by testing the particular game), the experience can be deemed laggy.
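    Spelled out (again in Python as executable pseudo-code), that detector might look like the following. The 60-sample window and 200 ms default threshold are only the rough guesses discussed above; a real game would tune both through playtesting.

    ```python
    from collections import deque

    class LagDetector:
        """Moving average of input-to-authoritative-update round trips."""

        def __init__(self, window=60, threshold_s=0.2):
            self.samples = deque(maxlen=window)   # most recent round trips
            self.threshold_s = threshold_s        # tune per game via testing

        def record(self, input_time, server_update_time):
            # Difference between when the input originated and when the
            # authoritative server state based on it came back.
            self.samples.append(server_update_time - input_time)

        def moving_average(self):
            return sum(self.samples) / len(self.samples) if self.samples else 0.0

        def is_laggy(self):
            # The client is chronically behind the server when the average
            # round trip stays above the threshold.
            return self.moving_average() > self.threshold_s
    ```

    Call record() each time an authoritative reply arrives for a time-stamped input (step 5 in the pipeline), and check is_laggy() once a frame or so to decide whether the experience should be flagged as laggy.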