
Server and client timing differ


I have a client-server setup where each side measures time independently, and the measurements drift apart. Long story short: there is a countdown, after which the program needs to do things. I measure the countdown on the server side. But the countdown also needs to be displayed, so I run it separately on the client side. The end result is that on a 10-minute countdown, when the server sends the message signalling that time is up, the client is still showing 23 seconds remaining.

The client side is XNA. Code:

// Accumulate elapsed game time; once a full second has passed,
// carry over the remainder and advance the whole-second counter.
MillisecCount += gameTime.ElapsedGameTime.Milliseconds;
if (MillisecCount >= 1000)
{
    MillisecCount -= 1000;
    Timer++;
}

Timer is then subtracted from the available time, and the difference is displayed. On the server side, this is happening:

async Task timeOut(int delay, CancellationToken ct)
{
    // Note: the token is not passed to Task.Delay, so cancellation is
    // only observed once the full delay has elapsed.
    await Task.Delay(1000 * delay);

    // If cancellation was requested, this throws and the task ends up
    // Canceled, so the OnlyOnRanToCompletion continuation never fires.
    ct.ThrowIfCancellationRequested();
}

void sendTimeOutMessage(Task t)
{
    //Send timeout message on network.
}

void reTime()
{
    // Keep the token source around so a later call can cancel this timeout.
    CancellationTokenSource cts = new CancellationTokenSource();
    CancelStack.Push(cts);
    Task t = timeOut(maxTime - Timer, cts.Token);
    t.ContinueWith(sendTimeOutMessage, TaskContinuationOptions.OnlyOnRanToCompletion);
}
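The question doesn't show how a scheduled timeout gets cancelled, but given the CancelStack of token sources, a pop-and-cancel helper along these lines is presumably the counterpart; the method name here is an assumption:

void cancelTimeOut()
{
    // Hypothetical helper: cancel the most recently scheduled timeout.
    // Because timeOut only checks its token after the delay completes,
    // the task ends up Canceled and sendTimeOutMessage is never invoked.
    if (CancelStack.Count > 0)
    {
        CancellationTokenSource cts = CancelStack.Pop();
        cts.Cancel();
    }
}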

In the test scenario with the 23-second difference, reTime() is called only once, at the start of the countdown.


Solution

  • Okay, so it turns out there are times when XNA isn't counting time, which made the client side run somewhat slow. Using a Stopwatch on the client side instead keeps the two sides in sync.
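
    A minimal sketch of what that change might look like, assuming the Timer field and Update loop from the question; StartCountdown is a hypothetical helper name:

    using System.Diagnostics;

    // Fields on the game class; Timer is the counter from the question.
    Stopwatch countdownWatch;
    int Timer;

    // Hypothetical helper: call this when the countdown starts.
    void StartCountdown()
    {
        // Stopwatch measures wall-clock time, so it keeps running even
        // when XNA's gameTime accounting stalls.
        countdownWatch = Stopwatch.StartNew();
    }

    protected override void Update(GameTime gameTime)
    {
        if (countdownWatch != null)
        {
            // Derive whole elapsed seconds from the stopwatch instead of
            // accumulating ElapsedGameTime.Milliseconds.
            Timer = (int)countdownWatch.Elapsed.TotalSeconds;
        }
        base.Update(gameTime);
    }

    Since the stopwatch tracks real time rather than frames the game actually processed, its count matches the server's Task.Delay clock.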