I'm writing a ping application which measures packet receive/send times between two computers (A and B). It measures the A->B->A, A->B and B->A times. I've run into a problem when calculating the average B->A time: it shows negative values, i.e. computer B stamps a greater ("later") time onto the packet than the time computer A records when it receives the packet back. I'm taking time values using my custom class:
public class TimingClass {
    public static long getTime() {
        return System.currentTimeMillis();
    }
}
The output of the application on the client side looks like this:
Time of start: 1483531410095 // Time when the packet was sent from the client
Time on B: 1483531410538 // Time taken on the server and appended to the message
Packet arrival time: 1483531410104 // Time taken when the packet arrived back on the client
13:03:21: Total=272 Rate=30/s Lost=0 AvgRTT=35ms MaxRTT=63 A->B=449 B->A=-414
As you can see, the timestamp the client records when the message returns from the server is smaller than the timestamp taken on the server itself.
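Doing the subtraction with the timestamps printed above makes the problem obvious (the variable names here are just for illustration, not from my code):

long start   = 1483531410095L; // sent from the client (A)
long onB     = 1483531410538L; // stamped on the server (B)
long arrival = 1483531410104L; // back on the client (A)

long aToB = onB - start;     //  443 ms, inflated by B's clock offset
long bToA = arrival - onB;   // -434 ms, deflated by the same offset
long rtt  = arrival - start; //    9 ms, both timestamps come from A's clock

So for this single packet the B->A value is already negative, even though the round trip itself is only 9 ms.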
Here is the client/server code, only the parts that receive/send messages:
// This is the sender thread on the client sending 30 msgs/s
msgHandler.writeMessage(createMessage());

private String createMessage() {
    long time = TimingClass.getTime();
    String timeStr = Long.toString(time);
    String message = "payload";
    messageId++;
    return messageId + "%" + timeStr + "-" + message;
}
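For reference, with messageId at 1 a message produced by createMessage() looks like this:

1%1483531410095-payload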
// This is the server part running on the main thread
while ((msg = msgHandler.readMessage()) != null) {
    catcherTime = TimingClass.getTime();
    System.out.println("Message arrived: " + msg);
    msgHandler.writeMessage(appendTime(msg, catcherTime));
}
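appendTime() isn't included above; roughly, it just appends the server's timestamp to the echoed message, something like this (the separator is only illustrative):

private String appendTime(String msg, long catcherTime) {
    // Append the server's timestamp so the client can split it back out later.
    return msg + "#" + catcherTime;
}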
// This is the receiving thread on the client side
while ((line = messageIO.readMessage()) != null) {
    currentTime = TimingClass.getTime();
    diff = currentTime - initTime;
    messageAcc++;
    numberOfMsgs++;
    bufferElement.addListElement(currentTime, line);
    if (diff > 1000) {
        initTime = TimingClass.getTime();
        bufferElement.setMsgAcc(messageAcc);
        bufferElement.setMsgNumber(numberOfMsgs);
        queue.put(bufferElement);
        numberOfMsgs = 0;
        bufferElement = new BufferQueueElement();
    }
}
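For comparison, the RTT of a single packet can be computed from the client's clock alone, since both the send timestamp and the arrival timestamp are taken on A. A rough sketch, assuming the send timestamp is parsed back out of the echoed line (the parsing below is illustrative and depends on the exact message format):

private static long parseSendTime(String line) {
    // Illustrative parsing of the "id%sendTime-payload..." format built in createMessage()
    int from = line.indexOf('%') + 1;
    int to = line.indexOf('-', from);
    return Long.parseLong(line.substring(from, to));
}

// Both timestamps come from A's clock, so this can never go negative:
long rtt = currentTime - parseSendTime(line);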
From the code, the printed values correspond to the time variables above as follows:
time -> Time of start
catcherTime -> Time on B
currentTime -> Packet arrival time
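The one-way values in the stats line can only come from differences like these (again, the names are just illustrative):

long aToB = catcherTime - time;        // crosses from A's clock to B's clock
long bToA = currentTime - catcherTime; // crosses back from B's clock to A's clock
long rtt  = currentTime - time;        // stays entirely on A's clock

// aToB + bToA == rtt, so any offset between the two clocks cancels out in the RTT,
// but it is added to one one-way value and subtracted from the other.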
The client processes the messages once every second. So, does anyone have any experience with this kind of problem, or know how to work around or solve it?
P.S.
I've tried using System.nanoTime(), but then the A->B average time starts showing negative values. Also, one machine is running Windows 10 (the client), the other Windows 8.1 (the server), and they are connected via a home network. The minimum Java version I'm developing for is 5; both machines have Java 8. Both machines are also synced against time.windows.com; I did that manually before running the app to be sure they were in sync.
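Note: System.nanoTime() has an arbitrary origin per JVM, so values taken on A and B can't be meaningfully compared at all; it only measures intervals on a single machine, for example:

long sendNanos = System.nanoTime();            // taken on A when the packet is sent
// ... packet travels A -> B -> A ...
long rttMillis = (System.nanoTime() - sendNanos) / 1000000L; // same monotonic clock, always >= 0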
This is a known problem in computer science; see Lamport timestamps. From the wiki link:
In a distributed system, it is not possible in practice to synchronize time across entities (typically thought of as processes) within the system; hence, the entities can use the concept of a logical clock based on the events through which they communicate.
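If you need meaningful one-way figures without trusting OS-level clock sync, the usual workaround is to estimate the clock offset from the same four timestamps the application already collects, the way NTP does, and subtract it out. A minimal sketch, where t0 is the send time on A, t1/t2 are the receive/send times on B (a single catcherTime in the code above), and t3 is the arrival time back on A:

static long[] oneWayDelays(long t0, long t1, long t2, long t3) {
    // NTP-style offset estimate: positive means B's clock runs ahead of A's.
    long offset = ((t1 - t0) + (t2 - t3)) / 2;
    long aToB = (t1 - t0) - offset; // corrected A->B delay
    long bToA = (t3 - t2) + offset; // corrected B->A delay
    return new long[] { aToB, bToA };
}

This assumes the network delay is roughly symmetric (it effectively splits the RTT in half), but it never produces negative values, and the RTT itself is unaffected since it is computed from A's clock alone.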