I have one machine sending packets at a constant rate (one every x milliseconds) to another one. I need to measure the one-way delay for each packet. One idea could be for the first machine to record the instant when it sent the first packet, and once all transfers have been done, let the second machine know about it, so it can compute the delay for each packet starting from that value.
Do I need any specific clock synchronization here? I don't know whether this approach will introduce a considerable error in my measurements.
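To make the idea concrete, here is a small simulation of the scheme (all names are mine, purely for illustration). Note it assumes both machines read the same clock, which is exactly the part I'm unsure about:

```python
import random

def simulate_measurement(n_packets, interval_s, true_delays, t0):
    """Simulate the scheme: the sender records t0 (send time of the first
    packet) and sends one packet every interval_s seconds; the receiver
    records each arrival time, learns t0 after all transfers, and
    reconstructs each delay as arrival_i - (t0 + i * interval_s)."""
    # Receiver-side arrival timestamps. Sender and receiver share one
    # clock in this simulation; in reality they would not.
    arrivals = [t0 + i * interval_s + d for i, d in enumerate(true_delays)]
    # After all transfers, the receiver is told t0 and recomputes delays.
    return [a - (t0 + i * interval_s) for i, a in enumerate(arrivals)]

true_delays = [0.010 + random.uniform(0, 0.002) for _ in range(5)]
measured = simulate_measurement(5, 0.001, true_delays, t0=100.0)
print(all(abs(m - d) < 1e-9 for m, d in zip(measured, true_delays)))  # True
```

With a shared clock the reconstruction is exact; my worry is what happens when the two machines' clocks disagree.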
You'll have a tiny systematic error in the measurement because of the confirmation message needed, but it will be small. With real-world latencies this totally works. With a as the one-way packet latency and b as the latency of the confirmation message, over 10000 packets the calculated latency is (a*10000 + a + b)/(10000 + 1 + 1). For a ≈ b this term becomes just a.
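Checking the arithmetic with illustrative numbers (a = b = 20 ms, values chosen by me, not from the setup above):

```python
a = 0.020   # one-way latency of each data packet (illustrative value)
b = 0.020   # latency of the confirmation message (illustrative value)
n = 10000   # number of data packets

calculated = (a * n + a + b) / (n + 1 + 1)
print(calculated)           # ~0.02, i.e. essentially a
print(abs(calculated - a))  # negligible when a and b are close
```

Even with b several times larger than a, the denominator of 10002 keeps the systematic error in the microsecond range.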
Alternatively, tightly synchronize the clocks (difficult) and send timestamps.
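A minimal sketch of the timestamp approach (hypothetical code, assuming the two clocks are already tightly synchronized, e.g. via NTP or PTP):

```python
import struct
import time

def make_packet(payload: bytes) -> bytes:
    # Prepend the sender's wall-clock send time (8-byte double, seconds).
    return struct.pack("!d", time.time()) + payload

def one_way_delay(packet: bytes) -> float:
    # Receiver side: delay = receive time - embedded send time.
    # Only meaningful if the two machines' clocks agree closely.
    (sent,) = struct.unpack("!d", packet[:8])
    return time.time() - sent

pkt = make_packet(b"hello")
print(one_way_delay(pkt))  # near zero when sender and receiver share a clock
```

Any offset between the two clocks shows up directly as a bias in every measured delay, which is why the synchronization has to be tight.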