
measuring one-way packet delay: any synchronization needed?


I have one machine sending packets at a constant rate (one every x milliseconds) to another. I need to measure the one-way delay of each packet. One idea would be for the first machine to record the instant it sends the first packet and, once all transfers are done, pass that timestamp to the second machine; since the packets are sent at a known, constant rate, the receiver could then compute the delay of every packet from that single value.

Do I need any specific synchronization here? I don't know whether this approach will introduce a considerable error into my measurements.
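Roughly what I have in mind, as a sketch over UDP (the address, port, and interval are made up):

```python
import socket
import struct
import time

X_MS = 10                       # send interval in ms (placeholder value)
DEST = ("192.0.2.1", 5005)      # hypothetical receiver address

def sender(n_packets=10000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    t0 = time.time()            # instant the first packet goes out
    for i in range(n_packets):
        sock.sendto(struct.pack("!I", i), DEST)  # packet carries its index
        time.sleep(X_MS / 1000.0)
    return t0                   # shipped to the receiver after the run

def receiver(t0, n_packets=10000, port=5005):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    delays = []
    for _ in range(n_packets):
        data, _addr = sock.recvfrom(64)
        (i,) = struct.unpack("!I", data)
        expected_send = t0 + i * X_MS / 1000.0   # when packet i left the sender
        delays.append(time.time() - expected_send)
    return delays
```

Note that the receiver ends up comparing its own clock against the sender's t0, which is exactly why I'm unsure whether synchronization is needed.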


Solution

    1. Send 10k one-way packets to the destination.
    2. Send a "finished" message to the destination.
    3. The destination replies with "all received".
    4. Once the reply arrives, you know how long the whole exchange took.

    You'll have a small systematic error in the measurement because of the extra control messages, but it is tiny, and with real-world latencies it works fine. Let a be the one-way latency to the destination and b the latency of the reply: the total measured time covers 10000 + 1 + 1 messages, so the calculated per-packet latency is (a*10000 + a + b)/(10000 + 1 + 1). For a ≈ b this comes out to exactly a.
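A rough sketch of the four steps over UDP (addresses, port, and payloads are made up; real code would also want a timeout/retransmit for the control messages):

```python
import socket
import time

N = 10_000
DEST = ("192.0.2.1", 5005)      # hypothetical destination address

def destination(port=5005):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, addr = sock.recvfrom(64)
        if data == b"finished":             # step 3: confirm the stream ended
            sock.sendto(b"all received", addr)

def measure(dest=DEST, n=N):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(5.0)
    start = time.perf_counter()
    for _ in range(n):
        sock.sendto(b"data", dest)          # step 1: the n one-way packets
    sock.sendto(b"finished", dest)          # step 2: signal end of stream
    sock.recvfrom(64)                       # step 3: wait for "all received"
    elapsed = time.perf_counter() - start   # step 4: total elapsed time
    # elapsed ~ n*a + a + b, with a the forward and b the return latency,
    # so the per-packet estimate elapsed / (n + 2) is ~ a when a ~ b
    return elapsed / (n + 2)
```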

    Alternatively, tightly synchronize the clocks (difficult) and send timestamps.
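A minimal sketch of the timestamp variant (addresses made up); the computed delay is only as accurate as the clock synchronization between the two machines, e.g. via NTP or PTP:

```python
import socket
import struct
import time

def send_timestamped(dest=("192.0.2.1", 5005)):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # embed the wall-clock send time in the packet
    sock.sendto(struct.pack("!d", time.time()), dest)

def recv_timestamped(port=5005):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    data, _addr = sock.recvfrom(64)
    (sent,) = struct.unpack("!d", data)
    # any offset between the two clocks shows up directly in this value
    return time.time() - sent
```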