In a distributed system, two nodes A and B want to synchronize their clocks. The communication delay along the link from A to B is 40 ms, and from B to A it is 20 ms.
These delays are unknown to A and B. The nodes use Cristian's algorithm to synchronize their clocks. Node A's clock reads 500 ms and B's clock reads 632 ms,
and node A is the initiator. After completion, what time does A show?
This sounds like a very academic question.
In practice, it is the variance in the delay that is the problem, not the latency itself. Still, the rule is that A can only observe the RTT (Round Trip Time), which is 40 + 20 = 60 ms. Cristian's algorithm assumes a symmetric 30 ms/30 ms split, so A sets its clock to B's reported time plus RTT/2. Working through the numbers: A sends its request at 500 ms; it arrives at B 40 ms later, when B's clock reads 632 + 40 = 672 ms, and B replies with 672. The reply reaches A 20 ms after that, at which point A sets its clock to 672 + 30 = 702 ms. Since B's clock actually reads 692 ms at that moment, A ends up 10 ms fast. Note that only the initiator A adjusts; the error is half the difference between the two one-way delays, (40 − 20)/2 = 10 ms.
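The exchange can be sketched as a tiny simulation (variable names are mine; clocks are tracked in milliseconds, and the one-way delays are visible to the simulation but not to the nodes):

```python
# Cristian's algorithm with asymmetric one-way delays.
# Delays and starting clocks match the question; only node A
# (the initiator) adjusts its clock at the end.

DELAY_A_TO_B = 40  # ms, unknown to the nodes themselves
DELAY_B_TO_A = 20  # ms, unknown to the nodes themselves

clock_a = 500  # A's clock when it sends the request
clock_b = 632  # B's clock at that same real instant

# Request travels A -> B: both clocks advance by the transit time.
clock_a += DELAY_A_TO_B
clock_b += DELAY_A_TO_B
t_server = clock_b  # B replies immediately with its current time (672)

# Reply travels B -> A.
clock_a += DELAY_B_TO_A
clock_b += DELAY_B_TO_A

rtt = clock_a - 500            # A can only measure the round trip: 60 ms
clock_a = t_server + rtt // 2  # Cristian's estimate: 672 + 30 = 702

print(clock_a)            # 702
print(clock_a - clock_b)  # 10 -> A ends up 10 ms fast
```

Running it confirms the answer: A shows 702 ms, which is 10 ms ahead of B's 692 ms.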