I am preparing for an exam in networking.
In one of the previous exams this question was given:
Assume you're sending a packet of length 4000 bits
through a cable of length 1000 km.
The signal in the cable travels at 200000 km/s.
The signal bandwidth is 10 Mbit/s.
Calculate how much time it takes for the packet to arrive.
If I were doing this with a car, considering road length and car speed, it would just be 1000 km / 200000 km/s = 0.005 seconds. But I am not sure how to apply the Mbit/s and the bits in the calculation.
Is this a correct way of doing it?
(10 Mbit/s / 4000 bit) * (200000 km/s / 1000 km) = seconds the packet needs to arrive
The total transfer time equals SEND_TIME + PROPAGATION_TIME (the transmission delay plus the propagation delay).
(I use Mbit as 10^6 bits for simplicity instead of 2^20; the principle remains the same.)
SEND_TIME = #bits / #bits_per_sec = 4000 / (10*10^6) = 4*10^-4 s
PROPAGATION_TIME = length / propagation_speed = 1000 / 200000 = 5*10^-3 s
The total is then 0.0004 + 0.005 = 0.0054 seconds.
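To make the two terms concrete, here is a minimal sketch of the same calculation in plain Python (the variable names are mine; the values come from the question):

```python
# Transmission delay + propagation delay for the exam's numbers.
packet_size_bits = 4000              # packet length in bits
bandwidth_bps = 10 * 10**6           # link rate: 10 Mbit/s
cable_length_km = 1000               # cable length in km
signal_speed_km_s = 200_000          # propagation speed of the signal in the cable

send_time = packet_size_bits / bandwidth_bps            # time to push all bits onto the wire
propagation_time = cable_length_km / signal_speed_km_s  # time for a bit to travel the cable
total = send_time + propagation_time

print(f"send: {send_time:.4f} s, propagation: {propagation_time:.4f} s, total: {total:.4f} s")
# -> send: 0.0004 s, propagation: 0.0050 s, total: 0.0054 s
```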
Bonus:
A good practice is to carry the units through the calculation and make sure you end up with the correct unit at the end, so the above is actually:
SEND_TIME = #bits / #bits_per_sec = 4000 [bit] / 10*10^6 [bit/sec] = 4*10^-4 [bit * sec / bit] = 0.0004 [sec]
PROPAGATION_TIME = length / speed = 1000 [km] / 200000 [km/sec] = 5*10^-3 [km * sec / km] = 0.005 [sec]
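If you want that unit bookkeeping checked programmatically, here is a small sketch. It assumes the third-party `pint` library (pip install pint), which is only used to show the units cancelling; it is not part of the answer above.

```python
# Same calculation, but with explicit units so the cancellation is verified by the library.
import pint

ureg = pint.UnitRegistry()

packet = 4000 * ureg.bit
bandwidth = 10e6 * ureg.bit / ureg.second
length = 1000 * ureg.kilometer
speed = 200_000 * ureg.kilometer / ureg.second

send_time = (packet / bandwidth).to(ureg.second)       # [bit] / [bit/sec] -> [sec]
propagation_time = (length / speed).to(ureg.second)    # [km] / [km/sec] -> [sec]

print(send_time)                       # ≈ 0.0004 second
print(propagation_time)                # ≈ 0.005 second
print(send_time + propagation_time)    # ≈ 0.0054 second
```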