Consider a 100 Mbps link between an earth station (sender) and a satellite (receiver) at an altitude of 2100 km. The signal propagates at a speed of 3 × 10^8 m/s. The time taken (in milliseconds, rounded off to two decimal places) for the receiver to completely receive a packet of 1000 bytes transmitted by the sender is _________.

Correct Answer:

7.08

Solution:

Given,
Data Size = 1000 Bytes = 8000 Bits
Bandwidth = 100 Mbps = 10^8 bps

The time required for the receiver to completely receive the packet is the transmission time (Tt) plus the propagation time (Tp).

Tt = Data Size / Bandwidth = 8000 bits / 10^8 bps = 0.08 ms

Tp = Distance / Propagation Speed = (2100 × 1000 m) / (3 × 10^8 m/s) = 7 ms

Total time = Tt + Tp = 0.08 ms + 7 ms = 7.08 ms
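
As a quick sanity check, here is a minimal Python sketch (not part of the original solution) that plugs the given values into the same two formulas and prints the delays in milliseconds:

```python
# Verify: total time = transmission delay + propagation delay

DATA_SIZE_BITS = 1000 * 8        # 1000 bytes -> 8000 bits
BANDWIDTH_BPS = 100 * 10**6      # 100 Mbps = 10^8 bits per second
DISTANCE_M = 2100 * 1000         # satellite altitude: 2100 km in metres
SIGNAL_SPEED_MPS = 3 * 10**8     # propagation speed of the signal

transmission_time_s = DATA_SIZE_BITS / BANDWIDTH_BPS   # Tt in seconds
propagation_time_s = DISTANCE_M / SIGNAL_SPEED_MPS     # Tp in seconds

total_ms = (transmission_time_s + propagation_time_s) * 1000
print(f"Tt    = {transmission_time_s * 1000:.2f} ms")  # 0.08 ms
print(f"Tp    = {propagation_time_s * 1000:.2f} ms")   # 7.00 ms
print(f"Total = {total_ms:.2f} ms")                    # 7.08 ms
```

Running it reproduces the answer of 7.08 ms, with the propagation delay clearly dominating the transmission delay.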