Electronics - Alternating Current vs Direct Current - Discussion

35. 

How long would it take to transmit an electromagnetic wave to a receiving antenna 1,000 miles away?

[A]. 5.38 ms
[B]. 10.8 ms
[C]. 53.8 ms
[D]. 108 ms

Answer: Option A

Explanation:

An electromagnetic wave travels at the speed of light, roughly 186,000 miles per second (about 3 * 10^8 m/s). Travel time = distance / speed = 1,000 mi / 186,000 mi/s = 5.38 ms.

Zahra said: (Jul 18, 2011)  
Can someone explain it please?

Vid said: (Sep 10, 2011)  
time = distance / speed
distance = 1000 miles
1 km = 0.76 mile
time = 1000 * 1000 / ((3 * 10^8) * 0.76)

I got 4.3 ms

Kumar said: (Apr 24, 2012)  
Time = distance / speed.
Distance = 1000 miles, and 1 mile = 1.609344 km = 1.609344 * 1000 m.
So the time will be
t = (1.609344 * 1000 * 1000) / (3 * 10^8)
t = 5.36 ms
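
As a quick sanity check, here is the same arithmetic as a short Python sketch; it simply assumes c = 3 * 10^8 m/s and 1 mile = 1609.344 m:

# Time for an EM wave to cover 1,000 miles, working in SI units.
C_M_PER_S = 3e8              # approximate speed of light in m/s
METERS_PER_MILE = 1609.344   # metres in one statute mile

distance_m = 1000 * METERS_PER_MILE   # 1,000 miles in metres
t_seconds = distance_m / C_M_PER_S    # time = distance / speed
print(f"{t_seconds * 1e3:.2f} ms")    # prints 5.36 ms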

Suresh said: (Nov 15, 2016)  
Please give me a brief explanation.

Anomie said: (Sep 25, 2018)  
Treat the 1,000-mile path as one wavelength: frequency = speed of light (in miles per second) / wavelength.

f = 186,000 / 1,000 = 186 per second.
Period T = 1/f.
T = 1/186 s = 5.38 ms, which is the travel time.
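
The same answer drops out directly if you stay in miles, using Anomie's figure of about 186,000 miles per second for the speed of light (again just a sketch):

# Same calculation kept in miles, assuming c = 186,000 mi/s.
C_MILES_PER_S = 186_000      # approximate speed of light in miles per second

t_seconds = 1000 / C_MILES_PER_S      # time = distance / speed
print(f"{t_seconds * 1e3:.2f} ms")    # prints 5.38 ms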
