Radio signals travel at a rate of #3*10^8# meters per second. How many seconds would it take a signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of #4.2 * 10^7# meters?

1 Answer
May 27, 2018

#"time" = 0.14 s#

Explanation:

The speed of radio signals is #3*10^8 m/s#.
The distance those signals have to travel to reach the surface of Earth is #4.2*10^7 m#.

The formula that describes speed is
#"speed" = "distance"/"time"#.

Rearranging that formula to solve for time gives
#"time" = "distance"/"speed"#

So, plugging in the data,
#"time" = (4.2*10^7 cancel(m))/(3*10^8 cancel(m)/s) = 1.4*10^-1 s = 0.14 s#

I hope this helps,
Steve