Radio signals travel at a rate of #3*10^8# meters per second. How many seconds would it take a signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of #4.2*10^7# meters?
May 27, 2018
Explanation:
The speed of radio signals is

#v = 3*10^8" m/s"#

The distance those signals have to travel to reach the surface of Earth is

#d = 4.2*10^7" m"#

The formula that describes speed is

#v = d/t#

Apply a little algebra to that formula and you can have

#t = d/v#

So, plugging in the data,

#t = (4.2*10^7" m")/(3*10^8" m/s") = 0.14" s"#
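The arithmetic can be checked with a short Python snippet (the variable names are my own, not from the problem):

```python
# Travel time of a radio signal from a satellite to Earth's surface.
# Values taken from the problem statement.
signal_speed = 3e8   # m/s, speed of radio waves (speed of light)
altitude = 4.2e7     # m, orbital height of the satellite

# t = d / v
travel_time = altitude / signal_speed
print(travel_time)  # 0.14 seconds
```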
I hope this helps,
Steve