A radio signal travels at 3.00 × 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 4.2 × 10^7 meters?
A. 1.23 × 10^16 seconds
B. 1.26 × 10^15 seconds
C. 1.4 × 10^1 seconds
D. 1.4 × 10^-1 seconds


Answer:

D. 1.4 × 10^-1 seconds

Step-by-step explanation:

We know that distance = rate × time.

We know the rate = 3.00 × 10^8 meters per second

and the distance = 4.2 × 10^7 meters.


distance = rate × time

4.2 × 10^7 meters = 3.00 × 10^8 meters per second × time

Divide each side by 3.00 × 10^8:

(4.2 × 10^7) / (3.00 × 10^8) = (3.00 × 10^8) / (3.00 × 10^8) × time

(4.2 × 10^7) / (3.00 × 10^8) = time

(4.2 / 3.00) × (10^7 / 10^8) = time

When we divide powers of the same base, we subtract the exponents and keep the base:

1.4 × 10^(7-8) = time

1.4 × 10^-1 seconds = time, that is, 0.14 seconds.
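As a quick sanity check, here is a minimal Python sketch of the same computation (the variable names are illustrative, not part of the original problem):

```python
# time = distance / rate, using the values from the problem
rate = 3.00e8      # speed of the radio signal, in meters per second
distance = 4.2e7   # height of the satellite's orbit, in meters

time = distance / rate
print(f"{time:.2e} seconds")  # prints 1.40e-01 seconds, matching choice D
```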
