Traveling at an average speed of 50 miles per hour, the trip from point A to point B takes 2 hours. Traveling at an average speed of 40 miles per hour, the same trip takes 2.5 hours.


If time, y, varies inversely with speed, x, how long will the trip take traveling at 45 miles per hour? Round your answer to the nearest hundredth, if necessary.

Answer:

Assuming the traveling speed is constant, the law relating distance, speed, and time is

[tex] s = vt [/tex]
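Solving this for the time makes the inverse variation from the problem statement explicit:

[tex] t = \cfrac{s}{v} [/tex]

which has exactly the form y = k/x, with the constant of variation k equal to the distance s.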

So, the sentence "Traveling at an average speed of 50 miles per hour, the trip from point A to point B takes 2 hours" translates to

[tex] s = 50\cdot 2 [/tex]

from which we deduce that A and B are 100 miles apart. In fact, we also have

[tex] s = 40\cdot 2.5 [/tex]

which again yields s = 100.
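In the language of the problem, both trips therefore fix the same constant of variation:

[tex] k = xy = 50\cdot 2 = 40\cdot 2.5 = 100 [/tex]

so the model is y = 100/x.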

The question then reverses the perspective: it asks for the time needed to travel at a given speed. Since we now know the distance is 100 miles, traveling at 45 miles per hour gives

[tex] 100 = 45t [/tex]

from which we can deduce

[tex] t = \cfrac{100}{45} = 2.\overline{2} [/tex]

which, rounded to the nearest hundredth, is 2.22 hours.
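As a quick sanity check, the rounded time nearly reproduces the 100-mile distance:

[tex] 45\cdot 2.22 = 99.9 \approx 100 [/tex]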
