Answer:
The total distance he drove is 350 miles, with an average speed of 46.67 miles/hour.
Step-by-step explanation:
Let [tex]t_1[/tex] and [tex]t_2[/tex] be the times spent driving at the faster rate and at the slower rate, respectively.
Let the total distance be [tex]d[/tex] miles.
The speed for the first 150 miles (the faster rate) = 60 miles/hour.
So, [tex]150=60\times t_1[/tex] [ as distance = speed x time]
[tex]\Rightarrow t_1=2.5[/tex] hours.
The remaining distance [tex]= d-150[/tex] miles.
Speed for the remaining distance (the slower rate) = 40 miles/hour.
As the time he spent driving at the slower speed was twice the time he spent driving at the faster speed,
the time of driving at the slower rate is [tex]t_2 = 2t_1=2\times2.5=5[/tex] hours.
So, [tex]d-150=40\times t_2[/tex] [as distance = speed × time]
[tex]\Rightarrow d-150=40\times 5=200[/tex]
[tex]\Rightarrow d = 200+150=350[/tex] miles.
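As a quick sanity check, here is a short Python sketch of the distance calculation (the variable names are illustrative, not part of the original problem):

```python
# Sanity check for the distance calculation (illustrative sketch).
faster_speed = 60    # miles/hour, speed for the first 150 miles
slower_speed = 40    # miles/hour, speed for the remaining distance
first_leg = 150      # miles

t1 = first_leg / faster_speed        # time at the faster rate: 2.5 hours
t2 = 2 * t1                          # slower-rate time is twice the faster-rate time: 5 hours
d = first_leg + slower_speed * t2    # total distance

print(t1, t2, d)   # 2.5 5.0 350.0
```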
The average speed of the journey = (Total distance)/(Total time taken)
[tex]=d/(t_1+t_2)[/tex]
[tex]=350/(2.5+5)[/tex]
[tex]=350/7.5[/tex]
[tex]\approx 46.67[/tex] miles/hour.
Hence, the total distance he drove is 350 miles, with an average speed of 46.67 miles/hour.
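For completeness, the average-speed step can be checked the same way (again just a sketch reusing the numbers derived above):

```python
# Average speed = total distance / total time (illustrative sketch).
total_distance = 350     # miles
total_time = 2.5 + 5     # hours (t1 + t2)

average_speed = total_distance / total_time
print(round(average_speed, 2))   # 46.67
```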