Many major-league baseball pitchers can throw the ball at 90 miles per hour. At that speed, how long does it take a pitch to travel from the pitcher’s mound to home plate, a distance of 60 feet 6 inches? Give your answer to the nearest hundredth of a second. There are 5280 feet in a mile and 12 inches in a foot.

Answer:

To solve this problem, we must first convert the distance into miles:

distance = 60 ft × (1 mile / 5280 ft) + 6 in × (1 ft / 12 in) × (1 mile / 5280 ft)

distance ≈ 0.0114583 miles

 

To calculate the time:

time = distance / speed

time = 0.0114583 mile / (90 miles / 3600 seconds)

time ≈ 0.458 seconds

time ≈ 0.46 seconds
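
If you want to double-check this arithmetic, here is a minimal Python sketch (not part of the original answer; the variable names are just illustrative) that reproduces the same result:

FEET_PER_MILE = 5280
INCHES_PER_FOOT = 12
SECONDS_PER_HOUR = 3600

# distance of 60 feet 6 inches, converted to miles
distance_miles = (60 + 6 / INCHES_PER_FOOT) / FEET_PER_MILE   # ≈ 0.0114583 miles

# speed of 90 miles per hour, converted to miles per second
speed_miles_per_second = 90 / SECONDS_PER_HOUR                # = 0.025 miles/second

time_seconds = distance_miles / speed_miles_per_second
print(round(time_seconds, 2))                                 # prints 0.46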

Answer: It would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.

Step-by-step explanation:

Since we have given that

Speed at which baseball pitchers can throw the ball = 90 miles per hour

Distance covered = 60 feet 6 inches

As we know that

1 mile = 5280 feet

and 1 foot = 12 inches

6 inches = [tex]\dfrac{6}{12}=\dfrac{1}{2}=0.5\ feet[/tex]

So, the total distance is 60 feet + 0.5 feet = 60.5 feet

Now,

1 foot = [tex]\dfrac{1}{5280}\ miles[/tex]

So, 60.5 feet = [tex]\dfrac{60.5}{5280}\approx 0.011458\ miles[/tex]

So, the time taken by the pitch to travel from the pitcher's mound to home plate is given by

[tex]\dfrac{\text{Distance}}{\text{Speed}}=\dfrac{0.011458}{90}\approx 0.0001273\ hours\\\\0.0001273\ hours\times 3600\ \dfrac{seconds}{hour}\approx 0.458\ seconds\approx 0.46\ seconds[/tex]

Hence, it would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.
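
As a quick sanity check (an alternative route, not part of the original solution), you can instead convert the speed to feet per second and divide the distance in feet by it:

[tex]90\ \dfrac{miles}{hour}\times \dfrac{5280\ feet}{1\ mile}\times \dfrac{1\ hour}{3600\ seconds}=132\ \dfrac{feet}{second}\\\\\dfrac{60.5\ feet}{132\ feet/second}\approx 0.458\ seconds\approx 0.46\ seconds[/tex]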
