A ball is thrown in the air from a height of 3 m with an initial velocity of 12 m/s. To the nearest tenth of a second, how long does it take for the ball to land on the ground?


Answer:  2.7 seconds

Step-by-step explanation:

The height equation is:  h(t) = ½at² + v₀t + h₀          where

  • a is the acceleration (in this case, gravity)
  • v₀ is the initial velocity
  • h₀ is the initial height

Note the ½ on the acceleration term; it comes from integrating a constant acceleration twice with respect to time.

Given:  

  • a = -9.81 m/s² (the acceleration due to gravity; if it wasn't given in your textbook, you can look it up)
  • v₀ = 12 m/s
  • h₀ = 3 m

Since we want to know when the ball lands on the ground, set h(t) = 0

EQUATION:   0 = -4.905t² + 12t + 3          (the coefficient is ½a = ½(-9.81) = -4.905)
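
As a quick sanity check (a minimal Python sketch of my own, not part of the textbook solution), the model should reproduce the given initial height at t = 0:

    # Height model: h(t) = ½at² + v₀t + h₀ with a = -9.81 m/s², v₀ = 12 m/s, h₀ = 3 m
    def h(t):
        return 0.5 * (-9.81) * t**2 + 12 * t + 3

    print(h(0))   # 3.0, the initial height, so the setup is consistent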

Use the quadratic formula to solve for t

                        a = -4.905, b = 12, c = 3

[tex]t=\dfrac{-b \pm \sqrt{b^2-4ac}}{2a}\\\\\\t=\dfrac{-(12)\pm \sqrt{(12)^2-4(-4.905)(3)}}{2(-4.905)}\\\\\\t=\dfrac{-12\pm \sqrt{202.86}}{-9.81}\approx\dfrac{-12\pm 14.24}{-9.81}\\\\\\t=\dfrac{-12+14.24}{-9.81}\approx -0.2\qquad t=\dfrac{-12-14.24}{-9.81}\approx\large\boxed{2.7}[/tex]

Note: the negative root (t ≈ -0.2 s) would be before the ball was thrown, so it is not valid; the ball lands after about 2.7 seconds.
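
If you want to double-check the arithmetic, here is a minimal Python sketch (my own verification, not part of the original solution) that solves the same quadratic with the formula above:

    import math

    # Coefficients of 0 = -4.905t² + 12t + 3
    a, b, c = -4.905, 12.0, 3.0

    disc = b**2 - 4*a*c       # discriminant: 144 + 58.86 = 202.86
    root = math.sqrt(disc)    # about 14.24

    t1 = (-b + root) / (2*a)  # about -0.23: before the throw, discard
    t2 = (-b - root) / (2*a)  # about 2.68: the landing time

    print(round(t2, 1))       # 2.7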