Hideki throws a baseball upward with an initial velocity of 12 meters/sec. When he throws the ball, it is two meters above the ground. How long will it take the ball to hit the ground?
For this case, the kinematic equation that models the problem is:

h(t) = (1/2) * (-9.8) * t^2 + 12 * t + 2

Rewriting:

h(t) = -4.9 * t^2 + 12 * t + 2

The ball hits the ground when h(t) = 0:

-4.9 * t^2 + 12 * t + 2 = 0

Solving with the quadratic formula, t = (-12 ± sqrt(12^2 + 4 * 4.9 * 2)) / (2 * (-4.9)) = (-12 ± sqrt(183.2)) / (-9.8), gives:

t1 = -0.16
t2 = 2.61

We discard the negative root because time cannot be negative.

Answer: The ball will take about t = 2.61 sec to hit the ground.
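As a quick numerical check, here is a minimal Python sketch that solves the same quadratic; the coefficients are taken directly from the equation above, and the variable names are just illustrative:

```python
import math

# Coefficients of h(t) = -4.9*t^2 + 12*t + 2
# (half of gravity, initial velocity, initial height)
a, b, c = -4.9, 12.0, 2.0

# Quadratic formula: t = (-b ± sqrt(b^2 - 4ac)) / (2a)
discriminant = b**2 - 4 * a * c
t1 = (-b + math.sqrt(discriminant)) / (2 * a)
t2 = (-b - math.sqrt(discriminant)) / (2 * a)

# Keep only the physically meaningful (non-negative) root
t_hit = max(t1, t2)
print(f"Roots: {t1:.2f} s, {t2:.2f} s")   # about -0.16 s and 2.61 s
print(f"Ball hits the ground after about {t_hit:.2f} s")
```

Running it prints roots of about -0.16 s and 2.61 s, matching the hand calculation.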