A bird is 1000 feet high and decides to descend at a rate of 2 feet per second. Write the equation for this, set it equal to zero, and solve for how long it will take the bird to land on the ground. Remember to use -2x to represent its descent. Set the equation equal to zero because that means it has landed and is no longer up in the air.

Answer:

500 seconds.

Explanation:

The bird drops 2 ft/sec, so its altitude decreases by 2 ft every second; that means the slope is -2. It starts out at 1000 ft, so that is the y-intercept, and we need to set y to 0, since that is the altitude when the bird lands. So in slope-intercept form

y = mx + b, we plug in our values:

0 = -2x + 1000

0 + 2x = -2x + 2x + 1000

2x = 1000

x = 500
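The solution above can be checked with a short sketch in Python (the function name `altitude` is just for illustration): it models the altitude as y = -2x + 1000 and confirms that the altitude reaches zero at x = 500 seconds.

```python
def altitude(x):
    """Bird's altitude in feet after x seconds, from y = -2x + 1000."""
    return -2 * x + 1000

# Solve 0 = -2x + 1000: add 2x to both sides, then divide by 2.
landing_time = 1000 / 2

print(landing_time)           # 500.0
print(altitude(landing_time)) # 0 -- the bird has landed
```

Plugging the solution back into the equation returns an altitude of 0, which matches the condition for landing.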