An old computer does a calculation in 0.5 seconds. A newer model does the same calculation in 0.2 seconds. What is the ratio of the time the old computer takes to the time the new computer takes?
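One way to set up the arithmetic (a sketch, assuming the stated times are exact and writing $t_{\text{old}}$ and $t_{\text{new}}$ for the two times):

$$
\frac{t_{\text{old}}}{t_{\text{new}}} = \frac{0.5\ \text{s}}{0.2\ \text{s}} = \frac{5}{2} = 2.5
$$

So the ratio is 5:2, meaning the old computer takes 2.5 times as long as the new one.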