Suppose compact fluorescent light bulbs last, on average, 10,000 hours. The standard deviation is 500 hours. What percent of light bulbs burn out within 11,000 hours?
How do you show your work for this problem? I don't know how to solve it.
11,000 hours is 2 standard deviations above the mean of 10,000, since (11,000 − 10,000) / 500 = 2.
By the empirical rule, about 95% of bulbs fall within 2 standard deviations of the mean (between 9,000 and 11,000 hours), which leaves about 2.5% above 11,000 hours. So the percent that burn out within 11,000 hours is roughly 100% − 2.5% = 97.5% (about 97.7% if you use a normal table). This is a statistics question, which I'm not very good with, so check over this.
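If you want to double-check the empirical-rule estimate with the exact normal distribution, here's a rough Python sketch (assuming you have scipy installed; the variable names are just my own):

```python
from scipy.stats import norm

mean = 10_000   # average bulb life in hours
sd = 500        # standard deviation in hours
cutoff = 11_000 # lifetime we care about

z = (cutoff - mean) / sd   # z-score: (11000 - 10000) / 500 = 2.0
p = norm.cdf(z)            # P(Z <= 2) for a standard normal

print(f"z = {z}, P(lifetime <= {cutoff} hours) = {p:.4f}")
# prints roughly 0.9772, i.e. about 97.7% of bulbs burn out within 11,000 hours
```

That matches the empirical-rule answer of about 97.5%, just a bit more precise.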