Answer:

According to depts.washington.edu:

World War I gave women a chance to show a male-dominated society that they could do more than simply raise children and stay at home. During the war, women played a vital role in keeping soldiers supplied with ammunition, and in many ways they kept the nation moving through their work in various industries.

Explanation:

Women took on many well-known roles, serving as nurses, factory workers, shipyard workers, and spies, as well as sewing bandages and selling war bonds.
