URGENT
What happened to jobs for women after the end of World War I?
- Women surrendered their jobs to returning soldiers.
- Women sought advanced training to get professional jobs.
- Women were promoted to leadership positions.
- Women were able to get better-paying jobs using skills learned while working during the war.