Answer:
Women and African Americans began working jobs that had traditionally been held by white men.
Explanation:
As World War II progressed, many men were drafted and became soldiers, leaving the jobs they had held vacant. To keep the workforce strong, women and African Americans were encouraged to take these jobs (factory work, for example), and many of them continued to hold jobs even after the war ended.