What happened to jobs for women after the end of World War I?

A. Women surrendered their jobs to returning soldiers.

B. Women sought advanced training to get professional jobs.

C. Women formed labor unions to fight discrimination in the workplace.

D. Women were able to get better-paying jobs using skills learned while working during the war.