Answer:

World War II changed the role of corporations in American life: when the fighting ended, companies that had been geared toward wartime production had to shift back to a peacetime economy, and many were unsure what came next. Hope this helped, have a great day! :D
Hey there,
World War II transformed the role of the federal government and the lives of American citizens. To secure industrial peace and stabilize war production, the federal government forced reluctant employers to recognize unions.

Hope this helps :))

~Top