Discussion Topic
World War II was one of the most significant events of the 1900s and one of the
most important events in US history. Think about how much the United States
changed between the Great Depression and the postwar era, when the country
had become an economic powerhouse. How did World War II influence and
change the identity of the United States throughout the 1900s and into the
present? What are some positive and negative changes that occurred in the
United States in the years after World War II?