Answer:
World War I had a drastic impact on how African Americans were viewed by the general public in the US. Although they had been badly mistreated before the war (and continued to be afterward, though to a lesser extent), they were increasingly recognized as important members of society.