Answer: One thing was certain: after the Spanish-American War, the United States would never be the same. For more than a century it had stood apart as an isolationist nation, an ocean away from the European powers, and in the wake of the Civil War it had emerged as an industrial behemoth. With its decisive rout of Spain and the acquisition of a far-reaching overseas empire, the United States had arrived as a major player on the world stage.
Hope this helped! :D