Answer:
Westward Expansion and Imperialism
Throughout most of the nineteenth century, the United States expanded its territory westward through purchase and annexation. At the end of the century, however, expansion became imperialism, as America acquired several territories overseas.
Explanation: Westward growth came through deals such as the Louisiana Purchase and the annexation of Texas. By the 1890s, the United States had acquired overseas territories including Hawaii, Puerto Rico, Guam, and the Philippines (the latter three following the Spanish-American War), marking the shift from continental expansion to imperialism.