Answer:

Manifest Destiny refers to the belief that it was America's God-given right to expand into the western areas of the continent and claim them as its own. Settling these lands despite the fact that Native Americans and Mexicans were already there required a great deal of rationalizing on America's part, hence the need to feel that God had sanctioned the expansion.