Business ruled the years after the Civil War. In the decades before the war, states had passed general incorporation laws allowing businesses to form corporations without obtaining a special charter from the legislature. After the Civil War, these corporations came to dominate much of American business and, in the process, to define American life.