During the second half of the nineteenth century, America became a continental empire. Between 1850 and 1912, seventeen new western states joined the Union, completing the formation of the contiguous United States. Hundreds of thousands of settlers flocked to these new regions, shifting the center of the country's population dramatically toward the West. The federal government facilitated this western movement in several ways. Most critically, in 1862, Congress passed the Homestead Act, the Pacific Railway Act, and the Morrill Land-Grant Act. All three used public lands to achieve national goals: western migration, the construction of a transcontinental railroad, and the development of state colleges.
The addition of these western territories and their integration into a national economy added enormously to the wealth and power of the United States. In addition, the "Wild West" and the "winning of the West" provided important themes in American culture. In 1890, an American census report declared the frontier "closed." Yet in the years that followed, the West assumed an even more prominent place within American culture. The mythology surrounding the West's homesteading pioneers, political heroes, and frontier clashes grew more elaborate. Similarly, efforts to preserve the West's natural conditions increased. The Sierra Club was formed in 1892 to protect America's wilderness areas. The National Park Service was created in 1916 to manage the nation's parks and ensure that they be left "unimpaired for the enjoyment of future generations."1
Why Should I Care?
In the 1980 film Urban Cowboy, wannabe cowgirl Debra Winger exults as she leaves the bar with John Travolta: "I got me a real cowboy."
Americans have been obsessed with "real cowboys" for more than a century. Some credit Owen Wister's 1902 classic The Virginian with launching this national fascination with the West and with "Westerns." But in truth, Americans have celebrated the breadth of their frontier, the beauties of nature, and the virtues acquired there for much longer. During the 1840s, transcendentalists held that Truth and Knowledge were best discovered in Nature. Henry David Thoreau preached that an individual's most authentic self could be realized only through a close relationship with the natural world.
The aura cast around the ruggedly independent cowboy and the self-driven pioneer reveals that our fascination with the frontier has not diminished. If anything, we have constructed an even more elaborate mythology about the West – about the hardy individualists who conquered this land, about the titanic battles fought against the forces of nature, and about the clashes between the various groups that laid claim to this vast space. Most consistently, we have celebrated the self-sufficiency and heroism supposedly bred in the West; the West is consistently portrayed as a land where "men are truly men," where authority is resented, and where even assistance is usually declined.
But did you know that during the nineteenth century the federal government played a more active role in subsidizing the West than in any other region of the country?
Did you know that despite all the government assistance, two-thirds of all homesteaders failed within the first five years?
Did you know that women received the right to vote first in America's western territories?
Did you know that massive corporations and conglomerates dominated many western territories and states, even in their infancy?
Did you know that within the democratic political insurrections of the West, small farmers and common laborers pleaded for more government support?
Between the mythology and reality of America's western past there are considerable gaps. But if you'd be so kind as to mosey on down this hyere Shmoop trail, you might figure out whar these gaps lie.