
The American West (2016) (TV Show)

Tagline: The Truth is Always Wilder
Genre: Western, History, Drama

The Western United States (also called the American West, the Far West, or simply the West) is the region comprising the westernmost states of the United States. As European settlement in the U.S. expanded westward over the centuries, the meaning of the term "the West" changed. Before about 1800, the crest of the Appalachian Mountains was seen as the western frontier. The frontier then moved westward, and eventually the lands west of the Mississippi River came to be considered the West.
