Wild West
noun
the western frontier region of the U.S., before the establishment of stable government.
Wild West
noun
the western US during its settlement, esp with reference to its frontier lawlessness
Word History and Origins
Origin of Wild West
An Americanism dating back to 1850–55