Wild West

noun

  1. the western frontier region of the U.S., before the establishment of stable government.



Wild West

noun

  1. the western US during its settlement, esp with reference to its frontier lawlessness

“Collins English Dictionary — Complete & Unabridged” 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012

Word History and Origins

Origin of Wild West

An Americanism dating back to 1850–55
