Wild West

noun

the western frontier region of the U.S., before the establishment of stable government.

Origin of Wild West

An Americanism dating back to 1850–55

Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2019

British Dictionary definitions for Wild West

Wild West

noun

the western US during its settlement, esp with reference to its frontier lawlessness
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012