West Coast


noun

the western coast of the U.S., bordering the Pacific Ocean and comprising the coastal areas of California, Oregon, and Washington.

Related forms

West-Coast, adjective