West Coast

noun

  1. the western coast of the U.S., bordering the Pacific Ocean and comprising the coastal areas of California, Oregon, and Washington.



Other Word Forms

  • West-Coast adjective
