West Coast
noun
the western coast of the U.S., bordering the Pacific Ocean and comprising the coastal areas of California, Oregon, and Washington.
Other Word Forms
- West-Coast adjective