West Indies
noun
- Also called the Indies. (used with a plural verb) an archipelago in the North Atlantic between North and South America, comprising the Greater Antilles, the Lesser Antilles, and the Bahamas.
- Also called West Indies Federation, Federation of the West Indies. (used with a singular verb) a former federation (1958–62) of the British islands in the Caribbean, comprising Barbados, Jamaica, Trinidad and Tobago, and the Windward and Leeward island colonies.
The archipelago is a popular resort area. Several of the islands were discovered by Christopher Columbus in 1492.
Other Word Forms
- West Indian adjective
Example Sentences
Bangladesh are due to play the West Indies in Kolkata on February 7, the opening day of the World Cup.
From Barron's
Morris also led England A on tours to South Africa, West Indies and Sri Lanka, while he resumed the Glamorgan captaincy in 1993.
From BBC
He also won three full England caps and captained England A on tours of South Africa, West Indies and Sri Lanka.
From BBC
This is the first time he has been left out of a Test since the 2022 tour of West Indies.
From BBC
Cummins suffered a back problem following the tour of West Indies in July.
From BBC
Definitions from Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2023