West Indies
noun
- Also called the Indies. (used with a plural verb) an archipelago in the northern Atlantic between North and South America, comprising the Greater Antilles, the Lesser Antilles, and the Bahamas.
- Also called West Indies Federation, Federation of the West Indies. (used with a singular verb) a former federation (1958–62) of the British islands in the Caribbean, comprising Barbados, Jamaica, Trinidad, Tobago, and the Windward and Leeward island colonies.
The West Indies are a popular resort area. Several of the islands were discovered by Christopher Columbus in 1492.
Other Word Forms
- West Indian adjective
Example Sentences
In one memorable passage on the French West Indies, he writes:
From The Wall Street Journal • Apr. 3, 2026
Plantation slavery was perfected in the West Indies, notably on the sugar islands of British Barbados and present-day Haiti, where the system proved immensely profitable.
From The Wall Street Journal • Apr. 3, 2026
In all, millions of African captives were shipped to the West Indies and Brazil, many times the number taken to British North America.
From The Wall Street Journal • Apr. 3, 2026
No-one is saying that anymore after this knock and his 97 not out against West Indies.
From BBC • Mar. 5, 2026
I said, “Timothy, you can’t really believe in that.” My father had told me about “obediah,” or “voodoo,” in the West Indies.
From "The Cay" by Theodore Taylor
Definitions from Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2023