West Indies
noun
- Also called the Indies. (used with a plural verb) an archipelago in the northern Atlantic between North and South America, comprising the Greater Antilles, the Lesser Antilles, and the Bahamas.
- Also called West Indies Federation, Federation of the West Indies. (used with a singular verb) a former federation (1958–62) of the British islands in the Caribbean, comprising Barbados, Jamaica, Trinidad, Tobago, and the Windward and Leeward island colonies.
It is a popular resort area.
Several of the islands were discovered by Christopher Columbus in 1492.
Other Word Forms
- West Indian adjective
Example Sentences
Plantation slavery was perfected in the West Indies, notably on the sugar islands of British Barbados and present-day Haiti, where the system proved immensely profitable.
From The Wall Street Journal • Apr. 3, 2026
In all, millions of African captives were shipped to the West Indies and Brazil, many times the number taken to British North America.
From The Wall Street Journal • Apr. 3, 2026
In one memorable passage on the French West Indies, he writes:
From The Wall Street Journal • Apr. 3, 2026
The West Indies team is to return home via commercial flights from India after a "distressing" wait following their exit from the men's T20 World Cup.
From BBC • Mar. 10, 2026
Finally, around 1778, he was allowed to sail to the West Indies.
From "George Washington, Spymaster" by Thomas B. Allen
Definitions and idiom definitions from Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2023