Indies
noun (American)
-
(used with a plural verb) the Indies; the West Indies.
-
(used with a plural verb) the East Indies.
-
(used with a singular verb) a region in and near southern and southeastern Asia; India, Indochina, and the East Indies.
noun
-
the territories of South and Southeast Asia included in the East Indies, India, and Indochina
-
See East Indies
-
See West Indies
Example Sentences
Examples are provided to illustrate real-world usage of words in context. Any opinions expressed do not reflect the views of Dictionary.com.
The U.S. formally recognized Danish sovereignty in 1916 as a condition for its purchase of the Danish West Indies.
From The Wall Street Journal • Apr. 11, 2026
In one memorable passage on the French West Indies, he writes:
From The Wall Street Journal • Apr. 3, 2026
Plantation slavery was perfected in the West Indies, notably on the sugar islands of British Barbados and present-day Haiti, where the system proved immensely profitable.
From The Wall Street Journal • Apr. 3, 2026
West Indies dropped two catches in their defeat by India.
From BBC • Mar. 8, 2026
Instead, she walked and read and did needlework and drank cup after cup of steaming tea from the Indies.
From "The Princess Bride" by William Goldman
Definitions and idiom definitions from Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2023
Idioms from The American Heritage® Idioms Dictionary copyright © 2002, 2001, 1995 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company.