the Ugly American
Pejorative term for Americans traveling or living abroad who remain ignorant of local culture and judge everything by American standards. The term is taken from the title of a book by Eugene Burdick and William Lederer.