allied health


noun

a segment of healthcare professions composed of specialized occupations that require certification, including physical therapists, dental hygienists, social workers, speech therapists, nutritionists, etc., but not including doctors, nurses, and dentists.

Origin of allied health

First recorded in 1965–70
Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2020