allied health
noun
- a segment of the healthcare professions composed of specialized occupations that require certification, including physical therapists, dental hygienists, social workers, speech therapists, and nutritionists, but excluding doctors, nurses, and dentists.
Word History and Origins
Origin of allied health
First recorded in 1965–70