allied health

noun

  1. a segment of the healthcare professions comprising specialized occupations that require certification, including physical therapists, dental hygienists, social workers, speech therapists, nutritionists, etc., but not including doctors, nurses, and dentists.


Word History and Origins

Origin of allied health

First recorded in 1965–70
