
allied health

noun
  1. a segment of the healthcare professions comprising specialized occupations that require certification, including physical therapists, dental hygienists, social workers, speech therapists, nutritionists, etc., but not including doctors, nurses, and dentists.

Origin of allied health

First recorded in 1965–70
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2018