healthcare

[ helth-kair ]
/ ˈhɛlθˌkɛər /

noun Also health care.

1. the field concerned with the maintenance or restoration of the health of the body or mind.
2. any of the procedures or methods employed in this field.

adjective Also health-care.

of, relating to, or involved in healthcare: healthcare workers; a healthcare center.

Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2019

Medicine definitions for health care

health care

n.

The prevention, treatment, and management of illness and the preservation of mental and physical well-being through the services offered by the medical and allied health professions.

adj. Also health-care.

Of or relating to health care.
The American Heritage® Stedman's Medical Dictionary Copyright © 2002, 2001, 1995 by Houghton Mifflin Company. Published by Houghton Mifflin Company.