tropical medicine


noun

the branch of medicine dealing with the study and treatment of diseases occurring in the tropics.

Medicine definitions for tropical medicine

tropical medicine

n.

The branch of medicine that deals with diseases occurring in tropical countries.
