tropical medicine


the branch of medicine dealing with the study and treatment of diseases occurring in the tropics.

Based on the Random House Unabridged Dictionary, © Random House, Inc. 2020

Medical definitions for tropical medicine

tropical medicine

The branch of medicine that deals with diseases occurring in tropical countries.
The American Heritage® Stedman's Medical Dictionary Copyright © 2002, 2001, 1995 by Houghton Mifflin Company. Published by Houghton Mifflin Company.