
informed consent

noun

  1. a patient's consent to a medical or surgical procedure or to participation in a clinical study after being properly advised of the relevant medical facts and the risks involved.


Origin of informed consent

First recorded in 1965–70