denturism

American
[den-chuh-riz-uhm] / ˈdɛn tʃəˌrɪz əm /

noun

  1. the practice by denturists of making artificial dentures and fitting them to patients.


Etymology

Origin of denturism

denture + -ism