genital wart

noun

Pathology.
  1. one of a cluster of warts occurring in the genital and anal areas and spread mainly by sexual contact, sometimes affecting the cervix in women and associated with an increased risk of cervical cancer.