womanist
[woom-uh-nist]
adjective
  1. believing in and respecting the abilities and talents of women; acknowledging women's contributions to society.
  2. pertaining to a type of feminism that acknowledges the abilities and contributions of black women.
noun
  1. a person who holds or supports womanist views.
Related forms
wom·an·ism, noun
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2018