sexism

[ sek-siz-uhm ]
noun
  1. attitudes or behavior based on traditional stereotypes of gender roles: the underlying sexism in the marketing of dolls to girls and trucks to boys; Her husband saw their home life through a lens of sexism and never once offered to help with the housework or the kids.

  2. discrimination or devaluation based on a person's sex or gender, as in restricted job opportunities, especially such discrimination directed against women: The investigation found that women face a culture of hostility and sexism.

  3. ingrained and institutionalized prejudice against women: The idea that women are inferior to men is sexism at its purest.

  4. hatred, dislike, or mistrust of women; misogyny.

Origin of sexism

First recorded in 1965–70; sex¹ + -ism, on the model of racism

Word story for sexism

See misogyny.

Other words from sexism

  • an·ti·sex·ism, noun

Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2024

British Dictionary definitions for sexism

sexism

/ (ˈsɛksɪzəm) /


noun
  1. discrimination on the basis of sex, esp the oppression of women by men

Origin of sexism

C20: from sex + -ism, on the model of racism

Derived forms of sexism

  • sexist, noun, adjective

Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012

Cultural definitions for sexism (1 of 2)

sexism

The belief that one sex (usually the male) is naturally superior to the other and should dominate most important areas of political, economic, and social life. Sexist discrimination in the United States in the past has denied opportunities to women in many spheres of activity. Many allege that it still does. (See also affirmative action, Equal Employment Opportunity Commission, glass ceiling, and National Organization for Women.)

Cultural definitions for sexism (2 of 2)

sexism

The belief that one sex (usually the male) is naturally superior to the other and should dominate most important areas of political, economic, and social life.

The New Dictionary of Cultural Literacy, Third Edition Copyright © 2005 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.