The belief that one sex (usually the male) is naturally superior to the other and should dominate most important areas of political, economic, and social life. In the past, sexist discrimination in the United States denied opportunities to women in many spheres of activity; many allege that it still does. (See also affirmative action, Equal Employment Opportunity Commission, glass ceiling, and National Organization for Women.)