[ dee-jen-duh-rahyz ]
/ diˈdʒɛn dəˌraɪz /
verb (used with object), de·gen·der·ized, de·gen·der·iz·ing.
to free from any association with or dependence on gender: to degenderize employment policies.
to rid of unnecessary reference to gender or of prejudice toward a specific sex: to degenderize textbooks; to degenderize one's vocabulary.
Also de·gen·der; especially British, de·gen·der·ise.
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2019