economism
noun
1. a political theory that regards economics as the main factor in society, ignoring or reducing to simplistic economic terms other factors such as culture, nationality, etc.
2. the belief that the main aim of a political group, trade union, etc., is to improve the material living standards of its members
3. (often capital) (in Tsarist Russia) a political belief that the sole concern of the working classes should be with improving their living conditions and not with political reforms
Definitions and idiom definitions from Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2023