integralism
[ in-ti-gruh-liz-uhm ]
noun
- the belief that one's religious convictions should dictate one's political and social actions.