workers' compensation insurance


noun

insurance that employers are required by law to carry for the protection of employees while they are engaged in the employer's business.

Origin of workers' compensation insurance

First recorded in 1975–80
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2019