black diet

American

noun

  1. deprivation of all food and water as a punishment, often leading to death.