donation land

noun (in the U.S.)
land given free or sold on liberal terms by a state or the federal government, especially to encourage settlement in undeveloped areas.
Origin of donation land
An Americanism dating back to 1775–85
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2019