Camp Fire

noun
  1. a U.S. organization for girls and boys that emphasizes the building of character and good citizenship through work, health, and love; originally founded for girls (Camp Fire Girls) in 1910, it is now open to both boys and girls (Camp Fire members).
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2018