Camp Fire

noun

  1. a U.S. organization for girls and boys that emphasizes the building of character and good citizenship through work, health, and love; originally founded for girls (Camp Fire Girls) in 1910, it is now open to both boys and girls (Camp Fire members).




