- a branch of logic designed to allow degrees of imprecision in reasoning and knowledge, typified by terms such as 'very', 'quite possibly', and 'unlikely', to be represented in such a way that the information can be processed by computer
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
- A form of algebra employing a range of values from true to false that is used in making decisions with imprecise data. The outcome of an operation is assigned a value between 0 and 1 corresponding to its degree of truth. Fuzzy logic is used, for example, in artificial intelligence systems.
The American Heritage® Science Dictionary Copyright © 2011. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
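The idea in the second definition, that an operation's outcome is a degree of truth between 0 and 1 rather than strictly true or false, can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical `warm` membership function with made-up temperature thresholds and the standard min/max (Zadeh) operators; neither dictionary entry specifies these.

```python
def warm(temp_c: float) -> float:
    """Degree of truth (0..1) that a temperature is 'warm'.

    Thresholds (15-30 degrees C) are assumptions for illustration only.
    """
    if temp_c <= 15:
        return 0.0
    if temp_c >= 30:
        return 1.0
    return (temp_c - 15) / 15  # linear ramp between the two thresholds


def fuzzy_and(a: float, b: float) -> float:
    return min(a, b)  # classical Zadeh conjunction


def fuzzy_or(a: float, b: float) -> float:
    return max(a, b)  # classical Zadeh disjunction


def fuzzy_not(a: float) -> float:
    return 1.0 - a


t = warm(24)  # 0.6: partially true, unlike a Boolean true/false
print(t)
# In fuzzy logic, 'warm AND not warm' need not be 0:
print(fuzzy_and(t, fuzzy_not(t)))  # 0.4
```

Note how the result of each operation stays in the range 0 to 1, which is exactly the "degree of truth" the definition describes; decision systems then act on these graded values instead of hard true/false outcomes.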