- A branch of logic designed to allow degrees of imprecision in reasoning and knowledge, typified by terms such as 'very', 'quite possibly', and 'unlikely', to be represented in such a way that the information can be processed by computer.
- A form of algebra employing a range of values from true to false that is used in making decisions with imprecise data. The outcome of an operation is assigned a value between 0 and 1 corresponding to its degree of truth. Fuzzy logic is used, for example, in artificial intelligence systems.
The American Heritage® Science Dictionary Copyright © 2011. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
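As a brief illustration of the second definition (a sketch, not part of either dictionary entry): fuzzy truth values lie between 0 and 1, and the classical connectives AND, OR, and NOT are commonly modeled with the min, max, and complement operators (the function names and example values below are invented for illustration).

```python
def fuzzy_and(a: float, b: float) -> float:
    # Fuzzy conjunction: the result is only as true as the weaker operand.
    return min(a, b)

def fuzzy_or(a: float, b: float) -> float:
    # Fuzzy disjunction: the result is as true as the stronger operand.
    return max(a, b)

def fuzzy_not(a: float) -> float:
    # Fuzzy negation: complement of the degree of truth.
    return 1.0 - a

# Hypothetical imprecise data: "the room is warm" holds to degree 0.7,
# "the air is humid" holds to degree 0.9.
warm, humid = 0.7, 0.9
print(fuzzy_and(warm, humid))  # 0.7
print(fuzzy_or(warm, humid))   # 0.9
```

A system can then act on these intermediate degrees of truth (for example, a thermostat easing a fan speed up as `fuzzy_and(warm, humid)` rises) rather than forcing every condition to be strictly true or false.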