the capacity of a computer to perform operations analogous to learning and decision making in humans, as by an expert system, a program for CAD or CAM, or a program for the perception and recognition of shapes in computer vision systems. Abbreviation: AI, A.I.
Origin of artificial intelligence
First recorded in 1965–70
Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2019
the study of the modelling of human mental functions by computer programs. Abbreviation: AI
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
The ability of a computer or other machine to perform actions thought to require intelligence. Among these actions are logical deduction and inference, creativity, the ability to make decisions based on past experience or insufficient or conflicting information, and the ability to understand spoken language.
A Closer Look: The goal of research on artificial intelligence is to understand the nature of thought and intelligent behavior and to design intelligent systems. A computer is not really intelligent; it just follows directions very quickly. At the same time, it is the speed and memory of modern computers that allow researchers to manage the huge quantities of data necessary to model human thought and behavior. An intelligent machine would be more flexible than a computer and would engage in the kind of thinking that people actually do. An example is vision. In theory, a network of sensors combined with systems for interpreting the data could produce the kind of pattern recognition that we take for granted as seeing and understanding what we see. In fact, developing software that can recognize subtle differences in objects (such as those we use to recognize human faces) is very difficult. For an artificial intelligence system, recognizing differences that we perceive without deliberate effort would require massive amounts of data and elaborate guidelines. According to the famous Turing Test, proposed in 1950 by British mathematician and logician Alan Turing, a machine would be considered intelligent if it could convince human observers that another human, rather than a machine, was answering their questions in conversation.
The American Heritage® Science Dictionary Copyright © 2011. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
The means of duplicating or imitating intelligence in computers, robots, or other devices, which allows them to solve problems, discriminate among objects, and respond to voice commands.
The New Dictionary of Cultural Literacy, Third Edition Copyright © 2005 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.