the principle of quantum mechanics, formulated by Heisenberg, that the accurate measurement of one of two related observable quantities, such as position and momentum or energy and time, produces uncertainties in the measurement of the other, such that the product of the uncertainties of both quantities is equal to or greater than h/2π, where h equals Planck's constant.
Origin of uncertainty principle
First recorded in 1930–35
Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2019
the principle that the energy and time, or the position and momentum, of a quantum mechanical system cannot both be accurately measured simultaneously. The product of their uncertainties is always greater than, or of the order of, h, where h is the Planck constant. Also known as: Heisenberg uncertainty principle, indeterminacy principle
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
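The bound described above can be checked numerically. The sketch below (not part of any dictionary entry) builds a Gaussian wave packet, computes its position spread directly and its momentum spread via a Fourier transform, and confirms that the product equals ħ/2 — the minimum allowed, which a Gaussian is known to attain. Natural units with ħ = 1 are assumed, and the grid size and packet width are arbitrary choices.

```python
import numpy as np

hbar = 1.0    # natural units: ħ = 1 (assumption for this sketch)
sigma = 1.3   # width of the Gaussian wave packet (arbitrary choice)

# Position grid and a normalized Gaussian wave packet psi(x)
x = np.linspace(-40, 40, 4096)
dx = x[1] - x[0]
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Position uncertainty: Δx = sqrt(<x²> − <x>²)
prob_x = np.abs(psi)**2
mean_x = np.sum(x * prob_x) * dx
dx_unc = np.sqrt(np.sum((x - mean_x)**2 * prob_x) * dx)

# Momentum-space wave function via FFT; momentum p = ħk
k = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi
phi = np.fft.fftshift(np.fft.fft(psi))
dk = k[1] - k[0]
prob_k = np.abs(phi)**2
prob_k /= np.sum(prob_k) * dk
mean_k = np.sum(k * prob_k) * dk
dp_unc = hbar * np.sqrt(np.sum((k - mean_k)**2 * prob_k) * dk)

# A Gaussian saturates the bound: Δx·Δp = ħ/2
product = dx_unc * dp_unc
print(product, hbar / 2)
```

Narrowing the packet (smaller `sigma`) shrinks `dx_unc` but inflates `dp_unc` by the same factor, so the product never drops below ħ/2 — the trade-off the definitions describe.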
A principle, especially as formulated in quantum mechanics, that greater accuracy of measurement for one observable entails less accuracy of measurement for another. For example, it is in principle impossible to measure both the momentum and the position of a particle at the same time with perfect accuracy. Any pair of observables whose operators do not commute has this property. As defined in quantum mechanics, it is also called Heisenberg's uncertainty principle. Similar uncertainty principles hold for non-quantum-mechanical systems involving waves as well.
The American Heritage® Science Dictionary Copyright © 2011. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
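The non-commuting-operators criterion mentioned above can be illustrated in the smallest possible setting. This sketch (an illustration, not from the dictionary entry) uses the 2×2 Pauli matrices for spin measurements along x and y, which do not commute, and checks the Robertson inequality ΔA·ΔB ≥ |⟨[A, B]⟩|/2 for a spin-up state. The choice of state is arbitrary; any normalized state satisfies the bound.

```python
import numpy as np

# Pauli matrices: spin-1/2 observables along x and y (up to a factor of ħ/2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

# A normalized spin state (here: spin-up along z; arbitrary choice)
state = np.array([1, 0], dtype=complex)

def uncertainty(op, psi):
    """Standard deviation of observable `op` in state `psi`."""
    mean = np.real(psi.conj() @ op @ psi)
    mean_sq = np.real(psi.conj() @ (op @ op) @ psi)
    return np.sqrt(mean_sq - mean**2)

# Robertson relation: ΔA·ΔB >= |<[A, B]>| / 2 for non-commuting A, B
commutator = sx @ sy - sy @ sx
bound = abs(state.conj() @ commutator @ state) / 2
product = uncertainty(sx, state) * uncertainty(sy, state)
print(product >= bound - 1e-12)
```

Since the commutator here is nonzero ([sx, sy] = 2i·sz), the bound is strictly positive for this state: the two spin components cannot both be measured with perfect accuracy, exactly as the definition states for position and momentum.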