- a decimal point whose location is not fixed, used especially in computer operations.
Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2018
- Relating to a method of representing numerical quantities that uses two sets of integers, a mantissa and a characteristic, in which the value of the number is understood to be equal to the mantissa multiplied by a base (often 10) raised to the power of the characteristic. Scientific notation is one means of displaying floating-point numbers.
The American Heritage® Science Dictionary Copyright © 2011. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
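The mantissa-and-characteristic relationship in the definition above (value = mantissa × base^characteristic) can be sketched in Python. Computers use base 2 rather than 10; `math.frexp` from the standard library splits a float into exactly these two parts:

```python
import math

# Decompose 6.25 into a mantissa and a base-2 exponent (the "characteristic"),
# so that value == mantissa * 2 ** exponent.
mantissa, exponent = math.frexp(6.25)

print(mantissa, exponent)          # 0.78125 3
print(mantissa * 2 ** exponent)    # 6.25 — reconstructs the original value
```

`frexp` normalizes the mantissa into the range [0.5, 1.0), which is why 6.25 is expressed as 0.78125 × 2³ rather than, say, 6.25 × 2⁰; the same number in base-10 scientific notation would be written 6.25 × 10⁰.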