- (on a macroscopic scale) a function of thermodynamic variables, as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.
- (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S
- (in data transmission and information theory) a measure of the loss of information in a transmitted signal or message.
- (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).
- a doctrine of inevitable social decline and degeneration.
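The statistical-mechanics sense above (a measure of the randomness of a system's microscopic constituents) is usually written as Boltzmann's formula S = k ln W, where W is the number of accessible microstates. A minimal numerical sketch, with assumed toy microstate counts chosen purely for illustration:

```python
import math

# Boltzmann constant, in joules per kelvin (CODATA exact value).
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k * ln(W) for a system with W equally likely microstates."""
    if microstates < 1:
        raise ValueError("number of microstates must be at least 1")
    return K_B * math.log(microstates)

# A system with a single microstate is perfectly ordered: S = 0.
print(boltzmann_entropy(1))  # 0.0
# More accessible microstates means more randomness, hence more entropy.
print(boltzmann_entropy(10) < boltzmann_entropy(1000))  # True
```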

Examples from the Web for entropy
Contemporary Examples of entropy
Until Tuesday, Occupy Wall Street seemed, at least from the outside, to be entering a stage of entropy.
Harsh NYPD Action May Reinvigorate Occupy Wall Street Movement
November 16, 2011
Historical Examples of entropy
It is an entropy of history itself, slowly decaying into chaotic repetition.
After the Rain
Fully illustrated and containing eighteen tables, including an entropy chart.
Aviation Engines
Victor Wilfred Pagé
Anyone who prophesies doom has a hundred per cent chance of ultimately being right, if only because of entropy.
Once a Greech
Evelyn E. Smith
The entropy shift must be just right or we'll find ourselves with Hitler and his gang.
Cube Root of Conquest
Roger Phillips Graham
Entropy, en′trop-i, n. a term in physics signifying 'the available energy.'
- a thermodynamic quantity that changes in a reversible process by an amount equal to the heat absorbed or emitted divided by the thermodynamic temperature. It is measured in joules per kelvin. Symbol: S. See also law of thermodynamics
- a statistical measure of the disorder of a closed system expressed by S = k log P + c where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant
- lack of pattern or organization; disorder
- a measure of the efficiency of a system, such as a code or language, in transmitting information
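The information-theory sense above is usually quantified as Shannon entropy, H = −Σ p·log₂ p, the average number of bits of information carried per symbol of a message. A minimal sketch, with assumed example strings chosen only to illustrate the extremes:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return sum(-(n / total) * math.log2(n / total) for n in counts.values())

# A message of one repeated symbol is perfectly predictable:
# it carries no information per symbol.
print(shannon_entropy("aaaa"))  # 0.0
# Two equally likely symbols give exactly one bit per symbol.
print(shannon_entropy("abab"))  # 1.0
```

A maximally efficient code uses close to H bits per symbol; redundancy (entropy below the maximum possible for the alphabet) is what lets a language or code tolerate transmission errors.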
- For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
- A measure of the disorder or randomness in a closed system.
- A measure of the amount of energy in a physical system not available to do work. As a physical system becomes more disordered, and its energy becomes more evenly distributed, that energy becomes less able to do work. For example, a car rolling along a road has kinetic energy that could do work (by carrying or colliding with something, for example); as friction slows it down and its energy is distributed to its surroundings as heat, it loses this ability. The amount of entropy is often thought of as the amount of disorder in a system. See also heat death.
A measure of the disorder of any system, or of the unavailability of its heat energy for work. One way of stating the second law of thermodynamics — the principle that heat will not flow from a cold to a hot object spontaneously — is to say that the entropy of an isolated system can, at best, remain the same and will increase for most systems. Thus, the overall disorder of an isolated system must increase.
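The thermodynamic definition (entropy change equals heat transferred divided by thermodynamic temperature, ΔS = Q/T for a reversible process) makes the second-law statement above easy to check numerically. A minimal sketch, with illustrative heat and temperature values that are assumptions, not data from the text:

```python
# For a reversible process at constant temperature, the entropy change is
# the heat transferred divided by the absolute temperature (joules per kelvin).

def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Entropy change delta_S = Q / T for a reversible heat transfer."""
    if temperature_kelvin <= 0:
        raise ValueError("thermodynamic temperature must be positive")
    return heat_joules / temperature_kelvin

# Illustrative values: 1000 J of heat flows out of a hot block (500 K)
# and into a cold block (250 K). The cold block gains more entropy than
# the hot block loses, so the total entropy of the isolated pair increases,
# which is why heat never flows spontaneously the other way.
ds_hot = entropy_change(-1000.0, 500.0)   # -2.0 J/K
ds_cold = entropy_change(1000.0, 250.0)   #  4.0 J/K
print(ds_hot + ds_cold)  # 2.0
```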