ENTROPY
randomness, entropy
(noun) (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; “entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity”
information, selective information, entropy
(noun) (communication theory) a numerical measure of the uncertainty of an outcome; “the signal contained thousands of bits of information”
Source: WordNet® 3.1
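Note: the communication-theory sense above is usually made precise by Shannon's formula; the sketch below, in LaTeX notation, is added here for illustration and is not part of the WordNet entry. For a source that emits symbol i with probability p_i, the entropy in bits is

    H(X) = -\sum_{i} p_i \log_2 p_i

For a fair coin (p = 1/2 for each face) this gives H = 1 bit per toss; a biased or perfectly predictable source has lower entropy, matching the gloss “a numerical measure of the uncertainty of an outcome”.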
Etymology
From German Entropie, coined by Rudolf Clausius in 1865 from Ancient Greek ἐν (en, “in”) + τροπή (tropḗ, “a turning, transformation”), by analogy with Energie (“energy”).
Noun
entropy (countable and uncountable, plural entropies)
(thermodynamics, countable)
Strictly, thermodynamic entropy: a measure of the amount of energy in a physical system that cannot be used to do work.
A measure of the disorder present in a system.
The capacity factor for thermal energy that is hidden with respect to temperature.
The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
(statistics, information theory, countable) A measure of the amount of information and noise present in a signal.
(uncountable) The tendency of a system that is left to itself to descend into chaos.
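Note: the “disorder” and “descent into chaos” senses above are made quantitative in statistical mechanics by Boltzmann's relation; the sketch below, in LaTeX notation, is added for illustration and is not part of the Wiktionary entry:

    S = k_B \ln \Omega

where Ω is the number of microscopic arrangements (microstates) consistent with the observed macroscopic state and k_B is the Boltzmann constant. The more arrangements available, the higher the entropy, which is why an isolated system left to itself tends toward its most probable, most disordered configurations.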
Synonyms
• anergy
• bound entropy
• disgregation
Antonyms
• aggregation
• exergy
• free entropy
• negentropy
Anagrams
• Poynter, peryton
Source: Wiktionary
En"tro*py, n. Etym: [Gr. (Thermodynamics)
Definition: A certain property of a body, expressed as a measurable
quantity, such that when there is no communication of heat the
quantity remains constant, but when heat enters or leaves the body
the quantity increases or diminishes. If a small amount, h, of heat
enters the body when its temperature is t in the thermodynamic scale,
the entropy of the body is increased by h/t. The entropy is regarded
as measured from some standard temperature and pressure. Sometimes
called the thermodynamic function.
The entropy of the universe tends towards a maximum. Clausius.
Source: Webster’s Unabridged Dictionary 1913 Edition
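Note: the h/t rule in the Webster definition corresponds to the modern Clausius definition of entropy change; the restatement below, in LaTeX notation, is added for clarity and is not part of the 1913 entry:

    dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S = \int_{1}^{2} \frac{\delta Q_{\mathrm{rev}}}{T}

That is, a small quantity of heat δQ entering a body reversibly at absolute (thermodynamic) temperature T increases its entropy by δQ/T, with S measured from an agreed reference state (the “standard temperature and pressure” of the definition above).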