n. | 1. | (Thermodynamics) A certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t in the thermodynamic scale, the entropy of the body is increased by h ÷ t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function. |
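A minimal sketch of the increment defined above (the function name and the 100 J / 300 K figures are illustrative, not part of the entry): when a small amount of heat h enters a body at absolute temperature t, its entropy rises by h ÷ t.

```python
def entropy_change(heat_joules, temperature_kelvin):
    """Entropy increase h / t for a small heat transfer at absolute temperature t."""
    return heat_joules / temperature_kelvin

# 100 J of heat entering a body held near 300 K raises its entropy by about 0.333 J/K;
# the same heat leaving the body would lower it by the same amount.
print(entropy_change(100.0, 300.0))   # ~0.3333 (J/K)
print(entropy_change(-100.0, 300.0))  # ~-0.3333 (J/K)
```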
Noun | 1. | entropy - (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information" Synonyms: selective information, information |
2. | entropy - (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity" Synonyms: randomness, S |
(theory) | entropy - A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy). The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (e.g. bits, or characters) will be much harder, if not impossible, to compress in this way. Shannon's formula gives the entropy H(M) of a message M in bits: H(M) = -log2 p(M), where p(M) is the probability of message M. |