entropy

entropy /en"treuh pee/, n.
1. Thermodynam.
a. (on a macroscopic scale) a function of thermodynamic variables, as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.
b. (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S
2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal or message.
3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).
4. a doctrine of inevitable social decline and degeneration.
[ < G Entropie (1865); see EN-2, -TROPY]
—entropic /en troh"pik, -trop"ik/, adj. —entropically, adv.

* * *

Measure of a system's energy that is unavailable for work, or of the degree of a system's disorder.

When heat is added to a system held at constant temperature, the change in entropy is related to the change in energy, the pressure, the temperature, and the change in volume. Its magnitude varies from zero to the total amount of energy in a system. The concept was first proposed in 1850 by the German physicist Rudolf Clausius (1822–1888), and its behaviour is expressed by the second law of thermodynamics, which states that entropy increases during irreversible processes such as the spontaneous mixing of hot and cold gases, the uncontrolled expansion of a gas into a vacuum, and the combustion of fuel. In popular, nontechnical use, entropy is regarded as a measure of the chaos or randomness of a system.
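As a rough worked example of the constant-temperature case (the numbers are standard textbook values, not from this article): melting 1 g of ice at 273 K absorbs about 334 J of heat, so the entropy of the water increases by ΔS = Q/T ≈ 334 J / 273 K ≈ 1.2 J/K.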

* * *

      the measure of a system's thermal energy per unit temperature that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, the amount of entropy is also a measure of the molecular disorder, or randomness, of a system. The concept of entropy provides deep insight into the direction of spontaneous change for many everyday phenomena. Its introduction by the German physicist Rudolf Clausius in 1850 is a highlight of 19th-century physics.

      The idea of entropy provides a mathematical way to encode the intuitive notion of which processes are impossible, even though they would not violate the fundamental law of conservation of energy. For example, a block of ice placed on a hot stove surely melts, while the stove grows cooler. Such a process is called irreversible because no slight change will cause the melted water to turn back into ice while the stove grows hotter. In contrast, a block of ice placed in an ice-water bath will either thaw a little more or freeze a little more, depending on whether a small amount of heat is added to or subtracted from the system. Such a process is reversible because only an infinitesimal amount of heat is needed to change its direction from progressive freezing to progressive thawing. Similarly, compressed gas confined in a cylinder could either expand freely into the atmosphere if a valve were opened (an irreversible process), or it could do useful work by pushing a movable piston against the force needed to confine the gas. The latter process is reversible because only a slight increase in the restraining force could reverse the direction of the process from expansion to compression. For reversible processes the system is in equilibrium with its environment, while for irreversible processes it is not.

      To provide a quantitative measure for the direction of spontaneous change, Clausius introduced the concept of entropy as a precise way of expressing the second law of thermodynamics. The Clausius form of the second law states that spontaneous change for an irreversible process in an isolated system (that is, one that does not exchange heat or work with its surroundings) always proceeds in the direction of increasing entropy. For example, the block of ice and the stove constitute two parts of an isolated system for which total entropy increases as the ice melts.

      By the Clausius definition, if an amount of heat Q flows into a large heat reservoir at temperature T above absolute zero, then the entropy increase is ΔS = Q/T. This equation effectively gives an alternate definition of temperature that agrees with the usual definition. Assume that there are two heat reservoirs R1 and R2 at temperatures T1 and T2 (such as the stove and the block of ice). If an amount of heat Q flows from R1 to R2, then the net entropy change for the two reservoirs is

ΔS = Q/T2 − Q/T1 = Q(T1 − T2)/(T1T2),

which is positive provided that T1 > T2. Thus, the observation that heat never flows spontaneously from cold to hot is equivalent to requiring the net entropy change to be positive for a spontaneous flow of heat. If T1 = T2, then the reservoirs are in equilibrium, no heat flows, and ΔS = 0.
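      A minimal numerical sketch of this bookkeeping (the function name and figures are illustrative, not from the article):

    # Net entropy change when heat Q flows from a hot reservoir at T1
    # to a cold reservoir at T2 (Q in joules, temperatures in kelvins).
    def reservoir_entropy_change(Q, T1, T2):
        return Q / T2 - Q / T1  # hot reservoir loses Q/T1, cold gains Q/T2

    # Example: 1000 J flowing from a stove at 400 K to ice water at 273 K.
    dS = reservoir_entropy_change(1000.0, 400.0, 273.0)
    print(dS)  # about +1.16 J/K: positive, so the flow is spontaneous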

      The condition ΔS ≥ 0 determines the maximum possible efficiency of heat engines—that is, systems such as gasoline or steam engines that can do work in a cyclic fashion. Suppose a heat engine absorbs heat Q1 from R1 and exhausts heat Q2 to R2 for each complete cycle. By conservation of energy, the work done per cycle is W = Q1 − Q2, and the net entropy change per cycle is

ΔS = Q2/T2 − Q1/T1.

To make W as large as possible, Q2 should be as small as possible relative to Q1. However, Q2 cannot be zero, because this would make ΔS negative and so violate the second law. The smallest possible value of Q2 corresponds to the condition ΔS = 0, yielding

Q2/Q1 = T2/T1, and hence W/Q1 = 1 − T2/T1,

as the fundamental equation limiting the efficiency of all heat engines. A process for which ΔS = 0 is reversible because an infinitesimal change would be sufficient to make the heat engine run backward as a refrigerator.
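      A short sketch of this limit (the names and numbers are illustrative):

    # Maximum (Carnot) efficiency of a heat engine running between a hot
    # reservoir at T1 and a cold reservoir at T2, from the condition ΔS = 0.
    def carnot_efficiency(T1, T2):
        return 1.0 - T2 / T1

    T1, T2 = 600.0, 300.0   # e.g. steam at 600 K exhausting to air at 300 K
    Q1 = 1000.0             # heat absorbed per cycle, J
    Q2 = Q1 * T2 / T1       # smallest possible exhaust heat, from ΔS = 0
    W = Q1 - Q2             # work per cycle, J
    print(W / Q1)                      # 0.5
    print(carnot_efficiency(T1, T2))   # 0.5, the same limit
    print(Q2 / T2 - Q1 / T1)           # 0.0: net entropy change at the limit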

      The same reasoning can also determine the entropy change for the working substance in the heat engine, such as a gas in a cylinder with a movable piston. If the gas absorbs an incremental amount of heat dQ from a heat reservoir at temperature T and expands reversibly against the maximum possible restraining pressure P, then it does the maximum work dW = P dV, where dV is the change in volume. The internal energy of the gas might also change by an amount dU as it expands. Then by conservation of energy, dQ = dU + P dV. Because the net entropy change for the system plus reservoir is zero when maximum work is done and the entropy of the reservoir decreases by an amount dSreservoir = −dQ/T, this must be counterbalanced by an entropy increase of

dSsystem = dQ/T

for the working gas so that dSsystem + dSreservoir = 0. For any real process, less than the maximum work would be done (because of friction, for example), and so the actual amount of heat dQ′ absorbed from the heat reservoir would be less than the maximum amount dQ. For example, the gas could be allowed to expand freely into a vacuum and do no work at all. Therefore, it can be stated that

dSsystem = dQ/T ≥ dQ′/T,

with dQ′ = dQ in the case of maximum work corresponding to a reversible process.
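      For the special case of a reversible isothermal expansion of an ideal gas, dU = 0 and P = nRT/V, so integrating dSsystem = dQ/T gives ΔS = nR ln(V2/V1). A sketch under that assumption:

    import math

    R = 8.314  # gas constant, J/(mol·K)

    # Entropy change of n moles of ideal gas expanding reversibly and
    # isothermally from V1 to V2: dU = 0, so dQ = P dV = (nRT/V) dV,
    # and integrating dS = dQ/T gives ΔS = nR ln(V2/V1).
    def isothermal_entropy_change(n, V1, V2):
        return n * R * math.log(V2 / V1)

    print(isothermal_entropy_change(1.0, 1.0, 2.0))  # about +5.76 J/K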

      This equation defines Ssystem as a thermodynamic state variable, meaning that its value is completely determined by the current state of the system and not by how the system reached that state. Entropy is an extensive property in that its magnitude depends on the amount of material in the system.
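      The state-variable property can be checked directly for an ideal gas, where ΔS between two states comes out the same for any reversible path connecting them. A sketch using the standard ideal-gas formulas (which are not derived in this article):

    import math

    R = 8.314     # gas constant, J/(mol·K)
    Cv = 1.5 * R  # molar heat capacity of a monatomic ideal gas

    # Path A: isothermal expansion V1 -> V2 at T1, then isochoric heating T1 -> T2.
    def dS_path_A(n, T1, V1, T2, V2):
        return n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

    # Path B: isochoric heating T1 -> T2 at V1, then isothermal expansion V1 -> V2.
    def dS_path_B(n, T1, V1, T2, V2):
        return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

    # Same endpoints give the same ΔS; doubling n doubles it (extensivity).
    print(dS_path_A(1.0, 300.0, 1.0, 400.0, 2.0))  # about 9.35 J/K
    print(dS_path_B(1.0, 300.0, 1.0, 400.0, 2.0))  # about 9.35 J/K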

      In one statistical interpretation of entropy, it is found that for a very large system in thermodynamic equilibrium, entropy S is proportional to the natural logarithm of a quantity Ω representing the maximum number of microscopic ways in which the macroscopic state corresponding to S can be realized; that is, S = k ln Ω, in which k is the Boltzmann constant, the conversion factor that relates temperature to molecular energy.
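      A sketch for a toy system of N two-state particles, where the number of microscopic arrangements with n particles in the "up" state is the binomial coefficient Ω = C(N, n) (the setup is illustrative, not from the article):

    import math

    k = 1.380649e-23  # Boltzmann constant, J/K

    # S = k ln Ω for N two-state particles with n "up"; Ω = C(N, n).
    # lgamma evaluates ln Ω without overflowing for large N.
    def boltzmann_entropy(N, n):
        ln_omega = (math.lgamma(N + 1) - math.lgamma(n + 1)
                    - math.lgamma(N - n + 1))
        return k * ln_omega

    N = 10**6
    print(boltzmann_entropy(N, N // 2))  # maximum, close to k*N*ln 2
    print(k * N * math.log(2))           # for comparison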

      All spontaneous processes are irreversible; hence, it has been said that the entropy of the universe is increasing: that is, more and more energy becomes unavailable for conversion into work. Because of this, the universe is said to be “running down.”

Gordon W.F. Drake
 

* * *


Universalium. 2010.
