Entropy (statistical thermodynamics)

For other uses, see statistical thermodynamics.

In classical statistical mechanics, the entropy function earlier introduced by Clausius is reinterpreted as a statistical entropy using probability theory. The statistical entropy perspective was introduced in the 1870s with the work of the Austrian physicist Ludwig Boltzmann.

Gibbs Entropy Formula

The macroscopic state of the system is defined by a distribution over the microstates that are accessible to the system in the course of its thermal fluctuations, so the entropy is defined over two different levels of description of the given system. The entropy is given by the Gibbs entropy formula, named after J. Willard Gibbs. For a classical system (i.e., a collection of classical particles) with a discrete set of microstates, if E_i is the energy of microstate i and p_i is the probability that the system occupies it during its fluctuations, then the entropy of the system is

S = -k_B \sum_i p_i \ln p_i
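
As a minimal illustration, the following Python sketch evaluates this sum for an assumed, arbitrary three-state distribution; the probabilities are purely illustrative.

    # Illustrative sketch: Gibbs entropy S = -k_B * sum_i p_i ln p_i
    # for an assumed discrete probability distribution.
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probs):
        """Return -k_B * sum(p * ln p), skipping states with p = 0."""
        return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

    p = [0.5, 0.3, 0.2]              # hypothetical three-state system
    print(gibbs_entropy(p))          # entropy in J/K
    print(gibbs_entropy([1.0, 0.0])) # a single certain state gives S = 0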

Entropy changes for systems in a canonical state

A system with a well-defined temperature, i.e., one in thermal equilibrium with a thermal reservoir, has a probability of being in a microstate i given by Boltzmann's distribution.

Changes in the entropy caused by changes in the external constraints are then given by:

dS = -k_B \sum_i dp_i \ln p_i
   = -k_B \sum_i dp_i \left( -E_i/(k_B T) - \ln Z \right)
   = \sum_i E_i \, dp_i / T
   = \sum_i \left[ d(E_i p_i) - (dE_i) p_i \right] / T

where we have twice used the conservation of probability, ∑_i dp_i = 0.

Now, ∑_i d(E_i p_i) is the expectation value of the change in the total energy of the system.

If the changes are sufficiently slow, so that the system remains in the same microscopic state, but the state slowly (and reversibly) changes, then ∑_i (dE_i) p_i is the expectation value of the work done on the system through this reversible process, δw_rev.

But from the first law of thermodynamics, dE = δw + δq. Therefore,

dS = \frac{\delta \langle q_{rev} \rangle}{T}

In the thermodynamic limit, the fluctuations of the macroscopic quantities about their average values become negligible, so this reproduces the definition of entropy from classical thermodynamics.
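
This relation can be checked numerically. The sketch below assumes a two-level system with fixed energy levels (so a small temperature step transfers heat but does no work) and compares the change in the Gibbs entropy with δq/T; the level spacing and temperatures are illustrative values.

    # Illustrative check of dS = delta_q / T: with fixed energy levels there is
    # no work, so the heat input equals the change in the mean energy <E>.
    import math

    K_B = 1.380649e-23  # J/K

    def boltzmann_probs(energies, T):
        weights = [math.exp(-E / (K_B * T)) for E in energies]
        Z = sum(weights)
        return [w / Z for w in weights]

    def gibbs_entropy(probs):
        return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

    def mean_energy(energies, probs):
        return sum(E * p for E, p in zip(energies, probs))

    levels = [0.0, 1.0e-21]   # assumed two-level spacing, J
    T, dT = 300.0, 1e-4       # temperature and a small step, K

    p1 = boltzmann_probs(levels, T)
    p2 = boltzmann_probs(levels, T + dT)

    dS = gibbs_entropy(p2) - gibbs_entropy(p1)
    dq = mean_energy(levels, p2) - mean_energy(levels, p1)  # no work: dq = d<E>

    print(dS, dq / T)  # the two numbers agree to high accuracy for small dT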

The quantity k_B is a physical constant known as Boltzmann's constant, which, like the entropy, has units of heat capacity. The logarithm is dimensionless.

This definition remains valid even when the system is far away from equilibrium. Other definitions assume that the system is in thermal equilibrium, either as an isolated system, or as a system in exchange with its surroundings. The set of microstates on which the sum is to be done is called a statistical ensemble. Each statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system's exchanges with the outside, from an isolated system to a system that can exchange one or more quantities with a reservoir, such as energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).

Neglecting correlations between the different possible states (or, more generally, neglecting statistical dependencies between states) will lead to an overestimate of the entropy. These correlations occur in systems of interacting particles, that is, in all systems more complex than an ideal gas.

This S is almost universally called simply the entropy. It can also be called the statistical entropy or the thermodynamic entropy without changing the meaning. Note that the above expression of the statistical entropy is a discretized version of the Shannon entropy. The von Neumann entropy formula is an extension of the Gibbs entropy formula to the quantum mechanical case.

Boltzmann's principle

In Boltzmann's definition, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties (or macrostate). To understand what microstates and macrostates are, consider the example of a gas in a container. At a microscopic level, the gas consists of a vast number of freely moving atoms, which occasionally collide with one another and with the walls of the container. The microstate of the system is a description of the positions and momenta of all the atoms. In principle, all the physical properties of the system are determined by its microstate. However, because the number of atoms is so large, the motion of individual atoms is mostly irrelevant to the behavior of the system as a whole. Provided the system is in thermodynamic equilibrium, the system can be adequately described by a handful of macroscopic quantities, called "thermodynamic variables": the total energy E, volume V, pressure P, temperature T, and so forth. The macrostate of the system is a description of its thermodynamic variables.

There are three important points to note. Firstly, to specify any one microstate, we need to write down an impractically long list of numbers, whereas specifying a macrostate requires only a few numbers (E, V, etc.). However, and this is the second point, the usual thermodynamic equations only describe the macrostate of a system adequately when this system is in equilibrium; non-equilibrium situations can generally not be described by a small number of variables. For example, if a gas is sloshing around in its container, even a macroscopic description would have to include, e.g., the velocity of the fluid at each different point. Actually, the macroscopic state of the system will be described by a small number of variables only if the system is at global thermodynamic equilibrium. Thirdly, more than one microstate can correspond to a single macrostate. In fact, for any given macrostate, there will be a huge number of microstates that are consistent with the given values of E, V, etc.

We are now ready to provide a definition of entropy. The entropy S is defined as

S = k_B \ln \Omega

where

k_B is Boltzmann's constant and
Ω is the number of microstates consistent with the given macrostate.

The statistical entropy reduces to Boltzmann's entropy when all the accessible microstates of the system are equally likely. It is also the configuration corresponding to the maximum of a system's entropy for a given set of accessible microstates, in other words the macroscopic configuration in which the lack of information is maximal. As such, according to the second law of thermodynamics, it is the equilibrium configuration of an isolated system. Boltzmann's entropy is the expression of entropy at thermodynamic equilibrium in the micro-canonical ensemble.
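
As a quick check of this reduction, the sketch below compares the Gibbs sum for a uniform distribution over an assumed number Ω of microstates with k_B ln Ω.

    # Illustration: with Omega equally likely microstates, p_i = 1/Omega,
    # the Gibbs formula gives exactly k_B * ln(Omega).
    import math

    K_B = 1.380649e-23  # J/K

    omega = 1000                        # assumed number of accessible microstates
    p_uniform = [1.0 / omega] * omega

    S_gibbs = -K_B * sum(p * math.log(p) for p in p_uniform)
    S_boltzmann = K_B * math.log(omega)

    print(S_gibbs, S_boltzmann)         # identical up to floating-point rounding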

This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of their constituents. It turns out that S is itself a thermodynamic property, just like E or V. Therefore, it acts as a link between the microscopic world and the macroscopic. One important property of S follows readily from the definition: since Ω is a natural number (1, 2, 3, ...), S is either zero or positive (ln 1 = 0, ln Ω ≥ 0).

Ensembles

The various ensembles used in statistical thermodynamics are linked to the entropy by the following relations:

S = k_B \ln \Omega_{mic} = k_B (\ln Z_{can} + \beta \bar{E}) = k_B (\ln \mathcal{Z}_{gr} + \beta (\bar{E} - \mu \bar{N}))

where
Ω_mic is the microcanonical partition function,
Z_can is the canonical partition function, and
\mathcal{Z}_{gr} is the grand canonical partition function.
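
As an illustration of the canonical relation, the sketch below assumes a hypothetical two-level system and checks S = k_B(ln Z_can + βĒ) against the Gibbs sum.

    # Illustrative check of S = k_B * (ln Z + beta * <E>) for assumed energy levels.
    import math

    K_B = 1.380649e-23  # J/K

    levels = [0.0, 2.0e-21]   # hypothetical energy levels, J
    T = 300.0
    beta = 1.0 / (K_B * T)

    Z = sum(math.exp(-beta * E) for E in levels)      # canonical partition function
    probs = [math.exp(-beta * E) / Z for E in levels]
    E_mean = sum(E * p for E, p in zip(levels, probs))

    S_from_Z = K_B * (math.log(Z) + beta * E_mean)
    S_gibbs = -K_B * sum(p * math.log(p) for p in probs)

    print(S_from_Z, S_gibbs)  # the two expressions agree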

Lack of knowledge and the second law of thermodynamics

We can view Ω as a measure of our lack of knowledge about a system. As an illustration of this idea, consider a set of 100 coins, each of which is either heads up or tails up. The macrostates are specified by the total number of heads and tails, whereas the microstates are specified by the facings of each individual coin. For the macrostates of 100 heads or 100 tails, there is exactly one possible configuration, so our knowledge of the system is complete. At the opposite extreme, the macrostate which gives us the least knowledge about the system consists of 50 heads and 50 tails in any order, for which there are 100,891,344,545,564,193,334,812,497,256 (100 choose 50) ≈ 10^29 possible microstates.
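
This counting can be reproduced directly; the sketch below (Python 3.8+ for math.comb) computes the multiplicity and the corresponding entropy in units of k_B.

    # Coin-counting arithmetic from the example above.
    import math

    omega_ordered = 1                 # 100 heads: a single microstate
    omega_mixed = math.comb(100, 50)  # 50 heads and 50 tails in any order

    print(omega_mixed)               # 100891344545564193334812497256, about 1e29
    print(math.log(omega_mixed))     # entropy of the mixed macrostate in units of k_B
    print(math.log(omega_ordered))   # 0: complete knowledge, zero entropy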

Even when a system is entirely isolated from external influences, its microstate is constantly changing. For instance, the particles in a gas are constantly moving, and thus occupy a different position at each moment of time; their momenta are also constantly changing as they collide with each other or with the container walls. Suppose we prepare the system in an artificially highly-ordered equilibrium state. For instance, imagine dividing a container with a partition and placing a gas on one side of the partition, with a vacuum on the other side. If we remove the partition and watch the subsequent behavior of the gas, we will find that its microstate evolves according to some chaotic and unpredictable pattern, and that on average these microstates will correspond to a more disordered macrostate than before. It is possible, but extremely unlikely, for the gas molecules to bounce off one another in such a way that they remain in one half of the container. It is overwhelmingly probable for the gas to spread out to fill the container evenly, which is the new equilibrium macrostate of the system.

This is an example illustrating the Second Law of Thermodynamics:

the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value.

Since its discovery, this idea has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the Second Law applies only to isolated systems. For example, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. In contrast, the universe may be considered an isolated system, so that its total disorder is constantly increasing.

Counting of microstates

In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set. This procedure is known as coarse graining. In the case of the ideal gas, we count two states of an atom as the "same" state if their positions and momenta are within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is not uniquely defined. It is defined only up to an additive constant. (As we will see, the thermodynamic definition of entropy is also defined only up to a constant.)
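
The additive-constant ambiguity can be made concrete with a toy calculation, assuming a single coordinate distributed uniformly on an interval of length L: coarse graining into cells of width δ gives S/k_B = ln(L/δ), so changing δ only shifts the entropy by a constant.

    # Toy illustration of the coarse-graining constant: a uniform coordinate on
    # [0, L] binned into cells of width delta has S/k_B = ln(L/delta).
    import math

    L = 1.0
    for delta in (1e-2, 1e-3, 1e-4):
        n_cells = L / delta
        S_over_kB = math.log(n_cells)   # uniform p = 1/n_cells over the cells
        # the last column is independent of delta (equal to ln L)
        print(delta, S_over_kB, S_over_kB + math.log(delta))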

This ambiguity can be resolved with quantum mechanics. The quantum state of a system can be expressed as a superposition of "basis" states, which can be chosen to be energy eigenstates (i.e. eigenstates of the quantum Hamiltonian). Usually, the quantum states are discrete, even though there may be an infinite number of them. For a system with some specified energy E, one takes Ω to be the number of energy eigenstates within a macroscopically small energy range between E and E + δE. In the thermodynamic limit, the specific entropy becomes independent of the choice of δE.

An important result, known as Nernst's theorem or the third law of thermodynamics, states that the entropy of a system at zero absolute temperature is a well-defined constant. This is because a system at zero temperature exists in its lowest-energy state, or ground state, so that its entropy is determined by the degeneracy of the ground state. Many systems, such as crystal lattices, have a unique ground state, and (since ln(1) = 0) this means that they have zero entropy at absolute zero. Other systems have more than one state with the same, lowest energy, and have a non-vanishing "zero-point entropy". For instance, ordinary ice has a zero-point entropy of 3.41 J/(mol·K), because its underlying crystal structure possesses multiple configurations with the same energy (a phenomenon known as geometrical frustration).
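
For comparison, Pauling's classic counting argument for ice gives a residual molar entropy of R ln(3/2); the short calculation below shows this is close to the quoted 3.41 J/(mol·K).

    # Quick arithmetic check of Pauling's estimate for the zero-point entropy of ice.
    import math

    R = 8.314462618               # molar gas constant, J/(mol K)
    print(R * math.log(3 / 2))    # about 3.37 J/(mol K), close to the measured 3.41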

The third law of thermodynamics states that the entropy of a perfect crystal at absolute zero (0 kelvin) is zero. This means that in a perfect crystal, at 0 kelvin, nearly all molecular motion should cease in order to achieve ΔS = 0. A perfect crystal is one in which the internal lattice structure is the same at all times; in other words, it is fixed and non-moving, and does not have rotational or vibrational energy. This means that there is only one way in which this order can be attained: when every particle of the structure is in its proper place.

However, the equation for predicting quantized vibrational levels shows that even when the vibrational quantum number is 0, the molecule still has vibrational energy. This means that no matter how cold the temperature gets, the molecule will always have vibration. This is in keeping with the Heisenberg uncertainty principle, which states that the position and the momentum of a particle cannot both be known precisely at a given time:

E_v = h \nu_0 \left( v + \tfrac{1}{2} \right),

where h is Planck's constant, ν_0 is the characteristic frequency of the vibration, and v is the vibrational quantum number. Note that even when v = 0 (the zero-point energy), E_v does not equal 0.
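
Plugging in numbers makes the nonzero v = 0 energy explicit; the sketch below assumes a hypothetical vibrational frequency ν_0 and is illustrative only.

    # Illustrative evaluation of E_v = h * nu_0 * (v + 1/2): the v = 0 level
    # still carries energy (the zero-point energy).
    H = 6.62607015e-34           # Planck's constant, J s

    def vibrational_energy(v, nu_0):
        return H * nu_0 * (v + 0.5)

    nu_0 = 6.5e13                # hypothetical vibrational frequency, Hz
    for v in range(3):
        print(v, vibrational_energy(v, nu_0))  # v = 0 gives a nonzero energy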
