Note for revised edition: The following article was originally published in the Journal of Chemical Education in 1999. Corrections and updates are indicated by superscripts in parentheses following the pertinent word or phrase. Although few in number, they clarify important details that you should read along with the text. The superscripts are linked to their explanations in the "Corrections and Updates" section after the end of the article.


Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms — Examples of Entropy Increase? Nonsense!

The order of presentation in this article is unusual; its conclusion comes first. This is done because the title involves text and lecture examples so familiar to all teachers that most may find a preliminary discussion redundant.

Conclusion

The dealer shuffling cards in Monte Carlo or Las Vegas, the professor who mixes the papers and books on a desk, the student who tosses clothing about his or her room, the fuel for the huge cranes and trucks that would be necessary to move the non-bonded stones of the Great Pyramid of Cheops all across Egypt — each undergoes a physical, thermodynamic entropy increase in these specific processes. The thermodynamic entropy change from human-defined order to disorder in the giant Egyptian stones themselves, in the clothing and books in a room or papers on a desk, and in the millions of cards in the world's casinos is precisely the same: Zero.

K. G. Denbigh succinctly summarizes the case against identifying changes in position in one macro object or in a group with physical entropy change [1]:

If one wishes to substantiate a claim or a guess that some particular process involves a change of thermodynamic or statistical entropy, one should ask oneself whether there exists a reversible heat effect, or a change in the number of accessible energy eigenstates, pertaining to the process in question. If not, there has been no change of physical entropy (even though there may have been some change in our "information").

Thus, simply changing the location of everyday macro objects from an arrangement that we commonly judge as orderly (relatively singular) to one that appears disorderly (relatively probable) is a "zero change" in the thermodynamic entropy of the objects because the number of accessible energetic microstates in any of them has not been changed. Finally, although it may appear obvious, a collection of ordinary macro things does not constitute a thermodynamic system as does a group of microparticles. The crucial difference is that such things are not ceaselessly colliding and exchanging energy under the thermal dominance of their environment as are microparticles.

A postulate can be derived from this fundamental criterion:

The movement of macro objects from one location to another by an external agent involves no change in the objects' physical (thermodynamic) entropy. The agent of movement undergoes a thermodynamic entropy increase in the process.

A needed corollary, considering the number of erroneous statements in print, is:

There is no spontaneous tendency in groups of macro objects to become disorderly or randomly scattered. The tendency in nature toward increased entropy does not reside in the arrangement of any chemically unchanging objects but rather in the external agent moving them; that agent is the sole cause of their transport toward more probable locations.

 

The Error

There is no more widespread error in chemistry and physics texts than the identification of a thermodynamic entropy increase with a change in the pattern of a group of macro objects. The classic example is that of playing cards. Shuffling a new deck is widely said to result in an increase in entropy in the cards.

This erroneous impression is often extended to all kinds of things when they are changed from humanly designated order to what is commonly considered disorder: a group of marbles to scattered marbles, racked billiard balls to a broken rack, neat groups of papers on a desk to the more usual disarray. In fact, there is no thermodynamic entropy change in the objects in the "after" state compared to the "before". Further, such alterations in arrangement have been used in at least one text to support a "law" that is stated, "things move spontaneously in the direction of maximum chaos or disorder"1.

The foregoing examples and "law" seriously mislead the student by focusing on macro objects that are only a passive part of a system. They are deceptive in omitting the agent that actually is changed in entropy as it follows the second law — that is, whatever energy source is involved in the process of moving the static macro objects to more probable random locations. Entropy is increased in the shuffler's and in the billiard cue holder's muscles, in the tornado's wind and the earthquake's stress — not in the objects shifted. Chemically unchanged macro things do not spontaneously, by some innate tendency, leap or even slowly lurch toward visible disorder. Energy concentrated in the ATP of a person's muscles or in wind or in earth-stress is ultimately responsible for moving objects and is partly degraded to diffuse thermal energy as a result.

 

Discussion

To discover the origin of this text and lecture error, a brief review of some aspects of physical entropy is useful. Of course, the original definition of Clausius, dS = δq(rev)/T, applies to a system plus(1) its surroundings, and the Gibbsian relation dG = dH - TdS pertains to a system at constant pressure(2) and constant temperature. Only in the present discussion (where an unfortunate term, information "entropy", must be dealt with) would it be necessary to emphasize that temperature is integral to any physical thermodynamic entropy change described via Clausius or Gibbs. In our era we are surer even than they could be that temperature is indispensable in understanding thermodynamic entropy because it indicates the thermal environment of microparticles in a system. That environment sustains the intermolecular motions whereby molecules continuously interchange energy and are able to access the wide range of energetic(3) microstates available to them. It is this ever-present thermal motion that makes spontaneous change possible, even at constant temperature and in the absence of chemical reaction, because it is the mechanism whereby molecules can occupy new energetic microstates if the boundaries of a system are altered. Prime examples of such spontaneous change are diffusion in fluids and the expansion of gases into vacua, both fundamentally due to the additional translational energetic microstates in the enlarged systems. (Of course, spontaneous endothermic processes ranging from phase changes to chemical reactions are also due to mobile energy-transferring microparticles that can access new rotational and vibrational as well as translational energetic microstates — in the thermal surroundings as well as in the chemical system.)
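
To make the gas example concrete, here is a minimal numerical sketch (mine, not part of the original article) of one mole of an ideal gas expanding into a vacuum to double its volume. Because entropy is a state function, dS = δq(rev)/T can be evaluated along a reversible isothermal path between the same initial and final states:

    import math

    R = 8.314        # gas constant, J/(mol K)
    n = 1.0          # moles of ideal gas
    T = 298.15       # K; any constant temperature gives the same Delta S
    V_ratio = 2.0    # V2/V1: the volume doubles on expansion into the vacuum

    q_rev = n * R * T * math.log(V_ratio)  # heat along a reversible isothermal path, J
    delta_S = q_rev / T                    # Clausius: dS = dq(rev)/T

    print(f"q(rev)  = {q_rev:.1f} J")
    print(f"Delta S = {delta_S:.2f} J/K")  # ~5.76 J/K per mole, i.e. R ln 2

The result, ΔS = R ln 2 ≈ 5.76 J/K per mole, is independent of the temperature chosen, since q(rev) scales with T.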

Misinterpretation of the Boltzmann equation for entropy change, dS = R/N ln(number of energetic microstates after change/number of energetic microstates before change), is the source of much of the confusion regarding the behavior of macro objects. R, the gas constant, embeds temperature in Boltzmann's entropy as integrally as in the Clausius or Gibbs relation and, to repeat, the environment's temperature indicates the degree of energy dispersion that makes access to available energy microstates possible. The Boltzmann equation is revelatory in uniting the macrothermodynamics of classic Clausius entropy with what has been described above as the behavior of a system of microparticles occupying energetic microstates.
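
A sketch of the Boltzmann route to the same number as the Clausius calculation above (my illustration, using the standard position-counting argument for an ideal gas): when the volume available to N molecules doubles, the ratio of accessible microstates is 2^N, so dS = (R/N) ln(2^N) = R ln 2 per mole. Since 2^N is astronomically large, the code works with ln(W_after/W_before) = N ln 2 directly:

    import math

    R = 8.314          # J/(mol K)
    N_A = 6.022e23     # Avogadro's number, molecules per mole
    k_B = R / N_A      # Boltzmann constant, J/K

    # ln(W_after / W_before) = N_A * ln 2 for one mole when the volume doubles
    ln_W_ratio = N_A * math.log(2.0)
    delta_S = k_B * ln_W_ratio     # dS = (R/N) ln(W_after/W_before)

    print(f"Delta S = {delta_S:.2f} J/K")  # again ~5.76 J/K = R ln 2

That the two routes agree is the unification described above: the Clausius and Boltzmann calculations are two views of the same dispersal of energy.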

In discussing how probability enters the Boltzmann equation (i.e., the number of possible energetic microstates and their occupancy by microparticles), texts and teachers often enumerate the many ways a few symbolic molecules can be distributed on lines representing energy levels, or in similar cells or boxes, or with combinations of playing cards. Of course these are good analogs for depicting an energetic microsystem. However, even with warnings from the instructor, the use of playing cards as a model is probably intellectually hazardous; these objects are so familiar that the student can too easily warp this macro analog of a microsystem into an example of actual entropic change in the cards.
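
For an analog that avoids playing cards altogether, a small counting model of the kind these texts use (my sketch, not from the article) is the textbook count of the ways W to place q identical energy quanta on N distinguishable oscillators, W = C(q + N - 1, q):

    from math import comb, log

    def microstates(N, q):
        """Ways to place q identical quanta on N distinguishable oscillators."""
        return comb(q + N - 1, q)

    for q in (1, 2, 5, 10):
        W = microstates(3, q)
        print(f"N=3, q={q:2d}: W = {W:3d}, ln W = {log(W):.2f}")

Even for three oscillators, W grows rapidly as energy is added — the point the classroom analogs are meant to convey about microstates and k ln W.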

Another major source of confusion about entropy change as the result of simply rearranging macro objects comes from information theory "entropy".2 Claude E. Shannon's 1948 paper began the era of quantification of information, and in it he adopted the word "entropy" to name the quantity that his equation defined [2]. This occurred because a friend, the brilliant mathematician John von Neumann, told him to "call it entropy; no one knows what entropy really is, so in a debate you will always have the advantage" [3]. Wryly funny for that moment, Shannon's unwise acquiescence has produced enormous scientific confusion, owing to the increasingly widespread usefulness of his equation and its fertile mathematical variations in many fields other than communications [4, 5]. Certainly most non-experts hearing of the widely touted information "entropy" would assume its overlap with thermodynamic entropy. However, the great success of information "entropy" has been in areas totally divorced from experimental chemistry, whose objective macro results are dependent on the behavior of energetic microparticles. Nevertheless, many instructors in chemistry have the impression that information "entropy" is not only relevant to the calculations and conclusions of thermodynamic entropy but may change them. This is not true.

There is no invariant function corresponding to energy embedded in each of the hundreds of equations of information "entropy" and thus no analog of temperature universally present in them. In contrast, inherent in all thermodynamic entropy, temperature is the objective indicator of a system's energetic state. Probability distributions in information "entropy" represent human selections; therefore information "entropy" is strongly subjective. Probability distributions in thermodynamic entropy are dependent on the microparticulate and physicochemical nature of the system; limited thereby, thermodynamic entropy is strongly objective.
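
A brief sketch (mine, for illustration) makes this contrast tangible. Shannon's H = -Σ p_i log2(p_i) for a discrete distribution yields a dimensionless number of bits, with no temperature or energy anywhere in it, and it depends entirely on whatever probability distribution a person chooses to assign:

    import math

    def shannon_H(probs):
        """Information 'entropy' in bits of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_H([0.5, 0.5]))       # 1.0 bit: a fair coin flip
    print(shannon_H([1/52] * 52))      # ~5.70 bits: one card drawn from a full deck
    print(shannon_H([0.9, 0.1]))       # ~0.47 bit: a biased coin

Nothing in this calculation changes if the 52 outcomes are cards, letters, or pixels; that indifference to physical embodiment is exactly why it cannot substitute for thermodynamic entropy.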

This is not to say that the extremely general mathematics of information theory cannot be modified ad hoc and further specifically constrained to yield results that are identical to Gibbs' or Boltzmann's relations [6]. This may be important theoretically, but it is totally immaterial here; such a modification simply supports conventional thermodynamic results without changing them — neither a lesser nor a greater thermodynamic entropy. The point is that information "entropy", in all of its myriad nonphysicochemical forms as a measure of information or abstract communication, has no relevance to the evaluation of thermodynamic entropy change in the movement of macro objects, because such information "entropy" does not deal with microparticles whose perturbations are related to temperature.3 Even very competent chemists and physicists have become confused when they have melded or mixed information "entropy" into their consideration of physical thermodynamic entropy. This is shown by the results in textbooks and by the lectures of professors found on the Internet.1

Overall then, how did such an error (concerning entropy changes in macro objects that are simply moved) become part of mainstream instruction, being repeated in print even by distinguished physicists and chemists? The modern term for distorting a photograph, morphing, is probably the best answer. Correct statements of statistical thermodynamics have been progressively altered so that their dependence on the energetics of atoms and molecules is obliterated for the nonprofessional reader and omitted by some author-scientists.

The morphing process can be illustrated by the sequence of statements 1 to 4 below.

  1. Isolated systems of atoms and molecules spontaneously tend to occupy all available energetic microstates thermally accessible to them and tend to change to any arrangement or macro state that provides more such microstates(4). Thus, spontaneous change is toward a condition of greater probability of energy dispersion. After a spontaneous change, the logarithm of the ratio of the number of available microstates(5) to those in the prior state is related to the system's increase in entropy by a constant, R/N per mole. It is the presence of temperature in R that distinguishes physical entropy from all information "entropy".
  2. Systems of atoms and molecules spontaneously tend to go from a less probable state in which they are relatively "orderly" (few microstates(6), low entropy) to one that is more probable in which they are "disorderly" (many microstates, high entropy).
  3. Spontaneous (natural) processes go from order to disorder and entropy increases. Order is what we see as neat, patterned. Disorder is what we see as messy, random.
  4. Things naturally become disorderly.

Most chemists would read statements 3 and 4 with the implications from statement 1 or 2 automatically present in their thoughts. Undoubtedly, a majority are(7) aware that 3 really applies only to atomic and molecular order and disorder. However, most students and nonscientists lack such a background. As is evident from their writing, some physicists err because they ignore or forget the dependence of physical thermodynamic entropy upon atomic and molecular energetic states.

The following recent quote from a distinguished physicist is in the middle of a discussion of the arrangement of books in a young person's room: "The subjective terms 'order' and 'disorder' are quantified by association with probability, and identified, respectively, with low and high entropy." He then informs his readers that "in the natural course of events the room has a tendency to become more disordered."1 (Italics added.)

The phrase "in the natural course of events" implies to a chemist that energy from some source — the internal energy of a substance in a chemical process, the external energy involved as an agent transports(8) a solid object — can powerfully affect macro things in a room, but is this true for most readers? "Naturally" to many students and nonscientists even has the inappropriate connotation "of course" or "as would be expected". Certainly, it does not properly imply a truly complex set of conditions, such as "in nature, where objects can be pushed around by people or windstorms or hail or quakes and where the substances from which they are made can change if their activation energies are exceeded"!

Thus, errors in texts and lectures have arisen because of two types of category slippage: (i) misapplying thermodynamic entropy evaluations — proper in the domain of energetic atoms and molecules — to visibly static macro objects that are unaltered packages of such microparticles, and (ii) misinterpreting words such as natural (whose common meaning lacks a sense of the external energy needed for any agent to move large visible things).

Why is there no permanent thermodynamic entropy change in a macro object after it has been transported from one location to another or when a group of them is scattered randomly? Thermodynamic entropy changes are dependent on changes in the dispersal of energy in the microstates of atoms and molecules. A playing card or a billiard ball or a blue sock is a package, a sealed closed system, of energetic microstates whose numbers and types are not changed when the package is transported to a new site from a starting place. All macro objects are like this. Their relocation to different sites does not create any permanent additional energetic microstates within them. (Any temporary heating effects due to the initiation and cessation of the movement are lost to the environment.) Thus, there is a zero change in their physical entropy as a result of being moved.
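
An order-of-magnitude sketch may fix the point; the 10 J figure below is an assumption of mine for illustration, not a measured value. Suppose a shuffler's muscles degrade roughly that much ATP-derived free energy to heat, released into surroundings near body temperature:

    q_dissipated = 10.0   # J of muscular free energy degraded to heat (assumed, rough)
    T_surr = 310.0        # K, roughly body temperature

    dS_cards = 0.0                       # no change in the cards' accessible microstates
    dS_agent = q_dissipated / T_surr     # Clausius estimate for shuffler + surroundings

    print(f"Delta S (cards)              = {dS_cards} J/K")
    print(f"Delta S (agent/surroundings) ~ {dS_agent:.3f} J/K")

Whatever the true figure, the sign and location of the change are what matter: positive in the agent and its thermal surroundings, zero in the relocated objects.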

 

Acknowledgements

I thank Norman C. Craig and the reviewers for invaluable criticism of the original manuscript.

 

Notes

  1. Singling out individual authors from many could appear invidious. Thus, references to quotations or errors are not listed.
  2. It is important that information "entropy" always be in quotes whenever thermodynamic entropy is mentioned in the same article or book. Otherwise, the unfortunate confusion of the past half-century is amplified rather than attenuated.
  3. It has been said that an information "entropy" equation — compared to those for thermodynamic entropy — may look like a duck but, without any empowering thermal energy, it can't quack like a duck or walk like a duck.


 

Literature Cited

  1. Denbigh, K. G. Br. J. Philos. Sci. 1989, 40, 323-332.
  2. Shannon, C. E. Bell System Tech. J. 1948, 27, 379-423, 623-656.
  3. Tribus, M.; McIrvine, E. C. Sci. Am. 1971, 225, 180.
  4. Including: Golembiewski, R. T. Handbook of Organizational Behavior; Dekker: New York, 1993.
  5. Hillman, C. Entropy on the World Wide Web; extensive references to print and WWW sites, primarily information "entropy" but also thermodynamic entropy in the physical sciences. A European mirror site (via China) is at http://www.unibas.ch/mdpi/entropy (accessed June 1999).
  6. Tribus, M. Am. Sci. 1966, 54, 201-210.

 

Copyright © 1999 The Division of Chemical Education, Inc.,
of The American Chemical Society.

 


Corrections and Updates

  1. "plus" should be "or".
  2. "at constant pressure" should be omitted.
  3. "energetic" should be omitted in this and all following expressions of "energetic microstates". It is redundant to prefix the adjective "energetic" to "microstate" because a microstate is always energetic!

    A microstate is the quantum mechanical description of the arrangement, at a given instant, of all the motional energies of all the molecules of a system on the energy levels that are accessible to those energies. Of course, that description is numerical: it is one of the enormous number of such accessible molecular arrangements at a specific temperature, in any one of which the total energy of the system exists at one moment. In the next instant (after even a collision of only two molecules), the arrangement is different, i.e., the system is in a different microstate. The greater the number of microstates that are accessible to the system at its temperature, the greater is its entropy, as per the Boltzmann equation S = kB ln W, where W = the number of accessible microstates. (See also "What is a microstate".)

  4. This sentence should be replaced by three long sentences: "Isolated systems of atoms and molecules change from one accessible microstate to another (another arrangement of the systems' molecular energies on energy levels) each instant, due to changes in the motional energy of their particles when colliding. If a system undergoes a process, such as being heated or increased in volume or mixed with other substances, there are many more accessible arrangements (microstates) for its molecular energies. Such a changed system has more choices of arrangements at the next instant than did the unchanged system — a condition of less localization of the total energy of the system than if there were fewer microstates, a greater dispersal of that total energy." Then follow the next sentences in Statement 1 of the morphing process: "Thus, spontaneous change is toward a greater probability of energy dispersion, etc."
  5. "available microstates" is better expressed as "accessible microstates".
  6. "few microstates" is misleading. There are no molar systems above 0 K that have few microstates. Even every femtomole system above 0 K has far more than astronomical numbers of microstates. Read "'Order to disorder' for entropy change" for details.
  7. "are" should be "is".
  8. "transports" should be "that transports".

These "Corrections and Updates" are linked from the parenthesized superscripts in this article.

 

entropy.lambert@gmail.com
Originally published: J. Chem. Educ. 1999 76 1385
Last revised and updated: November 2005

 
