Note for revised edition: The following article was originally published in the Journal of Chemical Education in 2002, vol. 79, pp. 187-192. Original notes are indicated by superscripts while corrections and updates are indicated by superscripts in parentheses.



Disorder — A Cracked Crutch For Supporting Entropy Discussions

This article decries the use of "disorder" in teaching beginning students about thermodynamic entropy.  It is cautionary rather than proscriptive about the wary use of "disorder" as a device for assessing entropy change in advanced work or among professionals.1

Introduction

To aid students in visualizing an increase in entropy, many elementary chemistry texts use artists' before-and-after drawings of groups of "orderly" molecules that become "disorderly".  This has been an anachronism ever since the ideas of quantized energy levels were introduced in elementary chemistry.  "Orderly-disorderly" seems to be an easy visual support, but it can be so grievously misleading as to be characterized as a failure-prone crutch rather than a truly reliable, sturdy aid.2

After mentioning the origin of this visual device in the late 1800s and listing some errors in its use in modern texts, I will build on a recent article by Daniel F. Styer, which succinctly summarizes objections from statistical mechanics to characterizing higher entropy conditions as disorderly (1).  Then, after citing many failures of "disorder" as a criterion for evaluating entropy — all educationally unsettling, a few serious — I will urge the abandonment of order-disorder in introducing entropy to beginning students.  Although it seems plausible, it is vague and potentially misleading, a non-fundamental description that does not point toward calculation or elaboration in elementary chemistry, and an anachronism since the introduction of portions of quantum mechanics in first-year textbooks.3

Entropy's nature is better taught by first describing its dependence on the dispersion of energy (in classical thermodynamics), and the distribution of energy among a large number of molecular motions, relatable to quantized states, microstates (in molecular thermodynamics).4  Increasing amounts of energy dispersed among molecules result in increased entropy that can be interpreted as molecular occupancy of more microstates.  (High-level first-year texts could go further to a page or so of molecular thermodynamic entropy as described by the Boltzmann equation.)

The History and Use of "Disorder" to Characterize Entropy

As is well known, in 1865 Clausius gave the name "entropy" to a unique quotient for the process of a reversible change in thermal energy divided by the absolute temperature (2).  He could properly focus only on the behavior of chemical systems as macro units because in that era there was considerable doubt even about the reality of atoms.  Thus, the behavior of molecules or molecular groups within a macro system was totally a matter of conjecture (as Rankine unfortunately demonstrated in postulating "molecular vortices") (3).  Later in the nineteenth century, but still prior to the development of quantum mechanics, the greater "disorder" of a gas at high temperature compared to its distribution of velocities at a lower temperature was chosen by Boltzmann to describe its higher entropy (4).  However, "disorder" was a crutch, i.e., it was a contrived support for visualization rather than a fundamental physical or theoretical cause for a higher entropy value.  Others followed Boltzmann's lead: Helmholtz in 1882 called entropy "Unordnung" (disorder) (5), and Gibbs Americanized that description with "entropy as mixed-up-ness", a phrase found posthumously in his writings (6) and subsequently used by many authors.

Most general chemistry texts today still lean on this conceptual crutch of order-disorder, either slightly with a few examples or as a major support that too often fails by leading to extreme statements and overextrapolation.  The most egregious errors in the past century of associating entropy with disorder have occurred simply because disorder is a common language word with non-scientific connotations.  Whatever Boltzmann meant by it, there is no evidence that he used disorder in any sense other than strict application to molecular energetics.  But over the years, popular authors have learned that scientists talk about entropy in terms of disorder, and thereby entropy has become a code word for the "scientific" interpretation of everything disorderly from drunken parties to dysfunctional personal relationships,5 and even the decline of society.6

Of course, chemistry instructors and authors would disclaim any responsibility for such absurdities.  They would insist that they never have so misapplied entropy, that they used disorder only as a visual or conceptual aid for their students in understanding the spontaneous behavior of atoms and molecules, entropy-increasing events.

But it was not a social scientist or a novelist — it was a chemist — who discussed entropy in his textbook with "things move spontaneously [toward] chaos or disorder".7  Another wrote, "Desktops illustrate the principle [of] a spontaneous tendency toward disorder in the universe…".7   It is nonsense to describe the "spontaneous behavior" of macro objects in this way: that things like sheets of paper, immobile as they are, behave like molecules, despite the fact that objects' actual movement is non-spontaneous and is due to external agents such as people, wind, and earthquakes.  That error has been adequately dismissed (7).  The important point here is that this kind of mistake is fundamentally due to a focus on disorder rather than on the correct cause of entropy change, energy flow toward dispersal.  Such a misdirected focus leads to the kind of hyperbole one might expect from a science-disadvantaged writer, "Entropy must therefore be a measure of chaos", but this quote is from an internationally distinguished chemist and author.7, 8

Entropy is not disorder.  Entropy is not a measure of disorder or chaos.  Entropy is not a driving force.  Energy's diffusion, dissipation, or dispersion in a final state compared to an initial state is the driving force in chemistry.  Entropy is the index of that dispersal within a system and between the system and its surroundings.4  In thermodynamics, entropy change is a quotient that measures the quantity of the unidirectional flow of thermal energy by ΔS ≥ q/T.  An appropriate paraphrase would be "entropy change measures energy's dispersion at a stated temperature".  This concept of energy dispersal is not limited to thermal energy transfer between system and surroundings.  It includes redistribution of the same amount of energy in a system, e.g., when a gas is allowed to expand into a vacuum container, resulting in a larger volume.  In such a process where q is zero, the total energy of the system has become diffused over a larger volume and thus an increase in entropy is predictable.  (Some call this an increase in configurational entropy.)(1)
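
That quotient can be evaluated directly for a familiar reversible process such as the melting of ice at its equilibrium temperature.  Below is a minimal numerical sketch (not from the original article; the enthalpy of fusion is a standard handbook value):

    # Entropy change for reversibly melting one mole of ice:
    # delta_S = q_rev / T, where q_rev is the molar enthalpy of fusion.
    q_rev = 6007.0   # J/mol, enthalpy of fusion of ice (handbook value)
    T = 273.15       # K, equilibrium melting point of ice

    delta_S = q_rev / T
    print(f"dS(fusion) = {delta_S:.1f} J/(K mol)")   # ~22.0 J/(K mol)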

From a molecular viewpoint, the entropy of a system depends on the number of distinct microscopic quantum states, microstates, that are consistent with the system's macroscopic state.  (The expansion of a gas into an evacuated chamber mentioned above is found, by quantum mechanics, to be an increase in entropy that is due to more microstates being accessible because the spacing of energy levels decreases in the larger volume.)  The general statement about entropy in molecular thermodynamics can be: "Entropy measures the dispersal of energy among molecules in microstates.  An entropy increase in a system involves energy dispersal among(2a) more microstates in the system's final state than in its initial state."  It is the basic sentence to describe entropy increase in gas expansion, mixing, crystalline substances dissolving, phase changes, and a host of other phenomena now inadequately described by "disorder" increase.

In the next section the molecular basis for thermodynamics is briefly stated.  Following it are ten examples to illustrate the confusion that can be engendered by using "disorder" as a crutch to describe entropy in chemical systems.

The Molecular Basis of Thermodynamics 

            The four paragraphs to follow include a paraphrase of Styer's article, "Insight into entropy" in the American Journal of Physics (1).9

            In statistical mechanics, many microstates usually correspond to any single macrostate.  (That number is taken to be one for a perfect crystal at absolute zero.)  A macrostate is measured by its temperature, volume, and number of molecules; a group of molecules in microstates ("molecular configurations", a microcanonical ensemble) by their energy, volume, and number of molecules.

            In a microcanonical ensemble the entropy is found simply by counting:  One counts the number W of microstates that correspond to the given macrostate,10 and computes the entropy of that macrostate by Boltzmann's relationship, S = kB ln W, where kB is Boltzmann's constant.11

            Clearly, S is high for a macrostate when many microstates correspond to that macrostate, whereas it is low when few microstates correspond to the macrostate.  In other words, the entropy of a macrostate measures the number of ways in which a system can be different microscopically (i.e., molecules be very different in their energetic distribution) and yet still be a member of the same macroscopic state.

To put it mildly, considerable skill and wise interpretation are required to translate this verbal definition into quantitative expressions for specific situations.  (Professor Styer's article describes some conditions for such evaluations and calculations.)  Nevertheless, the straightforward and thoroughly established conclusion is that the entropy of a chemical system is a function of the multiplicity of molecular energetics.  From this, it is equally straightforward that an increase in entropy is due to an increase in the number of microstates in the final macrostate.  This modern description of a specifiable increase in the number of microstates (or better, groups of microstates)(2b) contrasts greatly with any common definition of "disorder", even though "disorder" was the best Boltzmann could envision in his time for the increase in gas velocity distribution.
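
Because S = kB ln W, an entropy change depends only on the ratio of the microstate counts, ΔS = kB ln (Wfinal/Winitial).  A minimal sketch of that arithmetic for the doubling of a gas's volume follows; the counting shortcut used here (each molecule independently gains a binary choice of half-volume, so W multiplies by 2 to the power N) is a standard textbook simplification, not a calculation from this article:

    import math

    k_B = 1.380649e-23    # Boltzmann's constant, J/K
    N_A = 6.02214076e23   # Avogadro's number, 1/mol

    # When an ideal gas doubles its volume, each molecule can be in either
    # half of the new volume, so W multiplies by 2**N and
    # delta_S = k_B * ln(W_final / W_initial) = N * k_B * ln 2.
    N = N_A                           # one mole of gas
    delta_S = N * k_B * math.log(2)   # J/K
    print(f"dS = {delta_S:.2f} J/K")  # ~5.76 J/K, i.e., R ln 2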

There is no need today to confuse students with nineteenth century ad hoc ideas of disorder or randomness and from these to create pictures illustrating "molecular disorder".  Any valid depiction of a spontaneous entropy change must be related to energy dispersal on a macro scale or to an increase in the number of accessible microstates on a molecular scale.

Examples of "Disorder" as a Broken Crutch for Supporting Illustrations of Entropy

1. Entropy Change in a Metastable Solid-Liquid Mixture (1)

            This example, a trivial non-issue to chemists who see phenomena from a molecular standpoint and always in terms of system plus surroundings, can be confusing to naive adults or beginning chemistry students who have heard that "entropy is disorder".  It is mentioned only to illustrate the danger of using the common language word disorder.

An ordinary glass bowl of water with cracked ice floating in it portrays macro disorder: irregular pieces of a solid in a liquid.  Yet, the spontaneous change in the bowl contents is toward an apparent order:  In a few hours there will be only a homogeneous transparent liquid.  Of course, the dispersal of energy from the warmer room surroundings to the ice in the system is the cause of its melting.  However, to the types of individuals mentioned, who have little knowledge of molecular behavior and no habit of evaluating possible energy interchange between a system and its surroundings, this ordinary life experience can be an obstacle to understanding.  It will be especially so if disorder, as visible non-homogeneity or mixed-up-ness, is fixed in their thinking as a sign of spontaneity and entropy increase.  Thus, in some cases, with some groups of people, this weak crutch can be more harmful than helpful.

            A comparable dilemma (to those who have heard only that "entropy is disorder" and that it spontaneously increases over time) is presented when a vegetable oil is shaken with water to make a disorderly emulsion of oil in water (8b).  However (in the absence of an emulsifier), this metastable mixture will soon separate into two "orderly" layers.  Order to disorder?  Disorder to order?  These are neither fundamental criteria nor driving forces.  It is the chemical and thermodynamic properties of oil and of water that determine such phase separation.

            The following examples constitute significantly greater challenges than do the foregoing to the continued use of disorder in teaching about entropy.

2. Expansion of a Gas into a Vacuum (7)

            When this spontaneous process is portrayed in texts with little dots representing molecules, as in Figure 1, the use of "disorder" as an explanation to students for an entropy increase becomes either laughable or an exercise in tortuous rationalization.  Today's students may instantly visualize a disorderly mob crowded into a group before downtown police lines.  How is it that the mob becomes more disorderly if its individuals spread all over the city?  The professor who responds with his/her definition must realize that he/she is particularizing a common word that has multiple meanings and even more implications.  As was well stated: "We cannot therefore always say that entropy is a measure of disorder without at times so broadening the definition of "disorder" as to make the statement true by [our] definition only" (10).

            Furthermore, the naïve student who has been led to focus on disorder increase as an indicator of entropy increase, and is told that ΔS is positive in Figure 1, could easily be confused in several ways.  For example, there has been no change in the number of particles (or the temperature or q), so the student may conclude that entropy increase is intensive (besides the Clausius equation's being "erroneous", with q = 0).  The molecules are more spread out, so entropy increase looks as if it were related to a decrease in concentration.  Disorder as a criterion of entropy change in this example is worse than even a double-edged sword.

How much clearer it is to say simply that if molecules can move in a larger volume, this allows them to disperse their original energy more widely in that larger volume and thus their entropy increases.  Alternatively, from a molecular viewpoint, in the larger volume there are more closely spaced — and therefore more accessible — microstates(3) for translation without any change in temperature. 

In texts or classes where the quantum mechanical behavior of a particle in a box has been treated, the expansion of a gas with N particles can be described in terms of microenergetics.  Far simpler for other classes is the example of a particle of mass m in a one-dimensional box of length L (where n is an integer, the quantum number, and h is Planck's constant):  E = n²h²/8mL².  If L is increased, the possible energies of the single particle get closer together.  As a consequence, if there were many molecules rather than one particle, the density of the states available to them would increase with increasing L.  This result holds true in three dimensions: the microstates become closer together, more accessible to molecules within a given span of energy.(4)
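
A short numerical sketch of that scaling (the box lengths and the use of a helium-atom mass are illustrative choices, not values from the article):

    # Energy levels E = n^2 h^2 / (8 m L^2) for a particle in a 1-D box:
    # lengthening the box shrinks the spacing between adjacent levels.
    h = 6.62607015e-34   # Planck's constant, J s
    m = 6.646e-27        # mass of a helium atom, kg (illustrative choice)

    def level(n, L):
        return n**2 * h**2 / (8 * m * L**2)   # energy in joules

    for L in (1e-9, 2e-9):                    # a 1 nm box, then a 2 nm box
        gap = level(2, L) - level(1, L)       # spacing of the two lowest levels
        print(f"L = {L:.0e} m: E2 - E1 = {gap:.2e} J")
    # Doubling L divides every level, and every gap, by a factor of four.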

3. Doubling the Amount of a Gas or Liquid, in Terms of Disorder

Does any text that uses disorder in describing entropy change dare to put dots representing ideal gas molecules in a square, call that molecular representation disorderly, attach it to another similar square while eliminating the barrier lines, and call the result more disorderly, as in Figure 2?  Certainly the density of the dots is unchanged in the new rectangle, so how is the picture more "disorderly"?  In the preceding Example 2, if the instructor used a diagram involving molecular-dot arrangements, an implication any student could draw was that entropy change was like a chemical concentration change; entropy was therefore an intensive property.  Yet, here in Example 3, the disorder-description of entropy must be changed to the opposite, to be extensive!  With just these two simple examples, the crutch of disorder for categorizing entropy to beginning students can be seen to be broken — not just weak. 

(Generally, as in this example, entropy is extensive.  However, its additivity is not true for all systems (11a).)

4. Monatomic Gases: Massive Versus Light Atoms (1, but with Helium Atoms)

              Helium atoms move much more rapidly than do those of krypton at the same temperature.  Therefore, any student who has been told about disorder and entropy would predict immediately that a mole of helium would have a higher entropy than a mole of krypton, because the helium atoms are ricocheting around so much more wildly in their container.  That of course is wrong.  Again, disorder proves to be a broken crutch to support deductions about entropy.  Helium has a standard state entropy of 126 J/K mol, whereas krypton has the greater S°, 164 J/K mol.

The molecular thermodynamic explanation is not obvious but it fits with energetic considerations, whereas "disorder" does not.  The heavier krypton actually does move more slowly than helium.  However, krypton's greater mass and greater range of momenta result in closer spacing of energy levels and thus more microstates for dispersing energy than in helium.
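
Readers who wish to check those standard entropies can do so with the Sackur-Tetrode equation for an ideal monatomic gas, which contains the atomic mass explicitly.  The following script is an illustrative sketch (the equation and constants are standard; the code is not part of the original article):

    import math

    k_B = 1.380649e-23     # Boltzmann constant, J/K
    h = 6.62607015e-34     # Planck constant, J s
    N_A = 6.02214076e23    # Avogadro constant, 1/mol
    R = k_B * N_A          # gas constant, J/(K mol)

    def sackur_tetrode(molar_mass, T=298.15, P=1.0e5):
        """Standard molar entropy of an ideal monatomic gas, J/(K mol)."""
        m = molar_mass / N_A                            # mass of one atom, kg
        lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength, m
        v = k_B * T / P                                 # volume per atom, m^3
        return R * (math.log(v / lam**3) + 2.5)

    print(f"He: {sackur_tetrode(4.0026e-3):.0f} J/K mol")   # ~126
    print(f"Kr: {sackur_tetrode(83.80e-3):.0f} J/K mol")    # ~164

Because only the mass differs between the two calculations, the comparison isolates exactly the effect described above: heavier atoms have more closely spaced translational energy levels and hence a larger standard entropy.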

5. The Crystallization of Supercooled Water, A Metastable System

NOTE: In this example and the one that follows, students are confused about associating entropy with order arising in a system only if they fail to consider what is happening in the surroundings (and that this includes the solution in which a crystalline solid is precipitating, prior to any subsequent transfer to the environment).  Thus, they should repeatedly be reminded to think about any observation as part of the whole, the system plus its surroundings. When orderly crystals form spontaneously in these two examples, focusing on entropy change as energy dispersal to or from a system and its surroundings is clearly a superior view to one that depends on a superficiality like disorder in the system (even plus the surroundings).  Example 7 is introduced only as a visual illustration of the failure of order-disorder as a reliable indicator of entropy change in a complex system.

            Students who believe that spontaneous processes always yield greater disorder could be somewhat surprised when shown a demonstration of supercooled liquid water at many degrees below 0 °C.  The students have been taught that liquid water is disorderly compared to solid ice.  When a seed of ice or a speck of dust is added, crystallization of some of the liquid is immediate.  Orderly solid ice has spontaneously formed from the disorderly liquid.

            Of course, thermal energy is evolved in the process of this thermodynamically metastable state changing to one that is stable.  Energy is dispersed from the crystals, as they form, to the surrounding liquid, and thus the final temperature of the ice crystals and liquid water is higher than originally.  This the instructor ordinarily would point out as a system-surroundings energy transfer.  However, the dramatic visible result of this spontaneous process is in conflict with what the student has learned about the trend toward disorder as a test of spontaneity.  Such a picture might not take a thousand words of interpretation from an instructor to be correctly understood by a student, but no words would be needed at all if the misleading relation of disorder with entropy had not been mentioned.

6. The Crystallization of Supersaturated Solutions, Metastable Systems (9, 12)

            In many texts the dissolving of a crystalline solid in water is shown in a drawing as an increase in disorder among the ions or molecules in the solid and the drawing is said to illustrate an increase in entropy.  In general, solutions are described as having a higher entropy than a crystalline solid in contact with water prior to any dissolution.  Thus, a demonstration involving a supersaturated solution of sodium sulfate is unsettling to students who have been erroneously assured that spontaneous processes always move in the direction of increased disorder. 

Either by jarring the flask containing the clear supersaturated solution at room temperature or by adding a crystal of sodium sulfate, the flask of "disorderly" solution spontaneously becomes filled with "orderly" crystals.  Furthermore, the flask becomes cool.  A student who has been conditioned to think in terms of order and disorder is not just confused but doubly confused:  Orderly crystals have formed spontaneously and yet the temperature has dropped.  Disorder has not only spontaneously changed to order but the change was so favored energetically that thermal energy was taken from the surroundings.   ("Triply confused" might describe a student who is focused on order and disorder rather than on energetics and the chemistry in comparing Examples 5 and 6.  In 5, the temperature rises when supercooled water crystallizes to ice because of thermal energy evolution (and energy dispersal to the surroundings) during crystal formation in a monocomponent liquid-solid system.  In 6, crystallization of the sodium sulfate from aqueous solution results in a temperature drop because anhydrous sodium sulfate is precipitating; it is one of the minority of solutes that decrease in solubility with temperature increase.  Thus, energy is dispersed to the solid system from the solution surroundings as the sodium sulfate forms from the metastable supersaturated solution.)  No convoluted, verbalism-dependent discussion of order-disorder is needed.

7. Liquid Crystals (1)

            If students are shown drawings of the arrangements of rod-like molecules that form liquid crystals, they would readily classify the high temperature liquid "isotropic" phase as disorderly, the liquid "nematic" phase (in which the molecules are oriented but their spatial positions are still scattered) as somewhat orderly, and the liquid "smectic" phase (wherein the molecules are not only oriented but tend to lie in sheets or planes) as very orderly.  The solid crystal, of course, would be rated the most orderly.

            Subsequently, the students would yawn when told that a hot liquid crystal (isotropic) of a particular composition (with the acronym of 6OCB) changes into a nematic phase when the liquid is cooled.  Disorderly to more orderly, what could be more expected when liquid crystals drop in temperature?  Cooling the nematic phase then yields the even more orderly smectic phase.  Yawn.  Continuing to cool the 6OCB now forms the less orderly nematic phase again.  The students may not instantly become alert at hearing this, but most instructors will:  the pictorially less orderly nematic phase has more entropy than the smectic phase at some temperatures and less entropy at others.

Entropy is not dependent on disorder.

8. Microvisible Order in Small Particles Caused by Random Motion of Molecules

            Recent publications have thoroughly established that order in groups of small particles, easily visible under a low-power microscope, can be caused spontaneously by Brownian-like movement of smaller spheres that in turn is caused by random molecular motion (13-16).  These findings therefore disprove the old qualitative idea that disorder or randomness is the inevitable outcome of molecular motion, a convincing argument for abandoning the word disorder in discussing the subject of entropy. 

Only a select few references are given here.  Their proof that entropy can increase in a process that at the same time increases geometric order was the clinching evidence for a prominent expert in statistical mechanics to discard order-disorder in his writing about entropy.12

9. Disorder Is in the Eye of the Beholder (1)

            Students and professors are usually confident that they can recognize disorder whenever they see it.  Styer found that their confidence was misplaced when he tested their evaluation of "lattice gas models", patterns used in a wide variety of studies of physical phenomena (1).  Lattice gas models can be in the form of a two-dimensional grid in which black squares may be placed.  With a grid of 1225 empty spaces and 169 black squares to be put on it in some arrangement, Styer showed two examples to students and professors and asked them which "had the greater entropy", thereby obtaining their estimate of the more disorderly arrangement.
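
Simple counting shows why no single "still" can settle such a question.  A minimal sketch using the figures just quoted (weighting all placements equally is the standard lattice-gas assumption; the script is illustrative, not from the article):

    import math

    # Number of distinct ways to place 169 black squares on 1225 sites:
    # W = C(1225, 169), the microstate count for this lattice-gas macrostate.
    W = math.comb(1225, 169)

    print(f"ln W = {math.log(W):.0f}")             # ~488, so W ~ 10**212
    print(f"W has {len(str(W))} decimal digits")   # far too many patterns to judge by eye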

Most of those questioned were wrong; they saw "patterns" in the configuration that belonged to the class that would have the greater entropy, not less — the configuration they should have called more disorderly if they truly could discern disorder from the appearance of a single "still" from an enormous stack of such "stills".

            Calculations by Styer evaluated similar diagrams in a book about entropy by a prominent chemist, a book widely read by scientists and non-scientists alike.  His results showed that the five diagrams checked were invalid; they had probably been selected to appear disorderly but were statistically not truly random.  Humans see patterns everywhere.  Conversely, we can easily be fooled into concluding that we are seeing disorder and randomness where there are actually complex patterns.

10. The ad hoc Nature of Disorder as a Descriptor of Entropy

            Besides the failures of disorder as a guide to entropy, a profound objection to the use of disorder as a tool to teach entropy to beginners is its dead-end nature for them.  It does not lead to any quantitative treatment of disorder-entropy relations on their level.  It does not serve as a foundation for more complex theoretical treatments that explain chemical behavior to them.  (Complex applications of order-disorder are not pertinent to the first-year chemistry course, nor to most physical chemistry texts and courses.1)

In contrast, a simple qualitative emphasis on entropy increase as the result of the dispersal of energy generally and the occupancy of more microstates specifically(5) constitutes an informal introduction to statistical mechanics and quantum mechanics.  Later, this can be seen as a natural beginning to substantial work in physical chemistry.  A focus on energy flow as the key to understanding changes in entropy is not a dead-end or an end in itself.  It is seminal for future studies in chemistry.

Discussion

Entropy is seen by many as a mystery, partly because of the baggage it has accumulated in the popular view, partly because its calculation is frequently complex, partly because it is truly an unusual thermodynamic quantity — not just a direct measurement, but a quotient of two measurements: thermal flow divided by temperature.  Qualitative statements such as the following about why or how molecules can absorb precise amounts of macro thermal flow, and about entropy change in a few basic processes, can remove much of the confusion in the minds of students about entropy even if the calculations may not be any easier. 

The Entropy of a Substance (Its Entropy Change from 0 K)

In a macro view the "entropy" of a substance at T is actually an entropy change from 0 K to T: the integral (from 0 K to T) of the increments of thermal energy, each divided by T, that have been dispersed within it from some source — in the absence of phase changes, the integral of Cp/T with respect to T.  It is thus an index of the energy that has diffused into the particular type of substance over that temperature range.  (Heat capacity is a specific "per degree" property; entropy is the integration of that kind of dissipation of energy within a substance from absolute zero, including phase changes and other energy-involving transitions.)
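
In practice that integration is done numerically from tabulated heat capacities.  The sketch below uses invented Cp values for a hypothetical solid with no phase change in the range (real third-law entropies also require a Debye extrapolation below the lowest measured temperature):

    # Third-law entropy from heat-capacity data: S(T2) - S(T1) is the
    # integral of Cp/T dT.  The Cp values here are invented for illustration.
    T_pts = [10, 50, 100, 150, 200, 250, 298.15]       # K
    Cp_pts = [0.4, 8.0, 17.0, 21.5, 23.8, 25.1, 25.9]  # J/(K mol)

    # Trapezoidal integration of Cp/T over the tabulated range.
    S = sum(0.5 * (Cp_pts[i] / T_pts[i] + Cp_pts[i + 1] / T_pts[i + 1])
            * (T_pts[i + 1] - T_pts[i])
            for i in range(len(T_pts) - 1))
    print(f"S(298.15 K) - S(10 K) = {S:.1f} J/(K mol)")   # ~37 J/(K mol)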

In a micro view the quantity of energy that can be dispersed in the molecules of a substance is clearly due to their state — warm solids have absorbed or "soaked up" more energy than cold solids because their molecules vibrate back and forth more in the warm solid crystals.  Liquids have dissipated or "soaked up" more than solids due to their moving more freely and rotating (and perhaps vibrating within themselves).  Even more energy has been dispersed in gases than in liquids, in occupying some of the even greater number of microstates that are available(6) in the gaseous state.  With this kind of view (rather than a focus on disorder), the entropy of molecules in a solid, liquid or gas phase loses its mystery.  The amount of energy dispersed within a substance at T (measured by its entropy) is the amount its molecules need to move normally at that temperature, i.e., to occupy energetic states consistent with that temperature.  In reactions, the ΔS between products and reactants measures precisely how much more or how much less energy/T is required for the existence of the products than the original reactants at the stated T.

Some Entropy Changes in Processes

The expansion of a gas into a vacuum has been described in an initial section of this article and in Example 2.  It is especially important because it is troublesome to most students when they are told that there is no thermal energy transferred, and thus no q to "plug in" to q/T.  Therefore, this is a useful teaching tool because students under slight stress are more receptive to learning the idea of measuring entropy change by restoring the system to its original state via a slow compression, the reversible inverse of the expansion.  However, as detailed in Example 2, it should also be the occasion for emphasizing that entropy change is fundamentally an increase in the dispersion of energy and for introducing a molecular thermodynamic explanation for the increase in entropy with volume:  a greater dispersion or spreading of the original, unchanged amount of energy within the system because there are more closely spaced microstates(7) in the new larger volume.

Entropy change in a number of other basic processes can be seen to be related to that in the expansion of a gas.  The mixing of different ideal gases and of liquids fundamentally involves an expansion of each component in the phase involved.  (This is sometimes called a configurational entropy change.)  Of course the minor constituent is most markedly changed as its entropy increases, because its energy is now more spread out or dispersed in the microstates of a considerably greater volume than in its original state.  (The "Gibbs Paradox" of zero entropy change when samples of the same ideal gas are mixed is no paradox at all in quantum mechanics, where the numbers of microstates in a macrostate are enumerated, but it will not be treated here.)
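
A short sketch of the mixing case for ideal gases (the amounts are arbitrary illustrative choices, not from the article): each component's entropy increases as if it alone had expanded into the total volume, and the minor constituent gains the most per mole.

    import math

    R = 8.314  # gas constant, J/(K mol)

    def ideal_mixing_entropy(moles):
        """dS_mix = -R * sum(n_i ln x_i); each gas expands into the whole volume."""
        n_total = sum(moles)
        return -R * sum(n * math.log(n / n_total) for n in moles)

    # 0.1 mol of gas A mixed with 0.9 mol of gas B at the same T and P:
    print(f"total dS_mix = {ideal_mixing_entropy([0.1, 0.9]):.2f} J/K")  # ~2.70
    print(f"per mole of A: {-R * math.log(0.1):.1f} J/(K mol)")          # ~19.1
    print(f"per mole of B: {-R * math.log(0.9):.2f} J/(K mol)")          # ~0.88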

This is not the occasion for an extensive presentation of the use of energy dispersal in evaluating entropy change.  These significant examples of the nature of entropy content of a substance and of the expansion of gases are indications of the utility of the concept in contrast to the verbalisms of order and disorder.

Disorderly molecules can be shown as scattered dots in an artist's drawing.  This kind of clear picture can be equally clearly misleading in teaching the concept of entropy.  Entropy change is an energy-dependent property.  That is its essence.  It cannot be grasped by an emphasis on the neatness of before and after snapshots of dots in space.  

As discussed here in detail, disorder as a description of entropy fails because it was introduced as an aid in an era when the nature of molecules and their relation to energy was just emerging.  In the 21st century these relationships are relatively well known.  Disorder, a conceptual crutch for supporting a new idea in the 19th century, is now not only needless but harmful because it diverts students' attention from the driving force that is the cause of spontaneous change in chemistry, energy dispersal to(8) more microstates than were present in the initial state.  Disorder is too frequently fallible and misleading to beginners.  It should be abandoned in texts and replaced with a simple introduction to entropy change as related to the dissipation of energy and to, at least, a qualitative view of the relation of microstates to a system's macroscopic state.(9)

 

Acknowledgements

            I thank Ralph Baierlein (17) and Norman Craig for their invaluable aid and Brian Laird (14) and Walter T. Grandy, Jr. (11) for their support for this article.  The suggestions of the reviewers were substantial and essential in improving the manuscript.

Notes

  1. Rigorous order-disorder theory is soundly applied to gas adsorption, binary solutions, alloys, and ferromagnetism in ref 18, simplified for some examples in ref 19, and treated in depth in ref 20.  An order-disorder treatment of the entropy of fusion of organic compounds is in ref 21.
  2. At the conclusion of a fundamental analysis of Gibbs vs. Boltzmann entropies, E. T. Jaynes says that statements "that 'entropy measures randomness' are … totally meaningless, and present a serious barrier to any real understanding" (22).  Dingle concludes that "pictures of entropy as disorder are a most inessential visualization which has probably done much more harm than good" (23).  Wright (9), Denbigh (8), and McGlashan (12) support these statements in extended discussions and with examples that are mentioned later in the present article.  Recently, Bartell has presented a P/T diagram of liquid helium indicating that below 0.3 K it must be heated to crystallize (24), and cites Brostow's vigorous characterization of entropy-disorder as a "persistent myth" (25).
  3. Reference (26) is cited in a spirit of urging modernization in introducing entropy to students, not in the pejorative sense it was first used in the Journal of the American Chemical Society (ref no. 8 in 27).
  4. Physicist Harvey S. Leff has developed a thorough theoretical foundation for the concept, calling it "Thermodynamic entropy: The spreading and sharing of energy" (28).
  5. Thomas Pynchon's early short story "Entropy" (29) and his later manic novels probably convinced more people than anyone before him that entropy was an apt metaphor for every conceivable human situation that could be called disorderly. 
  6. Brooks Adams found the second law his foundation for the "Law of Civilization and Decay" (30), and his better-known brother Henry saw it as an agent of social decline and foretold a state of chaos (31).  In his best-selling (and ostensibly non-fictional) "Entropy", Rifkin's absurd "the life and death of new organisms increase the entropy of the earth, meaning that less available matter [sic, italics inserted] exists for the unfolding of life in the future" (32, p 38) ultimately soars toward "Everywhere we look, the entropy of our world is reaching staggering proportions. We have become creatures struggling to maintain ourselves in the midst of growing chaos" (32, p 205).
  7. Singling out individual textbook authors who may be innocently repeating century-old embedded errors could appear invidious.  Thus, references to quotations are not listed.
  8. Admittedly, no chemist has ever gone so far as some writers who are quoted by a philosopher as saying that "we [will] eventually be able to find some way of using all the unavailable energy known as entropy", and "the universe recycles every bit of the entropy it produces" (33).
  9. Styer's brief statistical mathematical description, analysis of erroneous depiction of entropic situations via small squares in a planar grid, and extensive annotated references are useful (1).  I respectfully disagree with his conclusions (not mentioned in this present article), namely, that adding a new idea ("freedom") as analogy, when combined with "disorder" and used cautiously and not too literally, will provide considerable insight in qualitatively understanding entropy.  In my opinion, the introduction of another common language word to be used with disorder, no matter how carefully re-defined, would lead to augmented confusion in the discussion of entropy by students, professors, and popular writers.
  10. More precisely, the number of states in a chosen small energy range.
  11. Boltzmann never wrote the equation in this form attributed to him, nor did he mention a proportionality constant k.  It was Planck who, in December 1900, found that a constant k, equal to R/N, united Clausius' S with Boltzmann's logarithm of the probability of a state.  However, Planck graciously did not object that his newly defined fundamental k was called Boltzmann's constant and designated as kB (34).
  12. W. T. Grandy, Professor of Physics, Emeritus, University of Wyoming, Laramie, WY. (11).  Personal Communication, 2000.

Literature Cited

  1. Styer, D. F. Am. J. Phys. 2000, 68, 1090-1096.
  2. Clausius, R. (Poggendorff's) Ann. der Phys. und Chem. 1865, 125 (2), 353-399.
  3. Rankine, W. J. M. Proc. Roy. Soc. Edinburgh 1850, 2, 275-286.
  4. Boltzmann, L. Lectures on Gas Theory; Brush, S. G, Translator; University of California Press: Berkeley, CA, 1964; pp 442-443.
  5. Helmholtz, H. Wissenschaftliche Abhandlungen, Vol. 2; Johann Barth: Leipzig, 1883; p. 972.
  6. Gibbs, J. W. Collected Works, Vol. 1; Yale University Press: New Haven, CT, 1928; p. 418.
  7. Lambert, F. L. J. Chem. Educ. 1999, 76, 1385-1387.
  8. (a) Denbigh, K. G.; Denbigh, J. S. Entropy in relation to incomplete knowledge; Cambridge University Press: New York, 1985; pp 43-44. (b) Denbigh, K. G. Br. J. Philos. Sci. 1989, 40, 328-332.
  9. Wright, P. G. Contemp. Phys. 1970, 11, 581-588.
  10. Darrow, K. K. Am. J. Phys. 1944, 12, 183-196; p 195.
  11. (a) Grandy, W. T., Jr. Some Issues in Thermodynamics & Statistical Mechanics. VI. On the Presumed Extensive Character of Entropy. (b) VIII. Some Secondary Issues, Order and Disorder.
  12. McGlashan, M. L. J. Chem. Educ. 1966, 43, 231-232; Chemical Thermodynamics; Academic Press: New York, 1979; pp 111-113.
  13. Adams, M.; Dogic, Z.; Keller, S.L.; Fraden, S. Nature 1998, 393, 349-352 and references therein.
  14. Laird, B. B. J. Chem. Educ. 1999, 76, 1388-1390.
  15. Dinsmore, A. E.; Wong, D. T.; Nelson, P.; Yodh, A. G. Phys. Rev. Letters 1998, 80, 409-412.
  16. Frenkel, D.  Phys. World 1993, 6, 24-25.
  17. Baierlein, R. Thermal Physics; Cambridge University Press: New York, 1999.
  18. Honig, M. J. J. Chem. Educ. 1961, 38, 538-548.
  19. Honig, M. J. J. Chem. Educ. 1980, 57, 557-559.
  20. Hijmans, J.; De Boer, J. Physica 1955, 21, 471-484, 485-498, 499-516.
  21. Gilson, D. F. R. J. Chem. Educ. 1992, 69, 23-25.
  22. Jaynes, E. T. Am. J. Phys. 1965, 33, 391-398.
  23. Dingle, H. Bull. Inst. Phys. 1959, 10, 218-219.
  24. Bartell, L. S. J. Chem. Educ. 2001, 78, 1059-1067.
  25. Brostow, W. Science 1972, 178, 121.
  26. St. Paul, First Epistle to the Corinthians. A. D. 58, Chap. 13, 9-11.
  27. Swain, C. G.; Pegues, E. E. J. Am. Chem. Soc. 1958, 80, 812-819, Ref. 8.
  28. Leff, H. S. Am. J. Phys. 1996, 64, 1261-1271.
  29. Pynchon, T. Entropy. The Kenyon Review 1960, 22 (2), 277-292.
  30. Adams, B. The Law of Civilization and Decay; Knopf: New York, 1943.
  31. Adams, H. In The Degradation of the Democratic Dogma; Peter Smith: New York, 1949.
  32. Rifkin, J. Entropy; Viking: New York, 1980; (a) p 38; (b) p 205.
  33. McKinney, R. H. Phil. Today 1990, 34 (2), 163-174 (p. 170, refs 3 and 4).
  34. Planck, M. Scientific Autobiography and Other Papers; Philosophical Library: New York, 1949; pp 41-43.

 

Copyright © 2002 The Division of Chemical Education, Inc.,
of The American Chemical Society.

Corrections and Updates

  1. The description "configurational" (or "positional") entropy is seriously misleading, in that it diverts attention from the phenomenon — the motional energy of the gas in a system becoming more widely dispersed in a larger volume when the system is allowed to expand isothermally, resulting in an entropy increase. If this is called an example of "positional entropy" change, the focus is misleadingly on the method of counting the entropy change rather than on what happens. Certainly, in the final calculation, the volume (as V2/V1) appears to be the only variable — as it is, but only because it is the final count of how greatly the molecular motional energy has been dispersed. It is the ratio of increase of microstates. In the cell models of statistical thermodynamics, where one molecule occupies one cell and there are, say, m cells (each representing one microstate), the only factor that changes is m^N ∝ V^N, dependent on the number of original and additional unoccupied cells and N. However, each unoccupied but newly accessible cell represents not just a volume of space or a "position" but an accessible microstate. Therefore, although the final calculation appears to represent only a V increase due to additional spatial "positions", this is conceptually incorrect. The number is a measure of the probability factor in entropy change in terms of microstates. If this fact is not brought out to the student, the essential enabling factor in thermodynamic entropy change — molecular motional energy — is 'hidden' in each cell that was counted and the basic nature of the process misidentified as only "an increase in probable positions."
  2. (a) "An entropy increase in a system involves energy dispersal among more microstates in the system's final state" has incorrect implications. "Among" has the connotation of "shared with others" or "spread out in several…". The total energy of a system is always and only in one arrangement of the energies of its molecules, i.e., one microstate, at a given instant. However, in the next instant that arrangement changes to another microstate and then to another. If a system has more accessible microstates after a process than before it occurred, then at one moment the system in a certain microstate has more choices of ways in which it can be rearranged in the next instant. That is what energy dispersal means on a molecular thermodynamic scale — more microstates in which the system's energy might be at any moment: the opposite of having fewer choices, as would be the case if the system's energy became more localized.
     (b) The phrase "(or better, groups of microstates)" should be omitted.  It is not meaningful.
  3. The concept behind this sentence would be much better stated as: "…in the larger volume the energy levels become more closely spaced — thus, the initial amount of energy spreads out among more energy levels in the larger volume without any change in temperature, no input of q."
  4. The previous two sentences would be improved by omitting consideration of one particle versus many and omitting the words "states" and "microstates", to yield:  "Thus, increasing the volume of a three-dimensional container in which many molecules are moving results in closer energy levels.  This volume increase allows the molecular motional energy to be more widely dispersed because there are so many more accessible energy levels.  In turn this results in a great many more microstates."
  5. The word "specifically" should be replaced by "in molecular thermodynamics".
  6. The previous two sentences about the large amounts of energy that have been dispersed to form liquids from solids and gases from liquids are misleading in focusing on the energy involved in the motion of liquids and gases.  The greatest energy input, of course, is in the phase change itself, not afterward — and this large amount of energy is required to cause intermolecular bond breaking in the prior state.  There is no increased motion in the liquid compared to the solid at the fusion temperature because the temperature does not increase, and similarly so with the vapor compared to the liquid at the vaporization equilibrium temperature.  Thus, the two sentences would be better stated as one:  "Liquids have spread out the greater energy dispersed to them from the surroundings in breaking the bonds that held their particles in the solid at the fusion equilibrium temperature, as have gases changing from liquids at the vaporization equilibrium temperature."
  7. "microstates" should be "energy levels".
  8. "energy dispersal to more" should be "energy dispersal that occurs when there are more".
  9. "dissipation" should be "dispersal", and thus the final sentence should read: "…dispersal of energy and to, at least, a qualitative view of a changed number of microstates in a system's macro state."

 

entropy.lambert@gmail.com
Originally published: J. Chem. Educ. 2002 79 187-192.
Last revised and updated: November 2005

 
