Entropy Sites — A Guide


June 2013

Dr. John M. Garland (M.D. and Ph.D., Manchester University, U.K.) has informed me that his remarkable article -- an analysis of cancer that involves consideration of energy dispersal (entropy increase) and of the spread of cancer to distant sites (in fractal terms) -- has been accepted by the Elsevier journal "Critical Reviews in Oncology/Hematology" for publication in a couple of months. We have communicated frequently over the past two years, but of course the content and its magnitude (51 pages of ms. with 155 references) are totally his accomplishment. (The abstract is at http://dx.doi.org/10.1016/j.critrevonc.2013.04.001.)

He summarized it for me thus:

This article arose from considering that the variety of different entities thought to induce cancer is huge, but that all cancer cells energetically behave identically; they become highly motile, proliferate seemingly without control, and their internal organisation becomes increasingly degenerate. So, if this energetic behavior is universal, why is there not a universal mechanism?

Until now, cancer induction has been viewed as arising from a deranged biochemical pathway, usually by mutation, whose components are arranged linearly and end in cancer-prone targets, for example DNA replication control or cytoskeletal activity. There are problems with this view, for example: on the molecular scale, how different pathways intersect; how many are needed for a given effect; how the myriad biochemical pathways normally “fit together”; and how the universal phenotype progresses and develops. The first clue to cancer universality lies in the apparent switch from regulated structure and behavior to generalized increased dynamic activity at structural expense; as structure degenerates, dynamic activities such as motility and migration, replication, and indeed poor clinical outcomes, increase.

Fundamental to all biochemical pathways is the constant flow of energy in chemical reactions. The interaction of that flow with the environment in which the reactions take place is crucial and inter-dependent. Thus, building cell structure and undertaking dynamic activity are always dependent on energy flow: the greater the energy available, the more likely it is that a reaction will take place. However, in structure-building the overall energy available for dispersal elsewhere becomes “locked in”, while in dynamic activity it is constantly being dispersed or spread out. Cells normally co-ordinate these options through numerous checks and balances – always with the possibility of random distribution of energy in or out of the cell. Thus, an unintended or unusual shift in cellular conditions could potentially favor dynamic activity that promotes not structure-building but random energy dispersal -- already suggesting a universal clue to cancer.

The interior of a cell is astronomically variable, and for a cell to function the environments managing all biochemical pathways must somehow be coherently organized. Nature widely uses a very simple tool, fractal geometry, to generate reiterated “levels” of organization that can deal with huge spatial variation. Clouds and snowflakes are examples: all different, but overall the same shape regardless of size. If we now apply fractal geometry to the distribution of energy flow in cells, the flows self-assemble into fractal “corridors” where the components/reactions are determined by the amount of energy they dissipate, or by minimal energy states. This fractal “map” of energy dispersal is continuously changing according to energy demand, as in melting/freezing snowflakes, the formation of clouds, or the generation of river patterns. The significance of this organization is that all parts are dynamically seamless; new areas can be added at different levels, and an infinite number of components can fit into the same “space” (in the model termed Domains, which at any level are absorbed into the next tier). The video files in the on-line article illustrate the creation of fractals, which can very easily be conceived as energy flows arising from domains. Crucially for this model, every biochemical reaction is connected, in some way, by its pattern of energy flow to every other. Further, pathways need not be linearly ordered nor consecutive, and components self-assemble according to ‘normal’ energetic criteria (see reference 155: Jun & Wright, Nat. Rev. Microbiol. 8, 600, 2011). This provides the key to a universal framework both for normal cell operation and for cancer: in cancer, the fractal network of energy dispersal is simply skewed towards maximizing random dissipation of that energy flow rather than orderly structure and function.
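(A small aside of my own here, not anything from Dr. Garland's article: how easily a trivially simple, partly random iterated rule generates a self-similar fractal can be seen in a few lines of Python -- the classic "chaos game", in which repeated half-jumps toward randomly chosen vertices of a triangle trace out the Sierpinski fractal. A minimal sketch, with all parameters chosen only for illustration:

    import random

    # Chaos game: from any starting point, repeatedly jump halfway toward
    # a randomly chosen vertex of a triangle. The visited points converge
    # onto the Sierpinski triangle -- a self-similar fractal produced by
    # an utterly simple, partly random rule.
    vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
    x, y = 0.25, 0.25                      # arbitrary starting point
    points = []
    for _ in range(50000):
        vx, vy = random.choice(vertices)
        x, y = (x + vx) / 2.0, (y + vy) / 2.0
        points.append((x, y))

    # Coarse text rendering -- no plotting library needed.
    W, H = 64, 32
    grid = [[" "] * W for _ in range(H)]
    for px, py in points:
        grid[H - 1 - int(py * (H - 1))][int(px * (W - 1))] = "*"
    print("\n".join("".join(row) for row in grid))

The same triangular pattern appears at every scale of magnification, regardless of the starting point -- the “same shape regardless of size” property described above.)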

The final element concerns the enormous unpredictable variability within cells; there are so many things going on at once. Chaos theory deals mathematically with unpredictability, and chaos can be linked to fractal geometry through descriptions of unpredictable behaviors which influence fractal formation. If this linkage is applied to fractal entropy, cell organisation is inherently stable overall, because these variables effectively cancel each other. For cancer, however, the situation is different. Cancer-inducing agents all share the property of being permanently active (i.e., they uniformly favor the random dispersal of energy rather than its use in creating structure), whether in a suppressive or an activating mode. The effect is to replace variability with a constant dissipative component and a permanent re-alignment of energy flow toward random dispersal. Over time the network becomes increasingly focused on dissipation, and the activities with the highest dissipative capacity, i.e. dynamic activities, lead to the self-generating progressive disturbances recognized in cancer.

Many aspects of the model are amenable to testing, for example using mathematical modelling or exploring how enzyme systems energetically self-assemble in complex mixes.

[In the Acknowledgements section of Dr. Garland's article, he refers to "Professor F. Lambert (Occidental College, USA)" and then concludes with "This paper is dedicated to Professor Lambert for his work on the fundamental applications of entropy."]

 

November 2012

An ex-student has just referred me to an important article by two professors at a Japanese university: http://onlinelibrary.wiley.com/doi/10.1111/j.1749-6632.2009.05166.x/full#b9. It deals with the errors of many economics experts, from Georgescu-Roegen of 40 years ago up to the present, in their naïve attempts to connect economics to thermodynamics. Paragraph 2.6 in that article – about my view of entropy in thermodynamics – is a neat summary.

Josh Floyd (whose important article about misapplying entropy ‘everywhere’ is cited in the Archives of this website for April 2008 – just scroll down!) had previously castigated Georgescu-Roegen on Amazon.com, awarding his book one star – only because Amazon doesn’t permit 0 stars!:  http://www.amazon.com/The-Entropy-Law-Economic-Process/product-reviews/1583486003/ref=cm_cr_dp_qt_hist_one?ie=UTF8&filterBy=addOneStar&showViewpoints=0

Floyd also correctly evaluated Rifkin’s “Entropy: A New World View” on Amazon.com with Amazon’s minimum one star (i.e., really 0 stars!): http://www.amazon.com/Entropy-A-New-World-View/product-reviews/0670297178/ref=cm_cr_dp_qt_hist_one?ie=UTF8&filterBy=addOneStar&showViewpoints=0

Most important: I have just discovered Floyd’s thought-provoking, continuing web page “Beyond This Brief Anomaly”, at http://beyondthisbriefanomaly.org/

 

October 2012

A statement about the importance of entropy as energy dispersal in a chemical reaction is in the sixth edition of Vollhardt and Schore’s “Organic Chemistry” text:

“If the enthalpy of a reaction depends strongly on changes in bond strength, what is the significance of ΔS°, the entropy change? You may be familiar with the concept that entropy is related to the order of a system: Increasing disorder correlates with an increase in the value of ΔS°. However, the concept of “disorder” is not readily quantifiable and cannot be applied in a precise way to scientific situations. Instead, for chemical purposes, ΔS° is used to describe changes in energy dispersal. Thus, the value of ΔS° increases with increasing dispersal of energy content among the constituents of a system. Because of the negative sign in front of the TΔS° term in the equation for ΔG°, a positive value for ΔS° makes a negative contribution to the free energy of the system. In other words, going from lesser to greater energy dispersal is thermodynamically favorable.

What is meant by energy dispersal in a chemical reaction? Consider a transformation in which the number of reacting molecules differs from the number of product molecules formed. For example, upon strong heating, 1-pentene undergoes cleavage into ethene and propene. This process is endothermic, primarily because a C–C bond is lost. It would not occur, were it not for entropy. Thus, two molecules are made from one, and this is associated with a relatively large positive ΔS°. After bond cleavage, the energy content of the system is distributed over a greater number of particles. At high temperatures, the –TΔS° term in our expression for ΔG° overrides the unfavorable enthalpy, making this a feasible reaction.”
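The sign logic in that passage is worth spelling out (my own summary, not the textbook's). With

    ΔG° = ΔH° − TΔS°

the 1-pentene cleavage has ΔH° > 0 (a C–C bond is lost) and ΔS° > 0 (two molecules from one). ΔG° therefore turns negative – the reaction becomes thermodynamically feasible – only when T > ΔH°/ΔS°, which is exactly why strong heating is required.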

 

September 2012

Professor Leff has not only completed the publication of his five articles in The Physics Teacher mentioned below, but has also made them fully available online at http://www.csupomona.edu/~hsleff/selpubs.html.

Already, one author of a physics text has stated that he will use Leff's approach to entropy in it.

 

February 2012

Professor Leff’s articles

Please click on the Archive section of this site for June 2008 for a full introduction.

Physicist Harvey S. Leff (Professor Emeritus of the California State Polytechnic University, Pomona) published, in 1996, the first article that quantitatively presented the idea of entropy as involving the dispersal or spreading of a system’s energy, with entropy as a measure of such spreading (Am. J. Phys. 1996, 64, 1261-1271). When I learned about it in 1999, I inserted a Note about it in my “Entropy Is A Cracked Crutch” ms. (published in February 2002) and gave it more proper credit as the lead reference in my next article, “Entropy Is Simple, Qualitatively” (October 2002).

Now Professor Leff has begun a series of five articles in The Physics Teacher. As might be expected, because they are intended for physicists, they are far more rigorous mathematically and theoretically than my qualitative approach to entropy. The first article appeared in the January 2012 issue: H. S. Leff, “Removing the Mystery of Entropy and Thermodynamics – Part I”, Phys. Teach. 50, 28-31 (2012); “Part II” appeared in February: 50, 87-90 (2012). The subsequent Parts will be published monthly.

 

Online article – why the concept of entropy was considered so difficult – and why it isn’t.

In October, the editors of an online scientific journal invited me to write a guest editorial. It is online at lambert2011.pdf. I chose what might appear to be an egregious title, "The Conceptual Meaning of Thermodynamic Entropy in the 21st Century", but it is not. As stated in its last lines, a majority of the future leaders in US chemistry by mid-century will have been introduced to entropy via this concept of its being a measure of energy dispersal.

The initial goal of the brief article was to explain why brilliant physicists and chemists of the past century failed to explain entropy clearly – i.e., failed to develop an adequate conceptual explanation for the success of dS = dq/T. Certainly, the ‘driving force’ in this relationship is simple: the nature of q, energy. A common property of all types of energy – including molecular energy – is the tendency to spread out, to disperse in space, if constraints are lessened or removed. It was simply unfortunate that Boltzmann's "disorder" comments of 1898 became the dominant “meaning” of entropy for so long.
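A one-line illustration of dS = dq/T in action: when a quantity of energy q flows from a hotter body at T(hot) to a cooler one at T(cold),

    ΔS(total) = q/T(cold) − q/T(hot) > 0,    since T(cold) < T(hot)

so the spontaneous spreading of energy from hot to cold is always accompanied by a net entropy increase – the dispersal and its measure go hand in hand.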

This journal reports the number of downloads of its articles on its index page. Surprisingly, the number for this editorial exceeded 1300 by late February. (It is a legitimate count! I had suspicions that the Journal might have manipulated the figures, but a friend who can access such arcane matters sent me a list of 800 when I asked him.)

 

January 2012

As I mentioned two years ago, a truly excellent text, the 2nd edition of Burdge’s “Chemistry”, was published. Now, with Dr. J. Overby, Burdge’s text has been revised to present an ‘atoms first’ approach for those who prefer it.
Both texts are unusually good in their depiction of entropy. Not only have all references to entropy as “disorder” been eliminated but, far beyond this, “what entropy is” is uniquely well handled.

Featured throughout each text are 16 two-page “Visualizing Chemistry” spreads (large drawings/illustrations) of important points, many extended to online animations available to student users. The spread in the thermodynamics chapter depicts six examples of system change that are correlated with entropy change and ‘particles in a box’ energy levels – volume increase, temperature increase, change in molecular complexity, change in molar mass, phase change, and chemical reaction. Many good texts attempt to show such differences via energy levels only for volume expansion and temperature increase. None supplement them with animation as do Burdge or Burdge and Overby.

Further, I am impressed by the wide utility of the text for a large span of students’ competence – specifically regarding my focus on the presentation of entropy. From information I have received from instructors in the classroom, almost all students readily accept the concept of entropy as measuring how widely the energy of molecules (or the energy supplied to them) becomes spread out – in space, or among additional particles. However, too many students of average ability ‘get stuck’ when microstates are introduced in any meaningful detail.

Burdge barely mentions the word “microstate” – her ‘particle in a box’ energy levels are left at that, as greater spreading out of energy: volume increase, closer levels; temperature increase, more occupancy of higher levels; and so on. I think this is optimal, because it is clear over the whole span of students’ abilities in gen chem. (And, if desired, in a very short discussion only for students going on to physical chemistry, a basic relation between numbers of energy levels/microstates and entropy change can be elucidated.)
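For readers who want to see the ‘closer levels’ point numerically, here is a minimal Python sketch (my own, not anything from Burdge's text) of the standard particle-in-a-box energies, E(n) = n²h²/(8mL²). Doubling the box length L cuts every level, and every spacing, by a factor of four – the levels crowd together, so the same energy can be spread over more of them:

    # Particle-in-a-box energy levels: E(n) = n^2 h^2 / (8 m L^2).
    # Illustrative only -- an electron mass is used simply to give
    # concrete numbers.
    H_PLANCK = 6.626e-34    # Planck's constant, J*s
    M = 9.109e-31           # electron mass, kg

    def box_levels(L, n_max=4):
        """Energies (J) of the first n_max levels for a box of length L (m)."""
        return [n**2 * H_PLANCK**2 / (8 * M * L**2) for n in range(1, n_max + 1)]

    for L in (1e-9, 2e-9):          # a 1 nm box vs. a 2 nm box
        levels = box_levels(L)
        spacings = [b - a for a, b in zip(levels, levels[1:])]
        print("L = %.0e m" % L)
        print("  levels   (J):", ["%.2e" % E for E in levels])
        print("  spacings (J):", ["%.2e" % d for d in spacings])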

 

November 2011

An article about carbon nanotubes – tiny hollow tubes of carbon atoms (ca. 1/25,000 to 1/100,000 of the thickness of a human hair) – whose property of rapidly absorbing water was found to be related to entropy and enthalpy changes, appeared in a recent issue of the Proceedings of the National Academy of Sciences. (The complete article is available via http://www.pnas.org/content/108/29/11794)
The PNAS abstract below uses the words entropy and enthalpy in a traditional manner. Following it is my expansion of the abstract – in terms of viewing entropy as energy spreading out to the surroundings when hydrogen bonds are broken as water molecules move into minute CNTs (and of seeing an enthalpy effect in one particular CNT size because stronger hydrogen bonds are formed) – which more clearly explains exactly what happens.

Entropy and the driving force for the filling of carbon nanotubes with water

Tod A. Pascal (a,b), William A. Goddard (a,b,1), and Yousung Jung (a,1). (a) Graduate School of Energy, Environment, Water, and Sustainability, Korea Advanced Institute of Science and Technology, Daejeon 305-701, Korea; (b) Materials and Process Simulation Center, California Institute of Technology, Pasadena, CA 91125. Contributed by William A. Goddard, May 25, 2011.

Abstract

The spontaneous filling of hydrophobic carbon nanotubes (CNTs) by water observed both experimentally and from simulations is counterintuitive because confinement is generally expected to decrease both entropy and bonding, and remains largely unexplained. Here we report the entropy, enthalpy, and free energy extracted from molecular dynamics simulations of water confined in CNTs from 0.8 to 2.7 nm diameters. We find for all sizes that water inside the CNTs is more stable than in the bulk, but the nature of the favorable confinement of water changes dramatically with CNT diameter. Thus we find (i) an entropy (both rotational and translational) stabilized, vapor-like phase of water for small CNTs (0.8–1.0 nm), (ii) an enthalpy stabilized, ice-like phase for medium-sized CNTs (1.1–1.2 nm), and (iii) a bulk-like liquid phase for tubes larger than 1.4 nm, stabilized by the increased translational entropy as the waters sample a larger configurational space. Simulations with structureless coarse-grained water models further reveal that the observed free energies and sequence of transitions arise from the tetrahedral structure of liquid water. These results offer a broad theoretical basis for understanding water transport through CNTs and other nanostructures important in nanofluidics, nanofiltrations, and desalination.

My ‘translation’ of the above abstract of fundamental CNT research – in terms of entropy as a measure of energy spreading out into the surroundings during a process, and enthalpy as energy becoming incorporated in stronger chemical bonds:

Water spontaneously flows into extremely small carbon nanotubes (CNTs) – but why should it? What is the ‘driving force’? Carbon does not attract water!

Each molecule of ‘bulk’ or ordinary water (or ice) is about 0.3 nm (nanometers) in diameter and is strongly attracted to two other water molecules by 4 hydrogen bonds, HBs.

Two HBs are due to the H atoms of a particular water molecule being attracted to the lone electron pairs of another water molecule. Two additional HBs are formed by the lone pairs of that ‘particular’ water molecule acting as acceptors, attracted to the H atoms of a third water molecule. However, all water molecules are rapidly and ceaselessly breaking such bonds while instantly forming similar hydrogen bonds with other water molecules as they continually move in any body of water – whether a large tank, a beaker, or a thimble.

Of course, there is no attraction between water molecules and the carbon walls of a CNT, so any water molecules within a CNT that are touching the walls cannot be attracted to them as they were previously hydrogen-bonded to adjacent water molecules. Thus, each water molecule next to the CNT walls must lose one or two hydrogen bonds in entering the tube – and the energy of that loss, as similar water molecules enter, must be spread out to the surroundings: to the tube walls, to outside molecules. Clearly, this process of greater energy dispersal in space compared to bulk water is favored – it is an increase in entropy.


Because of the size restrictions inside the smallest CNTs, e.g., those with a 0.8 nm diameter, only about “two plus” water molecules per cross-section of the CNT, with three or fewer such “cross-sections” per nm of the CNT length, are probable – roughly six molecules per nm, each losing about two hydrogen bonds. This means that the energy of some 12 hydrogen bonds in the water, per nm of the length of the tube, must be dispersed from bulk water if/when it moves into such a CNT – a comparatively large spreading out of energy as/after water molecules enter, and thus a very favored process energetically: the cause of spontaneous movement of water into 0.8 nm CNTs is an entropy increase.


Also, because there are fewer H-bonds on each water molecule – i.e., considerably less attraction between the molecules in the chain/line/assembly of water molecules in the CNT than in “bulk water” – the water molecules in a 0.8 nm CNT are slightly freer to move than in bulk water: a little more like a gas than like a liquid (i.e., the “vapor-like phase” of the scientific abstract).


Surprisingly, a slight difference in diameter causes quite different arrangements of the water molecules. In CNTs of 1.1-1.2 nm diameter, the water that enters and is ‘lined up’ along the length of the tubes has enough space for more molecules per nm of tube length, and more hydrogen bonding between adjacent molecules than in 0.8 nm tubes. Their limited motion, due to this slightly greater hydrogen bonding, causes them to line up (“stack up”) less freely than in bulk water – almost as though they were in an ice-like structure. Such firmer bonding arrangements compared to bulk water constitute an enthalpy stabilization. Again, although here aided by stronger bond formation rather than only by HB breakage, movement of water molecules into a 1.1-1.2 nm diameter CNT is energetically preferred to their remaining in bulk water.


In CNTs larger than 1.4 nm, relatively fewer molecules – but still a very substantial number – have some of their hydrogen bonds broken as they enter, compared to the smaller CNTs. Therefore, despite the lesser quantity, this decrease in inter-molecular bonding still results in the dispersal of some bond energy to the surroundings – i.e., an increase in the entropy of the ‘universe’ of surroundings plus system – when water molecules enter the larger CNTs.

 

April 2011

I worked hundreds of hours on several Wikipedia articles in 2006 and some in 2007. However, because no one may post his own bio in Wikipedia, it is quite an honor that an important Wikipedia administrator has installed my bio – with the only recent picture that I have, too much of a smile and all. Click on: Wikipedia Frank L. Lambert.

 

July 2010

It was just reported to me that the Dean of a university wrote a blog item in 2009 about my ‘Gutenberg Method’ of teaching, referring back to its being featured by the editor of the Journal of Chemical Education in 1963 (JCE1963.pdf). Below is an excerpt, with corrections and citations added by me.


Wednesday, February 04, 2009

David Eubanks
Dean of Academic Support Services, Johnson C. Smith University

Earlier this week, I wrote about two physics instructors who use vodcasting as a technique to replace traditional lectures with a more engaged classroom experience. I came across another article today (again on Reddit) from the 1960s [correction: 1986], where a Professor Morrison lays out his case for what he calls The Gutenberg Approach. This piece is paired with a letter from 2008 [correction: an editorial from the Journal of Chemical Education of 1963] about the discovery and use of this method, where Frank L. Lambert gives the following summary of the problem as he sees it:

The lecture system was crazy for teaching organic chemistry. What are professors doing in a lecture? They’re outlining and explaining the important points (and wasting time mentioning even obvious points) of the text on the blackboard. But why? Gutenberg invented movable type. That made printed textbooks available 500 years ago — by now, even in chemistry rather than alchemy! Students don’t read them? Of course not, if the whole course is dependent on what the prof puts on a blackboard! Students can’t pick out the most important ideas and facts from a 500-page text (in 1948 — or a thousand-page one now) by themselves. They’re beginners.

The solution to the lecture/textbook problem is summarized by Dr. Lambert thus:

[W]hy not give them something a bit better than the [class] notes on the day or the week before the class – not really an outline of the text but more of a guide to what’s important and what’s not in each day’s text assignment. Then the students could read a day’s assignment and know what to look out for as the key points, realizing that the professor is not going to outline it on the board. Instead, she or he will explain in detail a few complex things in the assigned pages, answer any questions about them, and show how to conquer problems like those in the text – always open to questions and to back and forth with students.

(The discovery of 'The Gutenberg Method' is attached.)

 

Robert T. Morrison, co-author of the leading organic chemistry text for some 30 years, died in April. (We were friends in our graduate school days at the University of Chicago.) If you teach organic chemistry, or have taken a course in it, you should read his lecture against lecturing (mentioned by Dr. Eubanks above) — the funniest ever given seriously at a University of Chicago conference, or anywhere else!

February 2010

The 2nd edition of Burdge’s “Chemistry” was published in late January. It is truly excellent. Not only have all of the previous edition’s references to entropy as “disorder” been eliminated but, far beyond this, the introduction to “what entropy is” is superbly handled. Featured throughout the text are 16 two-page “Visualizing Chemistry” spreads (large drawings/illustrations) of important points, almost all extended to online animations available to student users. The spread in the thermodynamics chapter depicts seven examples of system change that are correlated with entropy change and ‘particles in a box’ energy levels – volume increase, temperature increase, greater molecular mass, lesser or greater structural complexity (the latter with more energetic modes), etc. Most good texts attempt to show such differences via energy levels only for volume expansion and temperature increase. None supplement them with animation.

Further, I am impressed by the wide utility of the text for a large span of students’ competence – specifically regarding my focus on the presentation of entropy. From information I have received from instructors in the classroom, almost all students readily accept the concept of entropy as measuring how widely spread in space the energy of molecules becomes. However, too many students of average ability ‘get stuck’ when microstates are introduced in any meaningful detail.

Burdge does not go into microstates at all – her ‘particle in a box’ energy levels are left at that, as greater spreading out of energy: volume increase, closer levels; temperature increase, more occupancy of higher levels, etc. I think this is optimal, because it can become clear for all levels of ability in a class, especially with Burdge’s visual aids. Then, only an hour or two with those students going on to physical chemistry (linking energy dispersal in space with the increased number of accessible microstates and the Boltzmann equation, and the Boltzmann entropy with the Clausius entropy) is adequate preparation for chemistry majors’ later course.
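(The linkage itself fits in two lines – a standard pairing, stated here in my own words:

    S = k ln W          (Boltzmann: W = number of accessible microstates)
    ΔS = q(rev)/T       (Clausius: energy dispersed, per unit temperature)

Counting how many more microstates become accessible, and measuring how much energy is dispersed per kelvin, are two routes to the same quantity.)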

 

January 2010

Entropy Re-mystified?

I have finally read a book that I was told had become popular, “Entropy Demystified”. It is a 217-page disaster for anyone wanting to understand entropy and the second law. Most of the lengthy evaluations that praise the book on Amazon.com seem to have been written by the author's best friends, several being mature physicists – rather than by persons trying to understand entropy for the first time. The following is more objective than those "reviews", giving the book a rating of “no stars” out of a possible five. (However, Amazon.com, for a reason you might guess, increased that rating to “one star”.)

Fifty years ago, Arieh Ben-Naim, like every student in a physics or chemistry class of that era, was mystified by his introduction to entropy and the second law of thermodynamics. Although he became a professor of chemistry at the Hebrew University of Jerusalem before retiring 15 years ago, Ben-Naim has evidently not kept up with the teaching of those topics in current chemistry texts. Thus, he seems unaware that most general chemistry texts currently published in the US (16 of them), and three in physical chemistry, now clearly and simply present entropy and the second law (check “May 2009” on this website).

Therefore, the 217 pages of “Entropy Demystified” that are necessary to develop his personal viewpoint (an information-theory variant, not present in any US undergraduate chemistry textbook) can be replaced by 3-4 pages in any of the chemistry texts listed on this site at “May 2009”, with their ISBN numbers.

In fact, a conceptual summary of the second law and entropy for all chemistry students and many non-scientists can be abstracted from these texts in two sentences:  “Energy of all types in chemistry changes – if it is not hindered – from being localized in one volume to becoming more dispersed, spread out, distributed in space (and abstractly at one instant, in any one of many more energy quantum states, microstates, than were accessible before the change).”  Then, “entropy change is the quantitative measure of how much more widely distributed the initial energy becomes in a spontaneous process in chemistry.”  Thus, in real processes, energy literally spreads out in space, and abstractly, at each instant, is in one microstate of a maximally probable number of quantum states (microstates) that are consistent with a final macrostate at equilibrium.

Unfortunately, Professor Ben-Naim’s fundamental error, summarized on page 204 but vitiating all previous pages, is his misinterpretation of what happens in real systems of molecules, especially in the simple isothermal expansion of ideal gases or in their mixing. These cases have misled him to focus on the lack of change in the total energy of the system, rather than on what is actually the fundamental cause of all thermodynamic entropy change in chemistry: the increased spreading of the initial energy of actual molecules in space when constraints are removed – e.g., their spontaneously moving into a greater volume from a smaller volume (with unchanged energy) in a process such as expansion or mixing. This is what traditional thermodynamic entropy readily measures and, as just stated, can be readily understood.
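That case is worth a two-line calculation (a standard textbook result; the illustration is my own choice). For n moles of an ideal gas expanding into a vacuum to double its volume, no heat or work is exchanged, so its total energy is unchanged (ΔU = 0); yet

    ΔS = nR ln(2V/V) = nR ln 2 ≈ +5.76 J/K per mole

The entropy increases not because any energy was gained or lost, but because the unchanged energy of the molecules is now spread throughout twice the volume.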

Ben-Naim admits, in italics, the disconnect between information and the second law on page 203 of “Entropy Demystified” by writing that “a measure of information cannot be used to explain the Second Law of Thermodynamics.” This is true, indeed. The connection between the second law and information is tenuous.

Contrast this with the modern view in beginning collegiate chemistry texts, e.g. “whenever a product-favored chemical or physical process occurs, energy becomes more dispersed...This is summarized in the second law of thermodynamics, which states that the total entropy of the universe ... is continually increasing.” (Moore, Stanitski, and Jurs; 3rd edition.)  A popular physical chemistry text that is used world-wide states “...the Second Law of thermodynamics, may also be expressed in terms of another state function, the entropy, S. ...entropy...is a measure of the energy dispersed in a process...” (Atkins and de Paula, 8th edition.)

The connection between the second law, spontaneous chemical reactions or physical processes, dispersal of energy, and entropy is integral, tight, and widely accepted in chemistry books.  It does not require 200 pages for its justification.


August 2009

In “what’s new” for August 2007 I described my article that showed how texts that introduced ‘positional’ (configurational) entropy to students would totally mislead them:  beginners are taught that “matter tends to become dispersed” and that there are two “types” of entropy rather than one.  Equally disastrous to students’ understanding is a focus on the ‘probability’ of molecules’ positions as the sole factor in entropy increase.

[Entropy increase is first enabled by molecular motional energy (rapidly moving or vibrating molecules); only then is entropy increase actualized by the probability of a maximal dispersal/distribution of that energy – in space, within each microstate of a greater number of accessible microstates.]

A far more fundamental article by Professor E. I. Kozliak has just been published in the September issue of the Journal of Chemical Education, “Overcoming Misconceptions about Configurational Entropy in Condensed Phases”.   (He had previously resolved the old problem of incorrectly understanding “residual entropy” as simply due to molecules’ locations in space.)


May 2009

A minority of US general chemistry texts for majors still describe entropy in terms of “disorder” – an unfortunate subjective concept whose source appears to be a naïve statement by Boltzmann (boltzmann.html).  Now, however, most  ‘gen chem’ texts have discarded this non-scientific view and describe both entropy (e.g. standard molar entropy) and entropy change as measuring the result of energy becoming dispersed in physical or chemical processes – literally spreading more widely in space, while abstractly dispersing on additional energy levels in a conventional “particle in a box” diagram of one microstate.  (The latter, of course, then directly implies a greater number of microstates, W, in any final macrostate.)

It was eight years ago that the ms. outlining the above approach was accepted for publication; now, revised and corrected, it is available at this site: entropy_is_simple.

April 2009

There have been some noteworthy improvements in texts’ treatment of entropy in terms of energy dispersal. A few will be mentioned here. In May, I will list the 21 chemistry texts that no longer define entropy as “disorder” but instead emphasize molecular energy dispersal – concretely in space, or abstractly on more energy levels in each microstate – as a useful approach to understanding standard entropy and entropy change.

Physical Chemistry
General Chemistry
General Chemistry for non-majors

March 2009

January 2009

November 2008


August 2008


July 2008

June 2008


April 2008

October 2007


August 2007


June 2007

April 2007

March 2007


November 2006


June 2006

March 2006

January 2006

December 2005

First-year college chemistry textbooks since about 1960 have used the 1898 description of thermodynamic entropy as “disorder”. In the February 2002 issue of the Journal of Chemical Education I showed that treating entropy change as “disorder” was not based on modern science and could mislead students. In the October 2002 Journal I urged that entropy be presented as measuring the quantity of energy dispersed/T, or by the change in the number of microstates.

Textbooks do not alter their presentation of basic concepts readily or rapidly. Thus, for the following 15 texts to delete “entropy is disorder” from their new editions within three years of my calling for such a drastic change is perhaps without precedent. Further, for all of them now to describe the meaning of entropy in various terms of the spreading or dispersing of energy (in some, quantified by Boltzmann's number of microstates) shows the utility of this concept in good teaching.

Textbooks for science majors

  1. Moore, Stanitski, and Jurs' "Chemistry: The Molecular Science", whose first edition was the best-selling new text in a decade, has a 2005 2nd edition (Thomson). The authors state that the new edition is improved because, among other features, the "...treatment of entropy in Chapters 14 and 18 has been rewritten to make it clear that entropy measures dispersal of energy" rather than "disorder". This text most thoroughly and most extensively applies my concept of "follow the energy flow" in aiding students to understand the concept of entropy.
  2. Silberberg, in the 2006 4th edition (McGraw-Hill) of his #1 or #2 best-selling "Chemistry", writes, "[The thermodynamics chapter] has been completely rewritten to reflect a new approach to the coverage of entropy. The vague notion of "disorder" has been replaced with the idea that entropy is related to the dispersal of a system's energy," and acknowledges my advice.
  3. The 3rd edition (Wiley, 2000) of Brady & Senese's "Chemistry" for science majors used "disorder"/order 65 times to describe entropy. However, in the 2005 4th edition Senese told me that "disorder" is entirely omitted. In featuring their improvements for this edition, the authors state "We have changed our approach to presenting Thermodynamics... [by explaining] entropy as a measure of the number of equivalent ways to spread energy through a system."
  4. Oxtoby, Gillis and Nachtrieb's 5th edition of "Principles of Modern Chemistry" (Brooks/Cole) has removed any references to entropy as a measure of "disorder" that appeared in the 4th edition. This text's relating of entropy increase to greater numbers of microstates, as shown by the Boltzmann entropy equation, is perhaps the most thorough in any general chemistry text.
  5. Petrucci, Harwood and Herring in the 8th edition of "General Chemistry: Principles and Modern Applications" (Prentice-Hall) have an unusually readable development of entropy as increasing when there are more microstates among which the energy of a system can be distributed. This is accompanied by a simple introduction to increased density of energy levels (and therefrom, more microstates) when the volume of a gas spontaneously increases.
  6. The 2005 4th edition of Hill, Petrucci, McCreary and Perry's "General Chemistry" (Prentice-Hall) still employs the word "disorder" in referring to entropy change in several places, but it is primarily as a bridge for those students who have heard the expression. Overall, the authors use my approach to entropy change as a dispersal of energy.
  7. The 2005 8th edition of Ebbing and Gammon's "General Chemistry" (Houghton Mifflin) includes some references to "disorder" in their treatment of entropy, but they emphasize that, fundamentally and scientifically, entropy involves energy dispersal as a function of temperature.
  8. Ebbing, Gammon, and Ragsdale's 2006 (Houghton-Mifflin) "Essentials of General Chemistry" (785 pages rather than the 1200 in Ebbing and Gammon) has a similar treatment of entropy to the larger text, an emphasis on energy dispersal as essential to understanding entropy change.
  9. Moog, Spencer and Farrell (Houghton Mifflin) have developed three paperbacks as a novel “Guided Inquiry” technique in areas of physical chemistry. Their 2004 “Thermodynamics” completely omits the references to “disorder” of “messy desks” in a previous trial edition and replaces them with viewing entropy as related to how energy can be spread out in a system.
  10. A new text, “Physical Chemistry for the Life Sciences” by Atkins and de Paula (Freeman, 2006) omits the definition of entropy as disorder that was present in Atkins’ previous general chemistry and physical chemistry textbooks. Repeatedly, the emphasis in describing entropy change is on the dispersal of energy in the process. However**
  11. The novel approach by Bell and his ten collaborators uses simple experiments or thought-experiments of “Investigate This” in developing concepts in “Chemistry: A General Chemistry Project of the American Chemical Society” (Freeman, 2005). Disorder is ignored as a definition or code word for entropy. Rather the student is led to consider arrangements of molecular energy in developing the Boltzmann relation. However**
  12. The new 6th edition of “Chemistry and Chemical Reactivity” by Kotz, Treichel and Weaver (Brooks/Cole, 2006) have deleted their description of entropy increase as disorder that was in previous editions. They state that “spontaneous change results in dispersal of energy”. However**
  13. Although previous editions of Olmsted and Williams’ “Chemistry” had 89 uses of “disorder” vs. “order”, including in the definition of entropy, the 2006 4th edition (Wiley) defines entropy only in terms of energy dispersal. The word “disorder” is rigorously avoided in any context. However**

    **However, the preceding four texts each have the unfortunate concept of “the dispersal of matter” as though there were no motional energy considerations associated with such dispersal (as in gas expansion, or any type of mixing wherein the initial motional energy of the molecules becomes more widely dispersed in space). One even states that “Things tend to become dispersed.” The source of this error is dealt with here.
  14. Brown, LeMay and Bursten's 2003 9th edition (Prentice Hall) defined entropy only as "disorder". In a preliminary ms. of the thermodynamics chapter in their 2006 10th edition, all references to "disorder" were eliminated by one of the authors and the concept of energy dispersing or 'spreading out' more for increased entropy was used throughout. Although the published 10th edition presents energy dispersal as one view of entropy, it includes the "extent of randomness" as an equal one, later stating “Each of these descriptions [of entropy] (randomness, disorder, and energy dispersal) is conceptually helpful if applied correctly.” This ‘trifecta’ is an insurmountable challenge to beginning students, who are readily confused even by a singular presentation of the concept.

Textbooks for non-science majors

  1. The first edition of Suchocki's "Conceptual Chemistry" (Benjamin Cummings) introduced the second law as "Order Tends to Disorder". His 2nd edition (2004) does so as "Entropy Is a Measure of Dispersed Energy"..."This fits with our everyday experience...." Then, with ΔS° of reaction, Suchocki can lead even this group of students to understand the direction of chemical reactions.

 


older news

August 2005


July 2005

November 2004

October 2004

March 2004

2003
