This Entropy section was written by Dr. Lambert for the Thermodynamics chapter of Wikibooks “General Chemistry”

Entropy in General Chemistry

Contents

  1. Introduction
  2. What is entropy, really?
  3. What is entropy good for?
  4. Entropy proves that 'heat' will always flow from a hotter body to a cooler one
  5. What is the change in entropy when a gas expands or gases mix or liquids mix?
  6. Why gases mix. Why liquids mix.
  7. What happens to the entropy of a solvent when an (ideal) solute is dissolved in it?
  8. 'Positional entropy' and thermal entropy are more similar than different!
     Entropy, 'unavailable energy', is available!

Introduction

From the preceding description of the second law of thermodynamics, you already have a good start toward understanding entropy. In this section we will go further by seeing exactly how the basic process of energy becoming dispersed or spread out in spontaneous events is measured by entropy increase.

Some recaps involving the second law plus a few new details will give us a strong foundation. First, in chemistry the energy on which we focus most is the motional energy of molecules that is responsible for the temperature of a substance or a system of chemicals. When that energy is transferred from hotter surroundings to a cooler system, it has been called “heat” energy. However, what is really happening is that faster moving molecules in the surroundings are colliding with the walls of the system and that way some of their energy gets to the molecules of the system and makes them move faster. “Heat” is really the motional energy of molecules being transferred.

Second, our important overall principle is: "Energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so." Entropy (or better, entropy change) is the quantitative measure of that kind of spontaneous process: how much energy has been transferred divided by T, or how widely it has become spread out at a specific temperature.

We will ignore the old error of talking about entropy as “disorder”. That has been deleted from most US university chemistry textbooks (entropysite.com/#whatsnew). Most of these texts also have adopted the description of entropy change as involving energy dispersal that we are using here.

What is entropy, really?

Entropy was first defined and used in 1865, long before the behavior of molecules was understood. Chemists and physicists had no idea that temperature was due to the motional energy of molecules, or that "heat" was actually the transferring of that motional molecular energy from one place to another. Back then, entropy change, ΔS, could only be described in macro terms that could be measured, such as volume or temperature or pressure. The 1865 equation, which is still completely correct, is stated in most modern texts as simply ΔS = q(rev)/T. (And millions of students for almost a century and a half have silently or loudly asked the question, "What does that really mean?" and been frustrated by inadequate explanations!)

Fortunately today, a complete explanation is simple and easy to understand because we can use the old equation but interpret it in modern terms of how molecules are responsible for what is happening. Here is that equation expanded, part by part:
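In symbols, that expansion can be sketched like this (restating each part of the 1865 equation in the molecular terms we have been using):

    ΔS  =  q(rev) / T

    where
      q(rev) = the "heat", i.e., the motional energy of molecules, transferred
               reversibly from the hotter surroundings to the cooler system
      T      = the temperature (in kelvins) at which that transfer takes place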

There…now you know more about "what entropy really is" than most beginning students have known for the past century! Entropy change in a heating process is just a measure of the motional energy of molecules being transferred (reversibly) from hotter surroundings to a cooler system, divided by the T at which it is transferred. What's so mysterious or complicated about that? (Of course, in advanced work after the first year of chemistry, entropy change can be very complex, but that is not our problem now.)

What is entropy good for?

To begin with, see what standard molar entropy means.

Let's look at one aspect of the surprising power of entropy: how quickly it gives us a general idea of how much energy a substance needs to exist at a given temperature compared to other chemicals. If you are now taking a course in chemistry and have a textbook, turn to the table or short list of 'Standard Molar Entropy Values' for elements and compounds. (The standard temperature used in such tables is 298.15 K, and a solid is designated (s), a liquid (l), and a gas (g). The entropies are often stated to several figures in joules/K mol, but let's use whole numbers rather than precise values.)

All textbooks give you generalities, like "The liquid forms of substances have higher standard entropies than the solid, and gases have higher entropies than their corresponding liquid" and "For different substances in the same phase, molecular complexity determines which ones have higher entropy." Great. Some more junk to memorize, right? But why, fundamentally, are entropies higher or lower for various elements and compounds, or in solid, liquid and gas states?

Of course, we have a start on a good answer to that question: it must have something to do with the motional energy (plus any phase change potential energy) in the substance at that temperature. However, we need one more fact about energy and entropy for a complete understanding. At absolute zero, 0 K, all substances are assumed to have zero entropy. Then, when a substance is warmed from 0 K, more and more motional energy is added to it as its temperature increases to the 'standard entropy temperature' of 298.15 K. (From what we have already discussed, a substance such as hydrogen would require additional energy to change phase from a solid near 0 K to a liquid, and then more to change from liquid to gas, before the temperature reaches 298 K.) So it is understandable that a solid like carbon at 298 K (as diamond) has a low standard entropy (S°) of 2 J/K mol, whereas liquid water has a medium S° of 70 J/K mol and gaseous hydrogen has an S° of 131 J/K mol. Liquids and gases have higher entropies because more energy had to be dispersed to them to change the solid to a liquid and then to a gas, and to keep them that way at 298.15 K.

There. You have a pretty good idea of why the standard entropies are larger or smaller for various substances and for their various states, whether solid, liquid, or gas. The larger the number in the tables of standard entropy values, the greater the quantity of energy that had to be dispersed from the surroundings to that substance for it to exist and be stable at 298.15 K rather than at 0 K!

Then, it's obvious why "liquid forms of substances have higher standard entropies than their solid form": the solid had to be given additional energy (the enthalpy of fusion) so the molecules could move more freely, and that entropy of fusion, ΔH(fusion)/T, has to be added to the entropy of the solid substance. No problem. But why does pure carbon in the form of solid graphite have a higher S° (6 J/K mol) than pure carbon in the form of solid diamond (2 J/K mol)? That means that graphite must need energy to move in some way that diamond can't. Aha: diamond is totally rigid, each carbon atom tightly held in a tetrahedral framework. Graphite is totally different; its atoms are held even more tightly within layers, but those layers are like sheets of paper in a pile. The individual atoms can't move readily, yet the sheets of atoms can slide over one another a tiny bit without great difficulty. That sliding requires some energy, and therefore graphite has to have more energy in it than diamond at 298.15 K.
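To put a number on that ΔH(fusion)/T contribution, here is a short Python sketch using the familiar handbook values for water (about 6.0 kJ/mol for the enthalpy of fusion, at the 273.15 K melting point):

    # Entropy of fusion: Delta_S = Delta_H(fusion) / T(melting).
    delta_H_fus = 6010.0   # J/mol, enthalpy of fusion of water (handbook value)
    T_melt = 273.15        # K, normal melting point of ice

    delta_S_fus = delta_H_fus / T_melt
    print(f"Entropy of fusion of water: {delta_S_fus:.1f} J/K mol")  # about 22 J/K mol

That jump of roughly 22 J/K mol on melting is a large part of why the standard entropy of liquid water (70 J/K mol) is so much higher than that of a rigid solid like diamond (2 J/K mol).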

Entropy proves that 'heat' will always flow from a hotter body to a cooler one

Here's a question students have asked for generations, although it doesn't bother some because it just seems like common sense. (Well, it is common experience!) "How can you prove scientifically that 'heat' will always flow from a place that is hotter to one that is cooler?"

Let's use a warm metal bar touching a similar bar that is just barely cooler (both of them isolated from the surroundings so no 'heat' is lost). Designate q for the thermal energy/'motional energy' that will be transferred from the hotter to the cooler bar. (Actually, it is vibrational energy in the hotter bar that will be transferred.) Under these nearly reversible conditions, let us designate the temperature of the hotter bar as T(hot) and that of the slightly cooler bar as T(cool). Because T(hot) is larger than T(cool), q/T(hot) is smaller than q/T(cool). Now, when q is transferred out of the hotter bar, that bar's entropy decreases by q/T(hot), while the cooler bar's entropy increases by q/T(cool). Since q/T(cool) is larger than q/T(hot), the cooler bar has increased in entropy more than the hotter bar has decreased in entropy.
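A short Python sketch makes that bookkeeping concrete (the numbers are illustrative, not from the text: 100 J moving between bars at 300 K and 299 K):

    # Entropy bookkeeping for q joules moving from a hotter bar to a barely cooler one.
    q = 100.0        # J transferred (illustrative value)
    T_hot = 300.0    # K, temperature of the hotter bar (illustrative)
    T_cool = 299.0   # K, temperature of the slightly cooler bar (illustrative)

    dS_hot = -q / T_hot    # the hotter bar loses this much entropy
    dS_cool = q / T_cool   # the cooler bar gains this much entropy

    print(f"Hotter bar:  {dS_hot:+.5f} J/K")
    print(f"Cooler bar:  {dS_cool:+.5f} J/K")
    print(f"Net change:  {dS_hot + dS_cool:+.5f} J/K")  # positive: spontaneous

The net change is positive, however small the temperature difference, which is exactly why the flow in that direction is spontaneous.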

Important note! Considering only the originally cooler bar as the system, the above simple calculation shows that the system has increased in entropy because energy has been transferred to it. However, considering the originally hotter bar as the surroundings and the cooler as the system, that whole small universe of surroundings and system has increased in entropy because of the energy spreading out to the cooler bar as they both reach the same temperature.

[Only for advanced students, and only after reading ahead about the Boltzmann entropy equation: Since the conditions are near reversibility (and, qualitatively, the same entropy increase can be shown at a larger temperature difference), the foregoing conclusions can be generalized as "energy transfer from hot surroundings to a colder system always results in an increase in entropy in the cooler system." Then, because ΔS = k ln W(final)/W(initial), the number of accessible microstates in the cooler bar must have increased. The energy, q, didn't change in the transfer, but it is more dispersed in the cooler bar because there are more accessible microstates for the entire energy of the cooler bar, in any one of which it might be at one instant. Thus, the entropy increase that we see in a spontaneous thermal energy transfer is due to increased energy dispersal in the cooler system, in the universe of surroundings plus system, and (especially helpful to beginners because it is easily visualized) in three-dimensional space. This is a powerful example of the utility of viewing the nature of energy as becoming more dispersed spontaneously, if it is not constrained.]

What is the change in entropy when a gas expands or gases mix or liquids mix?

In all of the preceding examination of entropy, we have been considering — essentially — the change in entropy when energy is transferred from the surroundings to a system, and we found that the traditional entropy equation of q/T aids in understanding why things happen: the less it is hindered, the more the q ‘motional energy’ tends to spread out or disperse. But what does entropy have to do with spontaneous events where there isn’t any transfer of energy from the surroundings? Why do gas molecules spread out into a vacuum? That doesn’t take any energy flowing from the surroundings. Why do gases spontaneously mix and liquids that are somewhat like each other mix easily without any help? Why do so many chemicals dissolve in water?

By now, you can hardly help guessing correctly! All these events must have something to do with energy dispersing or spreading out. The only difference between them and what we have talked about before is that they all involve a system's initial energy spreading out within the system itself. No energy is transferred to or from the surroundings; no heating or cooling at all IF (as we always assume in our elementary approach) the substances involved are ideal gases or liquids or solids whose molecular attractions do not make the results complex.

A first and excellent example is shown in most textbooks as two spherical glass bulbs connected by a stopcock. One bulb has a gas in it and the other is evacuated. Open the stopcock and the gas flows into the evacuated bulb. Then, for a paragraph or even a dozen, some texts make a big deal about how improbable it is that all the gas would flow back into the first bulb or all go into the second bulb. That should sound dumb to you because you know the basic principle: the initial motional energy of the gas will spread out as much as it can, and stay that way, as the speeding molecules continually collide with each other and bounce around everywhere throughout the two bulbs. Molecules with motional energy will disperse in three-dimensional space if they are not constrained, i.e., if they're not hindered by stopcocks or small containers!
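If you do want a number for that improbability, a rough Python sketch is enough (assuming each molecule is equally likely to be in either of two equal bulbs):

    # Probability that all N independent molecules are found in one of two equal bulbs.
    import math

    for N in (10, 100, 6.022e23):          # ten molecules, a hundred, one mole
        log10_p = N * math.log10(0.5)      # log10 of (1/2)**N, to avoid underflow
        print(f"N = {N:.3g}:  probability ~ 10^({log10_p:.3g})")

For ten molecules the chance is about one in a thousand; for a mole of gas it is ten raised to a negative power of about 1.8 x 10^23, i.e., never in the life of the universe.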

The following indented sections are only for honors students. They can be skipped by the majority of those in beginning chemistry because the preceding and following pages provide a superior introduction for starting to understand entropy change in chemistry. However, more and more texts are introducing probability and microstates ineptly or incorrectly and therefore students going on as chemistry majors should be aware of a valid approach to those topics.

Actually, thermodynamic entropy increase in chemistry depends on two factors. It is enabled by the inherent motional energy of molecules above 0 K (which also can arise from bond energy change in a reaction). It is only actualized if a process makes available a larger number of microstates, a maximal probability for the distribution of the molecular motional energy in the final state. Both factors are essential; neither is sufficient by itself (see calpoly_talk.html). In this introduction to entropy for beginning students, I have focused on the first factor and left tacit the presence of the second, the actualization details that have always been implicit in the examples given. Now, let us look at what microstates are, and then at what "maximal probability of the distribution of energy" means.

One way of describing a microstate is that it is like an instantaneous photo of all the molecules in a system that shows each of their positions and their energies. (Then the next photo, equally impossible (!), would be 'snapped' about a trillionth of a trillionth of a second later, when only two molecules have collided and changed their energies. This would be another microstate.) It would require many more than trillions of times trillions of years of taking such photos to show the possible number of microstates in a mole of any liquid or gaseous substance at 298 K. Even at a temperature of around 4 K, there are about 10 raised to an exponent of 26,000,000,000,000,000,000 microstates in any substance. This is an unimaginable number, but it is a reliable calculation by K. S. Pitzer (see "Order to Disorder"). In quantum mechanics, only the energies of the molecules are considered to be distributed on energy levels. This approach is described, with simplified diagrams, here.

In summary, a microstate is one of a huge number of ways in which the energy of the system can be distributed on quantized energy levels. The larger the number of possible (sometimes called "accessible") microstates there are for a system, the greater is the system's entropy. This is how microstates and probability are related to entropy. Mobile, energetic molecules are continually colliding, seeking the most probable situation, exploring a fraction of the gigantic number of microstates that are 'accessible'. If a process, such as expansion into a larger volume, is made available to them, an increase in entropy is actualized, because in a larger volume the energy levels are closer together and so, without any change in the initial motional energy, there are more ways in which that energy can be distributed at any one instant: there are more accessible microstates. It is not that the energy of the system is in more than one microstate at one instant! That is impossible. Rather, a greater dispersal or spreading out of energy after a process like the expansion of a gas or the mixing of fluids means that at any instant there are many, many more choices of possible microstates for the next instant than there were before the change in state.
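A tiny counting exercise shows how spreading energy out opens up more microstates. The model below is a standard toy system (an 'Einstein solid', not discussed in the text above): q identical energy quanta shared among N oscillators can be arranged in W = (q + N - 1)!/(q!(N - 1)!) ways.

    # Counting microstates of a toy 'Einstein solid': q energy quanta among N oscillators.
    from math import comb

    def W(N, q):
        """Number of ways to distribute q quanta among N oscillators."""
        return comb(q + N - 1, q)

    # Two identical 3-oscillator blocks sharing 10 quanta in total:
    print(W(3, 10) * W(3, 0))   # all 10 quanta in one block:  66 microstates
    print(W(3, 5) * W(3, 5))    # 5 quanta in each block:     441 microstates

Note that what is being counted here are distributions of energy, not positions of particles, in keeping with the paragraph above.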

General chemistry textbooks commonly show microstates as fixed positions of three or four molecules and then calculate probabilities to prove how much more probable it is to have the molecules more widely distributed. This is misleading in that it omits why molecules should become "more widely distributed", i.e., that they are always energetic and mobile particles! Further, it is equally misleading to treat microstates in the classical sense of just an arrangement in space. A microstate of molecular energies in quantum mechanics is a distribution of those energies on energy levels, not a static pattern of positions. Molecules obey quantum mechanics, even though we may calculate with classical statistics.

Why gases mix. Why liquids mix.

Just as simple as the reason for a gas expanding into a vacuum is the reason for two ideal gases mixing. You could use the same apparatus as the one for gas expansion. Put some red-orange bromine vapor in one of the bulbs and air (nitrogen plus oxygen) in the other. Soon both bulbs will be pale red-orange, and if you analyzed the composition of both bulbs you would find identical amounts of bromine, oxygen and nitrogen in them. But why? Because each molecule of each kind has a greater volume in which to bounce around if it is permitted to be anywhere in two bulbs rather than one. How obvious! All of the energetic colliding molecules (the 'motional energy' of oxygen, nitrogen, bromine and any other gas or liquid) will spread out if they are not hindered from doing so. (The same is true of molecules in solids, but they can't move appreciably because they are really hindered by being so strongly attracted to their close neighbors!) The more sophisticated, but still easy to understand, way of describing why gases or liquids mix is "there are more ways in which the energy of the system can be distributed in a larger volume than in a smaller volume." Or those 'more ways' can be considered the greater probability of a system being in a state in which its energy is more widely distributed, in the sense of having more microstates. This modern view of 'more ways' fits perfectly with the predictions of a genius, Ludwig Boltzmann, who lived in the late nineteenth century and used "W" for his 'more ways'.

Before much was known about molecules (the greatest scientist of his day didn't believe in them; Boltzmann himself thought that an almost infinite number of molecules might exist in a nearly infinitesimal space), and before speeds or energy levels for atoms or molecules were dreamed of, Boltzmann developed the basic theory that entropy, S, was related to the number of different ways that a system of molecules could achieve a particular total energy. He said that this total of ways was the most probable state for the system, and now we realize that those probable "ways" of distributing molecular energy are equivalent to what we now call quantized 'microstates'. The equation that bears his name was not written (by Planck) until the year of Boltzmann's death in 1906, and there is no evidence that Boltzmann ever saw it or ever used "Boltzmann's constant, k". (That constant was first used by Planck in 1900, but, nobly, he never claimed it should be called "Planck's constant".) The Boltzmann entropy equation is simply S = k ln W, where k is Boltzmann's constant, R/N(A), the gas constant divided by Avogadro's constant.

ΔS = k ln W(final state)/W(initial state) is a most useful form of the Boltzmann entropy equation. Calculations from it most readily yield quantitative results for the kinds of entropy changes that we have just been discussing: volume expansion of gases, mixing of gases or of liquids, and dissolving of solids in solvents. That last type of entropy change is surely one of the most important in general chemistry because it is the cause of colligative effects such as the raising of boiling points and depression of freezing points of solvents, as well as of osmosis. We will not develop the calculations, but it is important to realize that the concepts behind the processes are well supported by seeing entropy change as involving a spreading out of the motional energy of the constituents, the solvent and also the solid.
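For the volume expansion of a gas, the calculation is simple enough to sketch. Doubling the volume makes W(final)/W(initial) = 2^N for N molecules of an ideal gas, so ΔS = k ln 2^N = nR ln 2 for n moles (a standard ideal-gas result, stated here without the derivation):

    # Entropy increase when n moles of an ideal gas double their volume at constant T.
    import math

    R = 8.314        # J/K mol, the gas constant
    n = 1.0          # moles of gas
    V_ratio = 2.0    # final volume / initial volume

    delta_S = n * R * math.log(V_ratio)
    print(f"Delta S = {delta_S:.2f} J/K")   # about 5.76 J/K per mole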

When liquid water and alcohol are added to one another, they spontaneously mix, as do any 'like' (similar to each other in chemical structure) liquids. This is true regardless of volume change, because the mere presence of two substances in the same total volume involves a dispersal of the energy of each throughout that whole mixture, with increased dispersal and greater entropy change as the relative quantities of the two liquids approach a 50:50 mole ratio. This is one of several cases in which the simple view of energy becoming more dispersed when substances are mixed works, i.e., predicts the correct result of spontaneity and of entropy increase. However, the fundamental calculations are moderately complex and involve statistical mechanics in which the solution is considered to be in many 'cells', each cell a different combination of molecules in the ratio of the quantities of the two liquids present. The equation for the entropy increase in the mixture uses the relative molar quantities of the liquids that were mixed.
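The end result of those statistical calculations for an ideal mixture is worth quoting, because it displays the 50:50 maximum just mentioned: per mole of mixture, ΔS(mixing) = -R(x1 ln x1 + x2 ln x2), where x1 and x2 are the mole fractions. A short Python sketch:

    # Ideal entropy of mixing per mole of mixture, largest at a 50:50 mole ratio.
    import math

    R = 8.314  # J/K mol

    def S_mix(x1):
        """Entropy of mixing per mole for mole fractions x1 and x2 = 1 - x1."""
        x2 = 1.0 - x1
        return -R * (x1 * math.log(x1) + x2 * math.log(x2))

    for x1 in (0.1, 0.3, 0.5):
        print(f"x1 = {x1}:  {S_mix(x1):.2f} J/K mol")
    # 0.1 -> 2.70, 0.3 -> 5.08, 0.5 -> 5.76 (the maximum)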

What happens to the entropy of a solvent when an (ideal) solute is dissolved in it?

When even a small amount of solid solute is added to a solvent, just as in the mixtures of liquids above, each solvent molecule (and each solute molecule) in the new solution is more separated from others of its own kind than before, and thus each molecule's motional energy is more spread out or dispersed. The entropy of the solvent and the entropy of the solute each increase in the solution. (The more fundamental reasoning and resulting equations from statistical mechanics are the same as described above for liquid mixtures.) If we realize that a solvent's energy is more dispersed in a solution than in the pure solvent, we can see why a solvent in a solution should be increased in entropy compared to its entropy as a pure solvent. Then also, it is obvious that the entropy increase will be larger the more molecules of solute are added to the solvent. That increased entropy of the solvent in a solution is the cause of the "colligative effects" that we study: (1) osmotic pressure, (2) boiling point elevation, and (3) freezing point depression.

Now, if the solvent tends to stay in a solution (because its energy is more dispersed there), rather than being only with its own kind of molecules in pure solvent, it will stay in that solution if it has a 'choice'! That means: (1) If a membrane is placed between a pure solvent and a solution containing it, and that membrane allows solvent molecules to pass through from the pure-solvent side but blocks the solute molecules, pure solvent will go through the membrane to the solution, because its energy is more spread out there. That's 'osmosis', a very important phenomenon in biochemistry. (2) Solvent molecules in a solution will not leave that solution to become vapor molecules in the atmosphere above it as readily as at the normal boiling point of the pure solvent; a higher temperature is necessary to cause enough molecules to leave the solution to be in equilibrium with the atmospheric pressure and 'boil'. (3) Solvent molecules in a solution will not be in equilibrium with the solid phase (composed of pure solvent molecules) at the normal freezing point; a lower temperature is necessary to reduce the motional energy of the solvent molecules so that they can make intermolecular bonds to form a solid, i.e., be in equilibrium with their molecules in the solid phase. All of these colligative effects increase as the amount of solute in the solvent is increased, because the entropy (i.e., energy dispersal) of the solvent increases in a solution with a greater concentration of solute.
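For effects (2) and (3), the sizes are proportional to how much solute is present; for dilute solutions the familiar textbook relations are ΔT(boiling) = Kb x m and ΔT(freezing) = Kf x m, with m the molality. A sketch using water's well-known constants and an illustrative concentration:

    # Boiling point elevation and freezing point depression for a dilute aqueous solution.
    K_b = 0.512   # K kg/mol, molal boiling point elevation constant of water
    K_f = 1.86    # K kg/mol, molal freezing point depression constant of water
    m = 0.50      # mol of solute per kg of water (illustrative concentration)

    print(f"Boiling point raised by:   {K_b * m:.2f} K")   # about 0.26 K
    print(f"Freezing point lowered by: {K_f * m:.2f} K")   # about 0.93 K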

‘Positional entropy’ and thermal entropy are more similar than different!
Entropy, ‘unavailable energy’, is available!

Two important final notes: (1) about the similarity of all kinds of entropy change — whether “heat” transfer to a system from surroundings (or the converse) or the volume expansion of a system or the mixing of fluids or solid and solvent; and (2) the meaning of “unavailable energy” and “entropy as waste heat”.


entropy.lambert@gmail.com
From Wikibooks “General Chemistry”
Last revised and updated: January 2011

