This edition first published 2019
© 2019 John Wiley & Sons Inc.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Iwao Teraoka to be identified as the author of this work has been asserted in accordance with law.
Registered Office
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
Editorial Office
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
In view of ongoing research, equipment modifications, changes in governmental regulations, and the constant flow of information relating to the use of experimental reagents, equipment, and devices, the reader is urged to review and evaluate the information provided in the package insert or instructions for each chemical, piece of equipment, reagent, or device for, among other things, any changes in the instructions or indication of usage and for added warnings and precautions. While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging‐in‐Publication Data
Names: Teraoka, Iwao, author.
Title: Statistical thermodynamics : basics and applications to chemical
systems / Iwao Teraoka.
Description: First edition. | Hoboken, NJ : John Wiley & Sons, 2019. |
Identifiers: LCCN 2018036497 (print) | LCCN 2019002835 (ebook) | ISBN
9781119375258 (Adobe PDF) | ISBN 9781119375289 (ePub) | ISBN 9781118305119
(hardcover)
Subjects: LCSH: Statistical thermodynamics.
Classification: LCC QC311.5 (ebook) | LCC QC311.5 .T47 2019 (print) | DDC
536/.70727--dc23
LC record available at https://lccn.loc.gov/2018036497
Cover Design: Wiley
Cover Image: Courtesy of Erika Teraoka
This book is dedicated to my wife, Sadae Teraoka. Without her encouragement, I would not have been able to finish it.
This book was born out of my long‐time wish that chemistry and chemical engineering students should learn statistical thermodynamics using a book written for them. Many of the books on that subject were written for physics students, and the contents are not appropriate for those who deal with molecules. There are a few good books that nonphysics students can rely on, but they look old‐fashioned.
Many research papers in chemistry, chemical engineering, materials science, biochemistry, and biophysics are written using the concepts of statistical thermodynamics, whether the authors of the papers are aware of it or not. The papers are mostly about molecules, and that is why molecular‐level description of quantities observed in experiments has a big presence in these areas of science. Statistical mechanics offers the right tool for that purpose.
Admittedly, the concepts of statistical mechanics are not easy to grasp. The fundamental hypotheses are philosophical, and they are translated into equations. By applying the tools of statistical mechanics to different thermodynamic systems and solving practice problems, you will get the hang of the fundamental concepts. All the chapters, except for Chapter 1, have practice problems at the end. It is essential to solve them. You can find the answers at Wiley's web site: www.wiley.com/go/Teraoka_StatsThermodynamics.
One of the axioms in Confucius' famous book, the Analects, translates as follows: If you just learn and do not think, you will be left in the dark. If you just think and do not learn, you are on a precarious footing. My translation may not be authentic, but those who study statistical mechanics should always be reminded of this axiom.
There are different ways to introduce the concepts to early‐stage learners. I like the approach adopted in Atlee Jackson's textbook, Equilibrium Statistical Mechanics [1], in which the partition function is derived elegantly by a simple method. Chapters 2–7 of this book mostly follow Atlee Jackson's book.
Some of the practice problems are borrowed from a Japanese book, Statistical Mechanics [2]. Its English translation is available [3].
The prerequisites for this book are the first semester of undergraduate physical chemistry and some math (calculus and linear algebra). In calculus, series and derivatives are far more important for statistical mechanics than integrals. In linear algebra, 2 × 2 matrices are all that is required. Familiarity with probability theory will be a big help. If you are not strong in math, it is a good idea to go through Appendix A first. Just as thermodynamics is the best place to practice partial differentiation, statistical mechanics is the best place to practice Taylor expansion.
The organization of the chapters in this book is shown here. The arrows indicate prerequisite relationships. Chapters 1–7 cover all the concepts and tools of statistical mechanics. The remaining chapters are applications to chemical systems.
The Appendix is a collection of math formulas necessary for deriving the equations in this book. Physical constants are listed in Symbols and Constants.
This book does not include pairing of single‐strand DNA, liquid crystals, surface phenomena (other than adsorption), and liquid–vapor equilibria. I want to cover them in future revisions.
This book does not cover quantum statistical mechanics, which is most prominently applied to systems of electrons in metals and semiconductors. If you need to learn quantum statistical mechanics, you should read one of the good books written for physics students [4].
The photograph on the cover of this book was taken by Erika Teraoka in the Wiener Zentralfriedhof (Vienna Central Cemetery).
This book is accompanied by a companion website:
www.wiley.com/go/Teraoka_StatsThermodynamics
The website includes answers to the practice problems.
The numbers are from Physicists' Desk Reference [1] and other sources.
Constant | Symbol | Value
Avogadro's number | NA | 6.022 137 × 10²³ mol⁻¹
Boltzmann constant | kB | 1.380 658 × 10⁻²³ J K⁻¹
Vacuum permittivity | ε0 | 8.854 188 × 10⁻¹² F m⁻¹
Gas constant | R | 8.314 510 J K⁻¹ mol⁻¹
Planck constant | h | 6.626 070 × 10⁻³⁴ J s
Speed of light in vacuum | c | 2.997 925 × 10⁸ m s⁻¹
Elementary charge | e | 1.602 177 × 10⁻¹⁹ C
Section 1.1 looks at the similarities and differences between classical thermodynamics and statistical thermodynamics. Then, in Section 1.2, we see several examples of phenomena that are beautifully described by statistical mechanics. Section 1.3 lists practices of notation adopted by this book.
Classical thermodynamics, when applied to a closed system, starts with two fundamental laws. The first law of thermodynamics accounts for a balance of energy:

dU = d′Q + d′W    (1.1)

where the system receives heat d′Q and work d′W to change its internal energy by dU (see Figure 1.1). The prime in "d′" indicates that the quantity may not be a thermodynamic variable, i.e. not expressed as a total derivative. When the volume of the system changes from V to V + dV, d′W = −p dV, where p is the pressure.
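To make the sign convention concrete, here is a minimal numerical sketch (my own illustration, not from the text): for an ideal gas compressed reversibly and isothermally, integrating d′W = −p dV with p = nRT/V gives W = −nRT ln(V2/V1), which is positive for a compression, i.e. the system receives work.

```python
import math

R = 8.314510  # gas constant, J K^-1 mol^-1 (value from the constants table)

def isothermal_work(n, T, V1, V2):
    """Work received by an ideal gas in a reversible isothermal change,
    W = -n R T ln(V2/V1), from integrating d'W = -p dV with p = n R T / V."""
    return -n * R * T * math.log(V2 / V1)

# Compressing 1 mol at 298 K to half its volume:
W = isothermal_work(n=1.0, T=298.0, V1=2.0, V2=1.0)
print(f"W = {W:.0f} J")  # positive: the gas receives work during compression
```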
The second law of thermodynamics expresses d′Q by a thermodynamic variable, but only when the change is reversible:

d′Q = T dS    (1.2)

where T is the temperature. The second law introduces the entropy S.
In classical thermodynamics, we try to find relationships between macroscopic variables, S, T, U, V, and p. The equation of state is one of these relationships. We also learned different measures of energy, specifically, enthalpy H, Helmholtz free energy F, and Gibbs free energy G. These measures of energy are convenient when we consider equilibria under different constraints. For example, at constant T and V, it is F that is minimized when the system is at equilibrium. Certainly, equilibrium always maximizes S of the universe (the system plus its surroundings), but knowing the details of the surroundings is not feasible, nor is it our concern. Rather, we want to focus on the system, although it is the maximization of the entropy of the universe that dictates the equilibrium of the system. The free energy F was devised for that purpose: minimizing F of the system at given T and V is equivalent to maximizing S of the universe. Likewise, G is minimized at equilibrium when the system's temperature and pressure are specified.
As you may recall, classical thermodynamics does not need to assume anything about the composition of the system – whether it is a gas or liquid, what molecules constitute the system, and so on. The system is a continuous medium, and it is uniform at all length scales if it consists of a single phase. In other words, there are no molecules in this view.
Statistical thermodynamics, in contrast, starts with a molecule‐level description of the system – what types of molecules make up the system, whether interactions are present between molecules, and, if they are, how the interaction depends on the distance between molecules, and so on. Furthermore, statistical thermodynamics specifies microscopic states of the molecules, for example, their positions and velocities. If the molecules are monatomic, specifying the positions and velocities may be sufficient for our purposes. When the molecules are diatomic, however, we need to specify the states of rotation and vibration as well. If the molecule is polyatomic, specifying these states becomes more complicated. Even for a system of monatomic molecules, specifying the positions and velocities requires an astronomical number of variables. Typically, the number is close to Avogadro's number. Listing and evaluating all the variables is a daunting task. Fortunately, evaluating thermodynamic variables such as U, F, and G does not require all the details. It is rather the averages of the microscopic variables that count in evaluating the thermodynamic variables, and that is where statistical thermodynamics comes in.
Here, we take a quick look at some of the results of applying statistical thermodynamics to different systems.
Figure 1.2 shows how the molar heat capacity CV/n of a gas consisting of diatomic molecules changes with temperature T. There are two characteristic temperatures Θrot and Θvib (rotation and vibration). Each diatomic molecule has its own Θrot and Θvib, and some of them are listed in Table 1.1.
Table 1.1 Characteristic temperature of rotation, Θrot, and characteristic temperature of vibration, Θvib, for some diatomic molecules.
Molecule | Θrot (K) | Θvib (K) |
H2 | 87.6 | 6331 |
N2 | 2.88 | 3393 |
O2 | 2.08 | 2239 |
The molar heat capacity is (3/2)R at T ≪ Θrot, where R is the gas constant. We see this range only for H2; for other gases, the boiling point is above Θrot. As T increases and surpasses Θrot, CV/n increases to (5/2)R. There is a broad range of temperature that gives a nearly constant value of CV/n before it increases to (7/2)R as T exceeds Θvib. For most diatomic molecules that are gaseous at room temperature (RT), Θrot ≪ RT ≪ Θvib, and that is why a gas of diatomic molecules has CV/n = (5/2)R.
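The room-temperature plateau can be checked numerically. The sketch below (my own illustration; it uses the standard statistical-mechanical results, not derived here, that translation contributes (3/2)R, rotation contributes R at T ≫ Θrot, and a harmonic vibration contributes R x² eˣ/(eˣ − 1)² with x = Θvib/T) evaluates CV/n for N2 at 298 K using Table 1.1:

```python
import math

R = 8.314510  # gas constant, J K^-1 mol^-1

def cv_molar_diatomic(T, theta_vib):
    """Molar heat capacity of a diatomic ideal gas for Theta_rot << T:
    translation (3/2 R) + rotation (R) + harmonic vibration."""
    x = theta_vib / T
    c_vib = R * x**2 * math.exp(x) / (math.exp(x) - 1.0)**2
    return 1.5 * R + R + c_vib

# N2 at room temperature (Theta_vib = 3393 K from Table 1.1):
cv = cv_molar_diatomic(298.0, theta_vib=3393.0)
print(f"CV/n = {cv:.2f} J K^-1 mol^-1 = {cv / R:.3f} R")
```

The vibrational contribution is negligible at 298 K, so CV/n stays very close to (5/2)R, as the text states.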
Figure 1.3 depicts the molar heat capacity CV/n of a molecular solid (nonionic), plotted as a function of temperature T. At low temperatures, CV/n ∼ T³, and it increases to a plateau value of 3R as T increases. Vibration in a lattice (crystal) accounts for this heat capacity. Einstein attempted to explain the heat capacity in his 1907 paper [1]. His statistical model correctly predicted 3R, but not T³. It was Debye who explained the ∼T³ dependence by improving the Einstein model [2].
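As a numerical illustration of the plateau (my own sketch; the Einstein model assigns a single frequency to every lattice vibration, giving CV/n = 3R x² eˣ/(eˣ − 1)² with x = ΘE/T, where the Einstein temperature ΘE used below is a hypothetical value):

```python
import math

R = 8.314510  # gas constant, J K^-1 mol^-1

def cv_einstein(T, theta_E):
    """Einstein-model molar heat capacity of a crystal (3 vibrations per atom)."""
    x = theta_E / T
    return 3.0 * R * x**2 * math.exp(x) / (math.exp(x) - 1.0)**2

theta_E = 200.0  # hypothetical Einstein temperature, K
for T in (20.0, 200.0, 2000.0):
    print(f"T = {T:6.1f} K: CV/n = {cv_einstein(T, theta_E):6.2f} J K^-1 mol^-1")
```

At high T the result approaches 3R, as Einstein's model predicts; at low T it decays exponentially rather than as T³, which is the shortcoming Debye fixed.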
Anything with a temperature T > 0 radiates. A blackbody is a perfect emitter of the radiation (light) and also a perfect absorber. The radiation has different wavelength components and is not visible unless the wavelength falls in the visible range, 450–750 nm. When the radiation intensity is plotted as a function of wavelength, the curve peaks at some wavelength λpeak (see Figure 1.4). With increasing T, λpeak moves to a shorter wavelength, and the peak intensity increases. Stars exhibit different colors because of temperature differences. The radiation from the sun peaks at around 500 nm (blue–green), since its surface temperature is around 5800 K. A red star has a lower temperature, and a white star (λpeak ≅ 300 nm) has a higher temperature.
λpeak decreases as ∼T⁻¹ as T increases, which is called Wien's displacement law, discovered in 1893. The profile of the spectrum has tails at both ends. The long‐wavelength tail follows ∼λ⁻⁴, and the short‐wavelength tail ∼e^(−const./λ). Classical electromagnetism explained the long‐wavelength tail, but it could not explain the short‐wavelength tail or Wien's law. In 1900, Max Planck proposed a photon hypothesis – light consists of energy particles called photons, each carrying energy inversely proportional to λ [3]. He succeeded in explaining the whole radiation spectrum.
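The ∼T⁻¹ dependence can be written λpeak = b/T, where b ≈ 2.898 × 10⁻³ m K is the Wien displacement constant. A quick numerical check of the stellar examples above:

```python
WIEN_B = 2.898e-3  # Wien displacement constant, m K

def peak_wavelength_nm(T):
    """Peak wavelength of blackbody radiation: lambda_peak = b / T, in nm."""
    return WIEN_B / T * 1e9

# The sun's surface (~5800 K) should peak near 500 nm:
print(f"Sun (5800 K): lambda_peak = {peak_wavelength_nm(5800.0):.0f} nm")
# A white star with lambda_peak ~ 300 nm corresponds to a hotter surface:
print(f"lambda_peak = 300 nm implies T = {WIEN_B / 300e-9:.0f} K")
```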
This example is more chemical than are the preceding examples. When a clean surface (glass, graphite, etc.) is exposed to a vapor, some molecules adsorb onto the surface (Figure 1.5a). The surface coverage θ (fraction of surface covered with the molecules) increases with an increasing partial pressure p of the vapor (see Figure 1.5b). The plot is called an adsorption isotherm, as it is taken at a constant temperature.
The adsorption phenomenon can be explained by reaction kinetics, but statistical thermodynamics provides a molecular‐level description of the isotherm. For example, we can estimate, from the experimentally obtained isotherm, the cohesive energy of adsorption per molecule.
A polypeptide is a polymer of identical amino acid residues. For example, poly(L‐lysine) is a polymer of L‐lysine. The polypeptide adopts a helix conformation, a random‐coil conformation, or a mixture of the two (part of the polymer chain is in the helix conformation); see Figure 1.6a. Which conformation the polymer takes depends on the environment, such as temperature and solvent. Figure 1.6b is a sketch of percent helix as a function of temperature. The polypeptide here is the benzyl ester of poly(glutamic acid), and is therefore soluble in a polar organic solvent. At low temperatures, nearly all of the polymer is in the coil conformation; it gradually changes to an all‐helix conformation as the temperature rises. It may appear counterintuitive that the ordered state of the polypeptide is seen at high temperatures rather than at low temperatures.
The helix conformation is made possible by hydrogen bonds between a donor (H atom in an amide bond) and an acceptor (O atom in the amide bond) several residues away along the chain. If there is a mechanism that supersedes this intrachain hydrogen bond, then the chain may adopt a coil conformation. The latter can occur if the solvent molecules provide a stronger hydrogen bond to the H atoms and O atoms of the amide bonds. It is a competition between the two types of hydrogen bonds that gives rise to the inverted temperature dependence of the percent helix.
You may have learned about the Boltzmann factor, e−ΔE/(RT), where ΔE is the energy difference per mole, without being shown its derivation. The Boltzmann factor appears in many different situations. It appears, for example, in the barometric formula, p(h) = p(0)e−Mgh/(RT), where p(h) is the pressure at altitude h, M is the molar mass of the gas, and g is the acceleration due to gravity. The Debye–Hückel theory for electrolyte solutions and the Gouy–Chapman theory for colloidal suspensions also use the Boltzmann factor. In nuclear magnetic resonance (NMR), the population of an up spin (+½) of a magnetic dipole μ in a magnetic field B relative to the down spin (−½) is exp(NAμB/RT), where NA is Avogadro's number.
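As a numerical illustration of the barometric formula (my own example; the molar mass of air, the temperature, and the altitude are assumed values):

```python
import math

R = 8.314510  # gas constant, J K^-1 mol^-1
g = 9.81      # acceleration due to gravity, m s^-2

def pressure_ratio(h, M, T):
    """Barometric formula for an isothermal atmosphere:
    p(h)/p(0) = exp(-M g h / (R T))."""
    return math.exp(-M * g * h / (R * T))

# Air (M ~ 0.029 kg/mol) at 288 K, at an altitude of 8848 m:
ratio = pressure_ratio(8848.0, M=0.029, T=288.0)
print(f"p(8848 m)/p(0) = {ratio:.2f}")
```

The pressure drops to roughly a third of its sea-level value, which is why supplemental oxygen is needed at such altitudes.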
Statistical thermodynamics derives the Boltzmann factor from fundamental hypotheses. We learn the hypotheses and the derivation in Chapter 4.
This section lists some practices used in this book.
One practice is to distinguish a limit from an asymptote. For example, the molar heat capacity CV/n of a molecular solid has a low‐temperature limit of zero, but its low‐temperature asymptote is CV/n ∼ T³. The high‐temperature limit of CV/n is 3R.
Let us consider the following function f(x) as a practical example.
The small‐x limit is ln 2. There is no large‐x limit, as f(x) → ∞ as x → ∞. However, we can get the large‐x asymptote:
Figure 1.7 shows a plot of y = f(x) and its large‐x asymptote. The plot of y = f(x) runs close to the asymptote, which is a straight line.
The leading term of the large‐x asymptote is x, but including the constant makes the asymptote a better approximation. The asymptote, Eq. (1.4), neglects O(e−x).
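Since Eq. (1.3) is not reproduced here, the sketch below uses a stand-in function with the same qualitative features (a hypothetical substitute, not necessarily the book's choice): f(x) = ln(1 + eˣ). Its value at x = 0 is ln 2, it has no large-x limit since f(x) → ∞, and its large-x asymptote is y = x, approached with an error of O(e⁻ˣ):

```python
import math

def f(x):
    """Stand-in example function: f(x) = ln(1 + e^x)."""
    return math.log(1.0 + math.exp(x))

print(f"f(0) = {f(0.0):.6f}, ln 2 = {math.log(2.0):.6f}")
for x in (2.0, 5.0, 10.0):
    # The gap between f(x) and the asymptote y = x shrinks like e^-x:
    print(f"x = {x:4.1f}: f(x) - x = {f(x) - x:.2e}, e^-x = {math.exp(-x):.2e}")
```

Plotting f(x) against the line y = x reproduces the behavior of Figure 1.7: the curve hugs the asymptote once e⁻ˣ becomes negligible.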