
Lect15 - web.ics.purdue.edu



Lecture 15 PHYS 416 Thursday Oct 14 Fall 2021

• Comments/questions on MidTerm exam
• Review Quiz 5
• Address several issues with 3.16 Taste, smell, & µ
• More on 5.5 Pressure-volume diagram
• 5.2.1 Comment on osmotic pressure
• 5.2.2 Residual entropy of glasses
• 5.3 Entropy as ignorance
• Quiz 6





(5.4) Black hole thermodynamics. (Astrophysics) ©3
Astrophysicists have long studied black holes: the end state of massive stars which are too heavy to support themselves under gravity (see Exercise 7.16). As the matter continues to fall into the center, eventually the escape velocity reaches the speed of light. After this point, the in-falling matter cannot ever communicate information back to the outside. A black hole of mass M has radius[45]

Rs = 2GM/c²,  (5.37)

where G = 6.67 × 10⁻⁸ cm³/(g s²) is the gravitational constant, and c = 3 × 10¹⁰ cm/s is the speed of light.
Hawking, by combining methods from quantum mechanics and general relativity, calculated the emission of radiation from a black hole.[46] He found a wonderful result: black holes emit perfect black-body radiation at a temperature

Tbh = ℏc³/(8πGM kB).  (5.38)

According to Einstein's theory, the energy of the black hole is E = Mc².
(a) Calculate the specific heat of the black hole.
The specific heat of a black hole is negative. That is, it gets cooler as you add energy to it. In a bulk material, this would lead to an instability; the cold regions would suck in more heat and get colder. A population of black holes is unstable; the larger ones will eat the smaller ones.[47]

(b) Calculate the entropy of the black hole, by using the definition of temperature 1/T = ∂S/∂E and assuming the entropy is zero at mass M = 0. Express your result in terms of the surface area A = 4πRs², measured in units of the Planck length L* = √(ℏG/c³) squared.

As it happens, Bekenstein had deduced this formula for the entropy somewhat earlier, by thinking about analogies between thermodynamics, information theory, and statistical mechanics. On the one hand, when black holes interact or change charge and angular momentum, one can prove in classical general relativity that the area can only increase. So it made sense to assume that the entropy was somehow proportional to the area. He then recognized that if you had some waste material of high entropy to dispose of, you could ship it into a black hole and never worry about it again. Indeed, given that the entropy represents your lack of knowledge about a system, once matter goes into a black hole one can say that our knowledge about it completely vanishes.[48] (More specifically, the entropy of a black hole represents the inaccessibility of all information about what it was built out of.) By carefully dropping various physical systems into a black hole (theoretically) and measuring the area increase compared to the entropy increase, he was able to deduce these formulæ purely from statistical mechanics.
We can use these results to provide a fundamental bound on memory storage.
(c) Calculate the maximum number of bits that can be stored in a sphere of radius one centimeter.
Finally, in perhaps string theory's first physical prediction, your formula for the entropy (part (b)) was derived microscopically for a certain type of black hole.
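As a sanity check on part (c), here is a minimal numerical sketch (mine, not the exercise's official solution). It assumes the Bekenstein–Hawking result S = kB A/(4L*²) as the answer to part (b), counts kB log 2 of entropy per bit, and uses the CGS constants quoted above.

```python
# Hedged sketch: black hole numbers from eqns (5.37)-(5.38) and the
# assumed part-(b) entropy S = kB * A / (4 L*^2).  CGS units throughout.
import math

G = 6.67e-8        # gravitational constant, cm^3 / (g s^2)
c = 3e10           # speed of light, cm/s
hbar = 1.05e-27    # reduced Planck constant, erg s
kB = 1.38e-16      # Boltzmann constant, erg/K

def schwarzschild_radius(M):
    """R_s = 2GM/c^2 (eqn 5.37); M in grams, result in cm."""
    return 2 * G * M / c**2

def hawking_temperature(M):
    """T_bh = hbar c^3 / (8 pi G M kB) (eqn 5.38), in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * M * kB)

# Part (c): maximum bits in a sphere of radius 1 cm.
# Each bit carries kB log 2 of entropy, so N_bits = A / (4 L*^2 log 2).
L_star_sq = hbar * G / c**3            # Planck length squared, cm^2
A = 4 * math.pi * 1.0**2               # area of a 1 cm sphere, cm^2
bits = A / (4 * L_star_sq) / math.log(2)
print(f"max bits in a 1 cm sphere: {bits:.2e}")   # ~1.7e66 bits
```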

(5.5) Pressure–volume diagram. (Thermodynamics) ©2
A monatomic ideal gas in a piston is cycled around the path in the P–V diagram in Fig. 5.15. Leg a cools at constant volume by connecting to a heat bath at Tc; leg b heats at constant pressure by connecting to a heat bath at Th; leg c compresses at constant temperature while remaining connected to the bath at Th.
Which of the following six statements are true?
(T) (F) The cycle is reversible; no net entropy is created in the Universe.
(T) (F) The cycle acts as a refrigerator, using work from the piston to draw energy from the cold bath into the hot bath, cooling the cold bath.

[45] This is the Schwarzschild radius of the event horizon for a black hole with no angular momentum or charge.
[46] Nothing can leave a black hole; the radiation comes from vacuum fluctuations just outside the black hole that emit particles.
[47] A thermally insulated glass of ice water also has a negative specific heat. The surface tension at the curved ice surface will decrease the coexistence temperature a slight amount (see Section 11.3); the more heat one adds, the smaller the ice cube, the larger the curvature, and the lower the resulting temperature [131].
[48] Except for the mass, angular momentum, and charge. This suggests that baryon number, for example, is not conserved in quantum gravity. It has been commented that when the baryons all disappear, it will be hard for Dyson to build his progeny out of electrons and neutrinos (Exercise 5.1).



Fig. 5.15 P–V diagram. [Figure: the three-legged cycle in the P–V plane; the axes span P0 to 4P0 and V0 to 4V0, with legs a, b, and c, and leg c running along the Th isotherm.]

(T) (F) The cycle acts as an engine, transferring heat from the hot bath to the cold bath and doing positive net work on the outside world.
(T) (F) The work done per cycle has magnitude |W| = P0V0 |4 log 4 − 3|.
(T) (F) The heat transferred into the cold bath, Qc, has magnitude |Qc| = (9/2)P0V0.
(T) (F) The heat transferred from the hot bath, Qh, plus the net work W done by the piston onto the gas, equals the heat Qc transferred into the cold bath.
Related formulæ: PV = NkBT; U = (3/2)NkBT; ΔS = Q/T; W = −∫P dV; ΔU = Q + W. Notice that the signs of the various terms depend on convention (heat flow out vs. heat flow in); you should work out the signs on physical grounds.

(5.6) Carnot refrigerator. (Thermodynamics) ©2
Our refrigerator is about 2 m × 1 m × 1 m, and has insulation about 3 cm thick. The insulation is probably polyurethane, which has a thermal conductivity of about 0.02 W/m K. Assume that the refrigerator interior is at 270 K, and the room is at 300 K.
(a) How many watts of energy leak from our refrigerator through this insulation?
Our refrigerator runs at 120 V, and draws a maximum of 4.75 amps. The compressor motor turns on every once in a while for a few minutes.
(b) Suppose (i) we do not open the refrigerator door, (ii) the thermal losses are dominated by the leakage through the foam and not through the seals around the doors, and (iii) the refrigerator runs as a perfectly efficient Carnot cycle. How much power on average will our refrigerator need to operate? What fraction of the time will the motor run?
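A minimal numerical sketch of the arithmetic involved (my own modeling choices, not the exercise's: a simple slab model for the foam, with the total wall area of the 2 m × 1 m × 1 m box taken as about 10 m²):

```python
# Hedged sketch for Exercise 5.6, assuming a slab model for the insulation
# and an ideal Carnot coefficient of performance COP = Tc / (Th - Tc).
kappa = 0.02                 # thermal conductivity of polyurethane, W/(m K)
d = 0.03                     # insulation thickness, m
A = 2 * (2*1 + 2*1 + 1*1)    # surface area of a 2m x 1m x 1m box = 10 m^2
Tc, Th = 270.0, 300.0        # interior and room temperatures, K

# (a) steady-state heat leak through the foam
Q_leak = kappa * A * (Th - Tc) / d
print(f"heat leak: {Q_leak:.0f} W")                 # ~200 W

# (b) average electrical power and duty cycle for a Carnot refrigerator
cop = Tc / (Th - Tc)                                # = 9
P_avg = Q_leak / cop                                # ~22 W
P_max = 120 * 4.75                                  # 570 W while running
print(f"average power: {P_avg:.1f} W, duty cycle: {P_avg / P_max:.1%}")
```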

(5.7) Does entropy increase? (Mathematics) ©3
The second law of thermodynamics says that entropy always increases. Perversely, we can show that in an isolated system, no matter what non-equilibrium condition it starts in, entropy calculated with a complete microscopic description stays constant in time.
Liouville's theorem tells us that the total derivative of the probability density is zero; following the trajectory of a system, the local probability density never changes. The equilibrium states have probability densities that only depend on energy and number. Something is wrong; if the probability density starts non-uniform, how can it become uniform?
Show

∂f(ρ)/∂t = −∇ · [f(ρ)V] = −Σα [ ∂/∂pα (f(ρ) ṗα) + ∂/∂qα (f(ρ) q̇α) ],

where f is any function and V = (Ṗ, Q̇) is the 6N-dimensional velocity in phase space. Hence (by Gauss's theorem in 6N dimensions), show ∫ (∂f(ρ)/∂t) dP dQ = 0, assuming that the probability density vanishes at large momenta and positions and f(0) = 0. Show, thus, that the entropy S = ∫ −kB ρ log ρ is constant in time.
We will see that the quantum version of the entropy is also constant for a Hamiltonian system in Exercise 7.4. Some deep truths are not microscopic; the fact that entropy increases is an emergent property.
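For reference, here is a minimal sketch of the first identity (my own working, assuming Hamilton's equations q̇α = ∂H/∂pα and ṗα = −∂H/∂qα, which make the phase-space flow incompressible):

```latex
\begin{align*}
\nabla\cdot\mathbf{V}
  &= \sum_\alpha\left(\frac{\partial \dot q_\alpha}{\partial q_\alpha}
      + \frac{\partial \dot p_\alpha}{\partial p_\alpha}\right)
   = \sum_\alpha\left(\frac{\partial^2 H}{\partial q_\alpha\,\partial p_\alpha}
      - \frac{\partial^2 H}{\partial p_\alpha\,\partial q_\alpha}\right) = 0,\\
\frac{\partial f(\rho)}{\partial t}
  &= f'(\rho)\,\frac{\partial \rho}{\partial t}
   = -f'(\rho)\,\mathbf{V}\cdot\nabla\rho
   \qquad\text{(Liouville: } \mathrm{d}\rho/\mathrm{d}t = 0\text{)}\\
  &= -\mathbf{V}\cdot\nabla f(\rho)
   = -\nabla\cdot\bigl[f(\rho)\,\mathbf{V}\bigr]
   \qquad\text{(since } \nabla\cdot\mathbf{V} = 0\text{)}.
\end{align*}
```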

(5.8) The Arnol'd cat map. (Mathematics, Dynamical systems) ©3
Why do we suppose equilibrium systems uniformly cover the energy surface? Chaotic motion has sensitive dependence on initial conditions; regions on the energy surface get stretched into thin ribbons that twist and fold in complicated patterns, losing information about the initial conditions and leaving many systems with a uniform distribution of probability over all accessible states. Since energy is the only thing we know is conserved, we average over the energy surface.
Arnol'd developed a simple illustration of this stretching and folding using a function taking a two-dimensional square into itself (called the "cat map", Figure 5.16). Liouville's theorem will tell us that Hamiltonian dynamics pre-

Leg    | Q1 into gas     | Q2 out of gas | Work done by gas | ΔEgas       | ΔSgas
a      | 0               | (9/2)P0V0     | 0                | −(9/2)P0V0  | −(3/2)NkB log 4
b      | (15/2)P0V0      | 0             | 3P0V0            | (9/2)P0V0   | (5/2)NkB log 4
c      | −(4 log 4)P0V0  | 0             | −(4 log 4)P0V0   | 0           | −NkB log 4
Total  |                 |               |                  | 0           | 0
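A minimal numerical check of this table (my own sketch, in units where P0 = V0 = 1, so NkBTh = 4 and NkBTc = 1 from PV = NkBT at the cycle's corners):

```python
# Hedged sketch verifying the per-leg bookkeeping for the cycle of Fig. 5.15.
import math

log4 = math.log(4)

# Leg a: constant V, cool Th -> Tc.  W = 0, Q = dE = (3/2)NkB(Tc - Th).
Qa, Wa, dEa, dSa = -4.5, 0.0, -4.5, -1.5 * log4
# Leg b: constant P, heat Tc -> Th.  W = P dV = 3, dE = 4.5, Q = dE + W.
Qb, Wb, dEb, dSb = 7.5, 3.0, 4.5, 2.5 * log4
# Leg c: isothermal compression 4V0 -> V0.  dE = 0, Q = W = NkB*Th*log(1/4).
Qc, Wc, dEc, dSc = -4 * log4, -4 * log4, 0.0, -log4

assert abs(dEa + dEb + dEc) < 1e-12        # energy returns to its start value
assert abs(dSa + dSb + dSc) < 1e-12        # gas entropy returns to its start value
print("net work by gas:", Wa + Wb + Wc)    # 3 - 4 log 4 < 0: net work done ON gas
print("net heat into gas:", Qa + Qb + Qc)  # equals net work by the gas, as it must
```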


5.2.1 Comment on osmotic pressure

Consider the white and black atoms in the figure. They exert equal and opposite pressure on the barrier.

Assume now that this is a semi-permeable barrier, one that allows diffusion of white atoms but not of black atoms.

At equilibrium, equal numbers of white atoms appear on either side of the barrier. The white atoms therefore exert no net pressure on the barrier.

The black atoms still exert their original pressure on the barrier, but it is no longer balanced by the white atoms.

As a result of the change in entropy of the white atoms, a large pressure emerges on the barrier. This pressure can do work.
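To put a number on "large", here is a back-of-envelope sketch (my own, with illustrative numbers not from the lecture), treating the trapped black atoms as an ideal gas, in the spirit of the van 't Hoff relation:

```python
# Hedged sketch: the unbalanced (osmotic) pressure on the semi-permeable
# barrier, estimated from the ideal gas law for the trapped black atoms.
kB = 1.38e-23          # J/K
T = 300.0              # K
N_black = 3e22         # black atoms confined to one side (illustrative)
V = 1e-3               # that side's volume, m^3 (illustrative)

# Once the white atoms equilibrate across the barrier they push equally
# from both sides; only the black atoms' pressure is unbalanced.
P_osmotic = N_black * kB * T / V
print(f"osmotic pressure ~ {P_osmotic:.2e} Pa")   # ~1.2e5 Pa, about 1 atm
```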



5.2.1 Entropy of mixing: Maxwell's demon and osmotic pressure

Scrambling an egg is a standard example of irreversibility; you cannot re-separate the yolk from the white. A model for scrambling is given in Figs 5.4 and 5.5: the mixing of two different types of particles. Here the entropy change upon mixing is a measure of increased disorder.

Fig. 5.4 Unmixed atoms. The pre-mixed state: N/2 white atoms on one side, N/2 black atoms on the other, each in a volume V.

Fig. 5.5 Mixed atoms. The mixed state: N/2 white atoms and N/2 black atoms scattered through the volume 2V.

Consider a volume separated by a partition into two equal volumes of volume V. There are N/2 undistinguished ideal gas white atoms on one side of the partition, and N/2 undistinguished ideal gas black atoms on the other side. The configurational entropy of this system (eqn 3.55, ignoring the momentum space parts) is

Sunmixed = 2kB log[V^(N/2)/(N/2)!],  (5.14)

just twice the configurational entropy of N/2 undistinguished atoms in a volume V. We assume that the black and white atoms have the same masses and the same total energy. Now consider the entropy change when the partition is removed, and the two sets of atoms are allowed to mix. Because the temperatures and pressures from both sides are equal, removing the partition does not involve any irreversible sound emission or heat transfer; any entropy change is due to the mixing of the white and black atoms. In the desegregated state,[14] the entropy has increased

[14] This has no social policy implications; the entropy of mixing for a few billion humans would not power an eye blink.

to

Smixed = 2kB log[(2V)^(N/2)/(N/2)!],  (5.15)

twice the entropy of N/2 undistinguished atoms in a volume 2V. Since log(2^m x) = m log 2 + log x, the change in entropy due to the mixing is

ΔSmixing = Smixed − Sunmixed = kB log 2^N = NkB log 2.  (5.16)

We gain kB log 2 in entropy every time we place an atom into one of two boxes without looking which box we chose. More generally, we might define a counting entropy:

Scounting = kB log(number of configurations) (5.17)

for systems with a discrete number of equally-likely configurations.
This kind of discrete choice arises often in statistical mechanics. In equilibrium quantum mechanics (for a finite system) the states are quantized; so adding a new (non-interacting) particle into one of m degenerate states adds kB log m to the entropy. In communications theory (Section 5.3.2, Exercises 5.14 and 5.15), each bit transmitted down your channel can be in one of two states, so a random stream of bits of length N has ΔS = kS N log 2.[15]
[15] The Shannon constant kS is defined in Section 5.3.2.
In more general cases, the states available to one particle depend strongly on the configurations of the other particles. Nonetheless, the equilibrium entropy still measures the logarithm of the number of different states that the total system could be in. For example, our equilibrium statistical mechanics entropy Sequil(E) = kB log(Ω(E)) (eqn 3.25)
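A quick numerical check of the mixing result (my own sketch, with kB = 1): eqns (5.14)–(5.15) really do differ by exactly N log 2, because the factorials cancel.

```python
# Hedged sketch verifying Delta S_mixing = N kB log 2 (eqn 5.16) directly
# from the configurational entropies (5.14)-(5.15), with kB = 1.
import math

def config_entropy(V, n):
    """kB log(V^n / n!) for n undistinguished atoms in volume V (kB = 1)."""
    return n * math.log(V) - math.lgamma(n + 1)   # lgamma(n+1) = log(n!)

N, V = 1000, 1.0
S_unmixed = 2 * config_entropy(V, N // 2)         # two halves, volume V each
S_mixed = 2 * config_entropy(2 * V, N // 2)       # both species fill 2V
print(S_mixed - S_unmixed, N * math.log(2))       # identical: N log 2
```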


5.2.2 Residual entropy of glasses

The difference in entropy between the liquid state and the glass state:

Sresidual = Sliquid(Tℓ) − ∫ (1/T)(dQ/dt) dt,

where the integral follows the measured heat flow dQ/dt as the liquid is cooled from Tℓ down toward T = 0.

• What is a glass?
• How is the residual glass entropy measured?
• How big is the residual entropy?
• How is it possible to measure the number of glass configurations the system did not choose?

How exactly is the statistical mechanics definition of entropy related to the thermodynamics definition?

Sequil(E) = kB log(Ω(E))    ΔSthermo = Q/T


In a glass, local units have closely-spaced two-level systems. We denote the energy splitting by δi.

As the system is cooled, at some point the local systems become randomly trapped in one level or the other. For δi >> kBT, they will be in the lower level. For δi << kBT they will be random. Each of these random units will contribute kB log 2 to the residual entropy.

Some energy gets trapped in systems stuck in the higher level, so not all of the energy leaves, and not all of the entropy departs.
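To set the scale (my own back-of-envelope sketch, not from the slides): if each local unit in a mole of such units freezes in one bit of randomness, the residual entropy is of order R log 2.

```python
# Hedged order-of-magnitude sketch: one frozen-in bit (kB log 2) per local
# two-level unit, for a mole of units.
import math

kB = 1.38e-23        # J/K
N_A = 6.022e23       # Avogadro's number, 1/mol
S_residual_per_mole = N_A * kB * math.log(2)
print(f"{S_residual_per_mole:.2f} J/(mol K)")    # ~5.76 J/(mol K) = R log 2
```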



The temperature associated with the non-equilibrium systems is ~δi/kB.

This means the entropy that fails to flow out of the glass due to trapping in the non-equilibrium configurations is:

ΔS ∼ ΔQ/T ∼ δi / (δi/kB) = kB

This gives us an equivalence between the thermodynamics definition of entropy and the statistical mechanics definition.


Question:

The entropy of a system is higher if there are more possible configurations it might be in.

If it were possible to do a measurement on a glass that revealed the exact state of every atom, completely specifying the configuration, would the entropy decrease?


5.3 Entropy as ignorance: information and memory

5.3.1 Non-equilibrium entropy

The fundamental hypothesis of statistical mechanics is that all Ω possible states of a system are equally likely. BUT, this only applies in equilibrium, and is not necessarily true for non-equilibrium systems. If you wait a long time after dropping an ice cube into hot tea, you will find a cup of warm tea, in thermodynamic equilibrium. Right after you dropped it, however, states with lower energy in the region of the ice cube are much more likely. We will now consider entropy for non-equilibrium systems, where all states are NOT equally likely.


Consider a system with M equally likely discrete states. The probability of being in any one of them is pi = 1/M, and the entropy is just S(M) = kB log M.

Rewrite this as: S(M) = −kB log(1/M) = −kB log(pi).

Now allow the probabilities to be different. The entropy will depend on the mean value of log pi, averaged over all states:

Sdiscrete = −kB ⟨log pi⟩ = −kB Σi pi log pi

Scontinuum = −kB ⟨log ρ⟩ = −kB ∫ ρ log ρ
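A minimal sketch of the discrete formula (mine, with kB = 1 for convenience): for equal probabilities it reduces to log M, matching the equilibrium counting entropy, and any non-uniform distribution gives less.

```python
# Hedged sketch of S_discrete = -kB * sum_i p_i log p_i, with kB = 1.
import math

def entropy(p):
    """Gibbs entropy -sum p_i log p_i (kB = 1); the 0 log 0 terms are 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

M = 8
print(entropy([1 / M] * M), math.log(M))   # equal probabilities: both log 8
print(entropy([0.7, 0.2, 0.1, 0.0]))       # non-uniform: less than log 4
```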


5.3.2 Information entropy

We will argue that entropy should be a measure of our ignorance. For example, if we know everything about the position and motion of every particle in a closed system, the entropy is zero (no ignorance).

For information entropy, we do not need physical temperature, so we replace the Boltzmann constant by kS=1/log(2):

SS = −kS Σi pi log pi = −(1/log 2) Σi pi log pi = −Σi pi log2 pi

Note: this entropy equals the number of bits: for a random string of N bits, SS = log2(2^N) = N log2 2 = N.
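A quick check of that bit-counting claim (my own sketch): the 2^N equally likely states of a random N-bit string give exactly N bits of entropy.

```python
# Hedged sketch: with kS = 1/log 2, the entropy is measured in bits.
import math

def shannon_bits(p):
    """S_S = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

N = 10
uniform = [1 / 2**N] * 2**N     # all 2^N bit strings equally likely
print(shannon_bits(uniform))    # = N = 10 bits
```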


Applied to information, this is called Shannon entropy.

Question: Is entropy a property of a system, or is it a property of you and your knowledge about the system?

Example:

Your room-mate lost their keys and is asking for your advice. Measure your room-mate’s progress in finding the keys by measuring your ignorance function S.

All of the places the keys could be are indexed A1, A2, …, Ak, …, AΩ. (The number of sites Ak equals Ω.) It turns out there are three essential properties of your ignorance function S:



(1) Entropy is a maximum for equal probabilities.

If all locations Ak are equally likely, you have no idea where to start looking, which is maximum ignorance.

SI(1/Ω, …, 1/Ω) > SI(p1, …, pΩ) unless pi = 1/Ω for all i

(2) Entropy is unaffected by extra states of zero probability.

If you already know the keys are not in your pockets, including those sites (with zero probability) does not change the entropy.

SI(p1, …, pΩ−1, 0) = SI(p1, …, pΩ−1)


The third and final requirement of your ignorance function involves conditional probabilities:

(3)   ⟨SI(A | Bℓ)⟩B = SI(AB) − SI(B)

B = (B1, B2, …, BM) are all the locations where the keys might have last been seen, each with probability qℓ. SI(B) is your ignorance about these sites.

SI(AB) is your joint ignorance about all the places the keys might be (A) and all the places they were last seen: Ω × M joint possibilities.

If we start with the joint distribution AB, and then measure B, then on average your joint ignorance declines by your original ignorance of B.

It can be shown that the Shannon entropy satisfies all three of these necessary conditions for entropy to be a valid ignorance function.
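Property (3) is easy to check numerically for the Shannon entropy. Here is a minimal sketch (the joint distribution is an arbitrary made-up example, not from the lecture):

```python
# Hedged sketch verifying property (3): <S(A|B_l)>_B = S(AB) - S(B)
# for the Shannon entropy (in bits).
import math

def S(p):
    """Shannon entropy -sum p log2 p over the nonzero probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# joint probabilities p[l][k] = P(B = B_l and A = A_k); made-up example
p = [[0.20, 0.10, 0.10],
     [0.05, 0.25, 0.30]]
q = [sum(row) for row in p]                    # marginal P(B_l)

S_AB = S([x for row in p for x in row])        # joint ignorance
S_B = S(q)                                     # ignorance about B alone
# average over measured B_l of the remaining (conditional) ignorance of A
S_A_given_B = sum(ql * S([x / ql for x in row]) for row, ql in zip(p, q))
print(S_A_given_B, S_AB - S_B)                 # equal, as property (3) demands
```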
