Thermodynamics Y Y Shan
AP3290 105
(v) Maxwell speed distribution -- kinetic theory of gases
In terms of energy, the number of gas molecules in the energy range $\varepsilon$ to $\varepsilon + d\varepsilon$ is (Eq.7-4c on p.102):

$$ a(\varepsilon)\,d^3p = \frac{N}{Z}\frac{V}{h^3}\,e^{-\varepsilon/kT}\,d^3p = N\,(2\pi mkT)^{-3/2}\,e^{-\varepsilon/kT}\,d^3p \qquad \text{Eq.7-4g} $$
where the partition function $Z$ of an ideal gas, $Z = \frac{V}{h^3}(2\pi mkT)^{3/2}$, is given by Eq.7-4d (p.103).
Since $\varepsilon = \dfrac{p^2}{2m} = \dfrac{1}{2}mv^2$ (with $p = mv$) and $d^3p = p^2\sin\theta\,dp\,d\theta\,d\varphi$,
In terms of speed, the number of molecules in the speed range $v$ to $v + dv$ is:
$$\begin{aligned}
a(v)\,dv &= N(2\pi mkT)^{-3/2} \int_0^{\pi}\!\!\int_0^{2\pi} e^{-p^2/2mkT}\, p^2 \sin\theta \,d\theta\, d\varphi\, dp \\
&= 4\pi N (2\pi mkT)^{-3/2}\, e^{-mv^2/2kT}\, (mv)^2\, d(mv) \\
&= 4\pi N \left(\frac{m}{2\pi kT}\right)^{3/2} v^2\, e^{-mv^2/2kT}\, dv \\
&= N f(v)\, dv
\end{aligned}$$
where

$$ f(v) = 4\pi \left(\frac{m}{2\pi kT}\right)^{3/2} v^2\, e^{-mv^2/2kT} \qquad \text{Eq.7-4h} $$
This is called the Maxwell speed distribution function for the molecules of an ideal gas.
From this Maxwell speed distribution function, several characteristic molecular speeds can be calculated, such as the fraction of molecules with speeds above a certain value at a given temperature.
Setting $\dfrac{df(v)}{dv} = 0$, the most probable speed is obtained:

$$ v_P = \sqrt{\frac{2kT}{m}} , $$

which is temperature dependent.
The average speed of the molecules can be calculated by:

$$ \bar{v} = \int_0^{\infty} v\, f(v)\, dv = \sqrt{\frac{8kT}{\pi m}} $$
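As a quick sanity check, both characteristic speeds can be compared against a direct numerical integration of Eq.7-4h. A minimal Python sketch, assuming an N2-like molecular mass and T = 300 K (illustrative values, not taken from the notes):

```python
import math

def maxwell_f(v, m, kT):
    """Maxwell speed distribution, Eq.7-4h: 4*pi*(m/2*pi*kT)^{3/2} v^2 exp(-m v^2/2kT)."""
    return 4 * math.pi * (m / (2 * math.pi * kT)) ** 1.5 * v * v * math.exp(-m * v * v / (2 * kT))

m = 4.65e-26           # kg, roughly the mass of an N2 molecule (assumed)
kT = 1.381e-23 * 300   # J, room temperature

v_P = math.sqrt(2 * kT / m)                 # most probable speed
v_bar = math.sqrt(8 * kT / (math.pi * m))   # average speed

# Riemann sum over 0..5000 m/s (the distribution is negligible beyond that)
dv = 1.0
norm = sum(maxwell_f(i * dv, m, kT) * dv for i in range(1, 5000))            # should be ~1
mean = sum(i * dv * maxwell_f(i * dv, m, kT) * dv for i in range(1, 5000))   # should be ~v_bar
print(norm, v_P, v_bar, mean)
```

The numerical mean reproduces the closed form $\sqrt{8kT/\pi m}$, and the distribution integrates to 1 as a probability density must.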
(vi) The theorem of equipartition of energy
The theorem of equipartition of energy states that molecules in thermal
equilibrium have the same average energy associated with each degree of freedom of
their motion, namely $\frac{1}{2}kT$ per degree of freedom; for a molecule with three degrees of
freedom, the average energy is $\frac{3}{2}kT$.
The equipartition results in:

$$ \frac{1}{2}m\overline{v_x^2} = \frac{1}{2}m\overline{v_y^2} = \frac{1}{2}m\overline{v_z^2} = \frac{1}{2}kT $$

$$ \frac{1}{2}m\overline{v^2} = \frac{1}{2}m\left(\overline{v_x^2}+\overline{v_y^2}+\overline{v_z^2}\right) = \frac{3}{2}kT , $$
which can be obtained from the Maxwell speed distribution, itself shown to follow from the Maxwell-Boltzmann distribution:
$$ \frac{1}{2}m\overline{v^2} = \frac{1}{2}m\int_0^{\infty} v^2 f(v)\, dv = \frac{1}{2}m\int_0^{\infty} 4\pi\left(\frac{m}{2\pi kT}\right)^{3/2} v^4\, e^{-mv^2/2kT}\, dv = \frac{3}{2}kT $$
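The integral above can be checked numerically as well. A short sketch, again assuming an N2-like mass and T = 300 K (illustrative values):

```python
import math

m = 4.65e-26           # kg, roughly an N2 molecule (assumed)
kT = 1.381e-23 * 300   # J

def f(v):
    # Maxwell speed distribution, Eq.7-4h
    return 4 * math.pi * (m / (2 * math.pi * kT)) ** 1.5 * v * v * math.exp(-m * v * v / (2 * kT))

# Riemann sum for <v^2>; then check (1/2) m <v^2> = (3/2) kT
dv = 1.0
v2_mean = sum((i * dv) ** 2 * f(i * dv) * dv for i in range(1, 6000))
lhs = 0.5 * m * v2_mean
rhs = 1.5 * kT
print(lhs, rhs)
```

The two sides agree to well under a percent, confirming equipartition for the translational degrees of freedom.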
Since $P = \dfrac{NkT}{V}$ for ideal gases, we can write the expression for pressure as:

$$ P = \frac{N}{V}kT = \frac{2}{3}\frac{N}{V}\left(\frac{3}{2}kT\right) = \frac{2}{3}\frac{N}{V}\left(\frac{1}{2}m\overline{v^2}\right) $$

$$ P = \frac{1}{3}\frac{N}{V}m\overline{v^2} \qquad \text{Eq.7-4i} $$
7.5 Statistical entropy
In Chapter 4 (p.56), the principle of entropy increase in classical thermodynamics, which is an alternative statement of the second law, says: the entropy of an isolated system can never decrease,

$$ \Delta S = (S_F - S_I)_{universe} \geq 0 , $$

where a system and its surroundings together (``the universe'') form an isolated system, and $F$, $I$ denote the final and initial macrostates. Note that the entropy $S$ should not be confused with its change $\Delta S$.
Statistical thermodynamics can explain microscopically this spontaneous increase of entropy.
7.5.1 The most probable macrostate, its total number of microstates, and entropy
Let's consider again the example of the system of 6×6 chips (indistinguishable). Imagine starting with a perfectly ordered all-blue microstate, then choosing a chip at random and tossing it. After repeating this kind of tossing a few times, some chips are highly likely to show their green side, and it is nearly impossible for the system to still be in its all-blue state (the chance of the system remaining all-blue is only about $(1/2)^n$ after $n$ tosses). As time goes on with more tossing, the number of greens will almost certainly increase. Here are some snapshots of the system, taken after every 10 tosses. The number of greens, $n_G$, is 0, 3 (10 tosses), 5 (20 tosses), 9 (30 tosses), 12 (40 tosses), 15 (50 tosses), 15 (60 tosses), 17 (70 tosses), 18 (80 tosses).
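The tossing experiment described above can be sketched as a short simulation, assuming (as the $(1/2)^n$ argument suggests) that each toss picks one chip at random and leaves it on a random side, blue or green with equal probability:

```python
import random

# Simulation of the 6x6 chip-tossing example: 36 two-sided chips,
# starting all blue; each toss re-randomizes one randomly chosen chip.
random.seed(1)                 # fixed seed so the run is reproducible
chips = [0] * 36               # 0 = blue, 1 = green
greens = []                    # snapshots of n_G every 100 tosses
for toss in range(1, 1001):
    i = random.randrange(36)
    chips[i] = random.randrange(2)   # the tossed chip lands on a random side
    if toss % 100 == 0:
        greens.append(sum(chips))
print(greens)
```

After an initial transient, the recorded counts wander around $n_G = 18$, the equilibrium value for the 6×6 system, with visible fluctuations of a few chips.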
Here is a graph of the number of greens over more than 1000 tosses.
We see that, after the first many tosses, the system saturates at $18 \pm 6$ almost all of the time. These fluctuations are quite large in percentage terms, about $\pm 33\%$, but then it is a very small system. If we now look at a larger system of 30×30 chips, we see that fluctuations are still visible, but they are much smaller in percentage terms: the number of greens $n_G$ saturates at $450 \pm 30$, or about $\pm 7\%$.
It can be calculated and proved that, for the 6×6 system, the number of microstates reaches its maximum when the number of greens is $n_G = 18$:

$$ \Omega_{n_G=18} = C_{36}^{18} = \frac{36!}{18!\,18!} \approx 9.075\times 10^{9} $$

For a 30×30 system, the maximum is at $n_G = 450$:

$$ \Omega_{n_G=450} = C_{900}^{450} $$
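The microstate counts quoted above can be verified directly with `math.comb`:

```python
import math

# Number of microstates of the 6x6 chip system for each green count n_G
omega = [math.comb(36, n) for n in range(37)]

print(omega[18])                                # 9075135300, i.e. ~9.075e9
print(max(range(37), key=lambda n: omega[n]))   # the maximum is at n_G = 18
```

The same one-liner with `math.comb(900, 450)` confirms that the 30×30 count peaks at $n_G = 450$.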
The above example tells us:
Statistically, a system evolves from macrostates (those with $n_{Green} \neq 18$ in this example) having fewer microstates ($\Omega < 9.075\times 10^9$) to the most probable macrostate (saturating at 18 green, 18 blue), which has the largest number of microstates ($\Omega \approx 9.075\times 10^9$). In other words, when a system evolves from one macrostate to another, the corresponding number of microstates will never decrease:

$$ \Omega_F \geq \Omega_I $$
What has this statistical conclusion to do with entropy?
Compare it with the principle of entropy increase in classical
thermodynamics, which says that a system evolves from a macrostate having lower entropy to one having higher entropy, or that the entropy of an isolated system can never decrease.
Therefore, the increase of entropy from one macrostate to another in classical thermodynamics can be understood as, and correlated with, the increase in the total number of microstates corresponding to each macrostate.
7.5.2 The Boltzmann statistical entropy
So what is the relationship between entropy $S$ and the number of microstates $\Omega$? Are they equal to each other? No, because if we double the size of a system ($N \to 2N$), the number of microstates does not increase from $\Omega$ to $2\Omega$, but to $\Omega^2$.
Obviously, in this example $\ln \Omega_{2\text{-particle}} = 2\ln \Omega_{1\text{-particle}}$. So $\Omega$ is not an extensive quantity (see page 5), but $\ln\Omega$ is.
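The extensivity argument can be illustrated with the two-state chips again: a system of $N$ independent chips has $\Omega = 2^N$ microstates, so doubling $N$ squares $\Omega$ while $\ln\Omega$ simply doubles.

```python
import math

# N independent two-state chips: Omega = 2**N microstates
N = 10
omega_N = 2 ** N
omega_2N = 2 ** (2 * N)

print(omega_2N == omega_N ** 2)    # doubling the system squares Omega
print(math.isclose(math.log(omega_2N), 2 * math.log(omega_N)))  # ln(Omega) doubles
```

This is exactly why entropy, an extensive quantity, must be built from $\ln\Omega$ rather than from $\Omega$ itself.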
Deriving the Boltzmann entropy formula: refer to the discussion on page 94. For a Boltzmann system of $N$ particles $\{a_i\}$, the total number of microstates of the Boltzmann distribution (i.e. the most probable distribution, having maximum $\Omega$) is:

$$ \Omega = N! \prod_i \frac{g_i^{a_i}}{a_i!} , \qquad \sum_i a_i = N , \quad \sum_i a_i \varepsilon_i = U $$
Step 1:

$$\begin{aligned}
\ln\Omega &= \ln\left( N!\prod_i \frac{g_i^{a_i}}{a_i!}\right) = \ln N! + \sum_i \left( \ln g_i^{a_i} - \ln a_i! \right) \\
&= N(\ln N - 1) + \sum_i \big( a_i\ln g_i - a_i(\ln a_i - 1) \big) \\
&= N\ln N - N + \sum_i a_i\ln g_i - \sum_i a_i\ln a_i + \sum_i a_i \\
&= N\ln N + \sum_i \left( a_i\ln g_i - a_i\ln a_i \right)
\end{aligned}$$
where Stirling's formula, $\ln x! \approx x(\ln x - 1)$, and $\sum_i a_i = N$ are applied. So we obtain:
$$ \ln\Omega = N\ln N + \sum_i \left( a_i\ln g_i - a_i \ln a_i \right) \qquad \text{Eq.7-5a} $$
Step 2: from the first law, $dU = dQ - dW = TdS - PdV$, i.e. $TdS = dU + PdV$. The statistical internal energy and pressure can be expressed as (equations 7-3a,d, p.98, p.101):

$$ U = -N\frac{\partial \ln Z}{\partial \beta} , \qquad P = \frac{N}{\beta}\frac{\partial \ln Z}{\partial V} , \qquad \text{where } \beta = 1/kT $$
$$ TdS = dU + PdV $$

$$ dS = k\beta\,(dU + PdV) = k\big(\beta\,dU + U\,d\beta + N\,d(\ln Z)\big) = k\,d\big( N\ln Z + \beta U \big) $$

(the $d\beta$ terms combine because $U = -N\,\partial\ln Z/\partial\beta$.)
Thus we obtain the statistical entropy in terms of the partition function $Z$:

$$ S = k\left( N\ln Z + \beta U \right) \qquad \text{Eq.7-5b} $$

Replacing $\ln Z$ by $\ln Z = \ln N + \ln\dfrac{Z}{N}$:
$$ S = k\left( N\ln Z + \beta U \right) = k\left( N\ln N + N\ln\frac{Z}{N} + \beta U \right) $$
Replacing $N = \sum_i a_i$ and $U = \sum_i a_i\varepsilon_i$:
$$ S = k\left( N\ln N + \sum_i a_i\ln\frac{Z}{N} + \beta\sum_i a_i\varepsilon_i \right) = k\left( N\ln N + \sum_i a_i\Big(\ln\frac{Z}{N} + \beta\varepsilon_i\Big) \right) $$
From the Boltzmann distribution function (p.96),

$$ a_i = \frac{N}{Z}\, g_i\, e^{-\beta\varepsilon_i} , \qquad \text{i.e.}\quad \beta\varepsilon_i + \ln\frac{Z}{N} = \ln g_i - \ln a_i , $$
So we get:

$$ S = k\left( N\ln N + \sum_i a_i\big(\ln g_i - \ln a_i\big) \right) = k\left( N\ln N + \sum_i \big( a_i\ln g_i - a_i\ln a_i \big) \right) $$
Comparing this with Eq.7-5a, we prove:

$$ S = k\ln\Omega \qquad \text{Eq.7-5c} $$
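The Stirling approximation underlying Eq.7-5a can be checked against an exact count for a small example: two levels with $g_i = 1$ and a hypothetical occupation $a = (500, 500)$, for which $\Omega = N!/(a_1!\,a_2!)$.

```python
import math

# Two levels, g_i = 1, occupations (500, 500): Omega = N!/(a1! a2!)
a = (500, 500)
N = sum(a)

# Exact ln(Omega) via log-gamma: ln N! - sum ln a_i!
ln_omega_exact = math.lgamma(N + 1) - sum(math.lgamma(x + 1) for x in a)

# Stirling form of Eq.7-5a (the a_i ln g_i terms vanish since g_i = 1)
ln_omega_stirling = N * math.log(N) - sum(x * math.log(x) for x in a)

print(ln_omega_exact, ln_omega_stirling)
```

Even for this modest $N = 1000$, the two values agree to better than a percent, and the agreement improves rapidly as $N$ grows, which is why Stirling's formula is harmless for thermodynamic particle numbers.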
This is the famous Boltzmann statistical entropy formula, where Boltzmann's constant,
$k$, is the bridge connecting the microscopic and the macroscopic worlds. It must have the dimensions of entropy, Joules/Kelvin, and it turns out that the correct numerical correspondence is given by the gas constant divided by Avogadro's number:

$$ k = \frac{R}{N_0} = \frac{8.314\ \mathrm{J\,K^{-1}\,mol^{-1}}}{6.023\times 10^{23}\ \mathrm{particles/mol}} = 1.381\times 10^{-23}\ \mathrm{J\,K^{-1}} $$
Chapter 8 Fermi-Dirac distribution and Bose-Einstein distribution
8.1 The Fermi-Dirac distribution
$$ \sum_i a_i = N , \qquad \sum_i a_i\varepsilon_i = U $$
The Fermi-Dirac distribution applies to fermions. Fermions are particles which have half-integer spin and are therefore constrained by the Pauli exclusion principle. Fermions include electrons, protons, and neutrons.
$$ \Omega_{FD} = \prod_i \frac{g_i!}{a_i!\,(g_i - a_i)!} , \qquad \sum_i a_i = N , \quad \sum_i a_i\varepsilon_i = U $$

Maximizing $\Omega_{FD}$ under these constraints gives:

$$ a_i = N f(\varepsilon_i) = \frac{N g_i}{e^{(\varepsilon_i - E_F)/kT} + 1} $$
For F-D statistics, the expected number of particles in states with energy $\varepsilon_i$ is $a_i = N f(\varepsilon_i)$, where

$$ f(\varepsilon_i) = \frac{g_i}{e^{(\varepsilon_i - E_F)/kT} + 1} \qquad \text{eq.8-a} $$

is called the Fermi-Dirac distribution, giving the probability that a particle occupies energy level $\varepsilon_i$.
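The behaviour of eq.8-a can be illustrated with a short sketch, taking $g_i = 1$ (non-degenerate levels) and illustrative values of $E_F$ and $kT$ (not from the notes): states well below the Fermi level are filled, states well above are empty, and exactly at $E_F$ the occupation is 1/2.

```python
import math

def fermi_dirac(E, E_F, kT, g=1.0):
    """Fermi-Dirac factor, eq.8-a, with degeneracy g."""
    return g / (math.exp((E - E_F) / kT) + 1.0)

E_F = 5.0      # eV, assumed Fermi level
kT = 0.025     # eV, roughly room temperature

at_EF = fermi_dirac(E_F, E_F, kT)              # exactly 1/2 at E = E_F
well_below = fermi_dirac(E_F - 1.0, E_F, kT)   # ~1: state filled
well_above = fermi_dirac(E_F + 1.0, E_F, kT)   # ~0: state empty
print(at_EF, well_below, well_above)
```

The sharp step from ~1 to ~0 across a window of width ~$kT$ around $E_F$ is the hallmark of the Pauli exclusion principle at work.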
8.2 The Bose-Einstein distribution
$$ \sum_i a_i = N , \qquad \sum_i a_i\varepsilon_i = U $$
Bosons are particles which have integer spin, such as photons, and which are therefore not constrained by the Pauli exclusion principle. The energy distribution of bosons is described by Bose-Einstein statistics.
$$ \Omega_{BE} = \prod_i \frac{(g_i + a_i - 1)!}{a_i!\,(g_i - 1)!} , \qquad \sum_i a_i = N , \quad \sum_i a_i\varepsilon_i = U $$

Maximizing $\Omega_{BE}$ under these constraints gives:

$$ a_i = N f(\varepsilon_i) = \frac{N g_i}{A e^{\varepsilon_i/kT} - 1} $$
For B-E statistics, the expected number of particles in states with energy $\varepsilon_i$ is $a_i = N f(\varepsilon_i)$, where
$$ f(\varepsilon_i) = \frac{g_i}{A e^{\varepsilon_i/kT} - 1} \qquad \text{eq.8-b} $$

is called the Bose-Einstein distribution, giving the probability that a particle occupies energy level $\varepsilon_i$.
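A short sketch comparing eq.8-b against eq.8-a and the classical Boltzmann factor can make the difference between the statistics concrete. Here $g_i = 1$ and $A = 1$ are assumed (as for photons), with $E_F = 0$ in the Fermi-Dirac factor and illustrative energies in units of $kT$:

```python
import math

def bose_einstein(E, kT, A=1.0, g=1.0):
    """Bose-Einstein factor, eq.8-b."""
    return g / (A * math.exp(E / kT) - 1.0)

def fermi_dirac(E, kT, g=1.0):
    """Fermi-Dirac factor, eq.8-a, with E_F = 0."""
    return g / (math.exp(E / kT) + 1.0)

kT = 1.0
for E in (0.5, 1.0, 3.0):
    be = bose_einstein(E, kT)
    fd = fermi_dirac(E, kT)
    mb = math.exp(-E / kT)   # classical Boltzmann factor
    # B-E lies above the Boltzmann factor (bosons bunch), F-D lies below
    # (fermions exclude); all three merge at high E where occupancy is dilute.
    print(E, be, mb, fd)
```

This ordering (B-E > Boltzmann > F-D at the same energy) reflects the bunching of bosons and the exclusion of fermions, and explains why both quantum distributions reduce to classical Maxwell-Boltzmann statistics in the dilute, high-energy limit.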