Section 3 - Statistical Mechanics


Transcript of Section 3 - Statistical Mechanics

Page 1: Section 3 - Statistical Mechanics

Statistical Mechanics

How to Predict Macroscopic Behaviour from Microscopic Laws

1. General Formalism and Definitions

1.1 States

Macrostate – State of a system specified by a few macroscopically observable quantities, e.g. 1kg of

water at 1atm pressure and 300K. Typical quantities are: Energy/Temperature, Volume/Pressure,

Number of particles (U, T, p, V, N)

Microstate – Full description of the system at some fundamental level, e.g. Positions and

momenta of all atoms in a classical system (r1, r2, …, v1, v2, …) or a wavefunction, ψ, in quantum

mechanics.

The aim: To deduce properties of the macroscopic from knowledge of the microscopic.

The problem: A system of N_A ~ 10^23 particles would require us to consider ~10^(N_A) microstates. We never

know what microstate a real system is in.

The solution: Consider an ensemble; many copies of the system, all in the same macrostate but

typically in different microstates. Take an average over the ensemble.

1.2 Assumptions

We assume:

1. Conservation of Energy

2. Existence of Microstates

3. Principle of equal a priori probabilities (PEAPP). For any closed system at eqm, all accessible

microstates are equally likely.

Closed: No exchange of energy, particles or anything else with the outside world.

Eqm: No flow of heat/particles etc.

Accessible: Compatible with macroscopic constraints (e.g. fixed volume)

Equally Likely: Occur equally often in the ensemble (i.e. with equal probability)

1.3 Example

System of QHO’s (quantum harmonic oscillators). A QHO has certain allowed values of the energy, En = (n + ½)ℏω, n = 0, 1, 2, …

The microstate of the QHO is specified by n, the number of quanta in the oscillator.

Consider a system of N identical QHO’s, all exchanging energy with each other.

⇒ The microstate is specified by the number of quanta in each oscillator, (n1, n2, …, nN). The

macrostate is given by the total energy, U, or equivalently the total number of quanta, M

Page 2: Section 3 - Statistical Mechanics

e.g. Macrostate with M = 1, system with N = 5

5 possible microstates:

n1 = 1, n2 = 0, n3 = 0, etc.

n1 = 0, n2 = 1, n3 = 0, etc.

etc.

if M=2

15 possible microstates: 5 with a single ni = 2, and 10 with ni = nj = 1 for a pair i ≠ j

For M quanta in N oscillators there are

\Omega = \frac{(M + N - 1)!}{M!\,(N - 1)!}

possible microstates.

Let Ω(E, N) be the number of microstates with energy E. PEAPP says that in an ensemble with N = 500 (and M = 500), all ~10^299 microstates occur equally often.

N      M      Ω(E, N)
5      1      5
5      5      126
5      50     ~3×10^5
50     50     ~5×10^28
500    500    ~10^299
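As a quick check of this counting, here is a short Python sketch (my own illustration, not from the notes) that evaluates Ω(M, N) = (M + N − 1)!/(M!(N − 1)!) for the table entries:

```python
from math import comb

def omega(M, N):
    """Number of microstates for M quanta shared among N oscillators."""
    return comb(M + N - 1, M)   # (M+N-1)! / (M! (N-1)!)

for N, M in [(5, 1), (5, 5), (5, 50), (50, 50), (500, 500)]:
    print(f"N={N:4d}  M={M:4d}  Omega={omega(M, N):.3e}")
```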

Density of States

So far the total energy has been U = Mℏω (up to the constant zero-point energy), with M an integer, so the allowed energies are evenly spaced.

But more often the spacing of the energy levels is not constant…

Smooth out the distribution by saying that the number of states with energy between E and E + δE is Ω̂(E) δE.

Page 3: Section 3 - Statistical Mechanics

δE is an element of energy big enough that Ω̂(E) is a smooth function of E, but small compared to the precision of measurement of the energy E. We use Ω and Ω̂ interchangeably on this course, do not worry.

1.4 Partitions

Partitions are a way to relate the temperature of a system to its microstates. System of N QHO’s,

energy E. Divide (partition) the system into two parts. N1 + N2 = N

Part 1 has energy E1 and Part 2 has energy E2 = E – E1

Question: What is the most likely value of E1?

Count how many microstates of the whole system give energy E1 for part 1. By PEAPP, the most likely value

is the one with the most microstates.

If part 1 has energy E1 then it has \hat{\Omega}(E_1, N_1) possible microstates.

Part 2 has energy E2, therefore it has \hat{\Omega}(E_2, N_2) possible microstates.

The whole system has \Omega_{total} = \hat{\Omega}(E_1, N_1)\,\hat{\Omega}(E_2, N_2) microstates, with E_2 = E - E_1.

Maximise \Omega_{total} over E1, or better, maximise \ln\Omega_{total} = \ln\hat{\Omega}(E_1, N_1) + \ln\hat{\Omega}(E_2, N_2):

\frac{d}{dE_1}\ln\Omega_{total} = 0 \quad\Rightarrow\quad \frac{d}{dE_1}\ln\hat{\Omega}(E_1, N_1) = \frac{d}{dE_2}\ln\hat{\Omega}(E_2, N_2) \qquad (\text{using } dE_2 = -dE_1)

Both parts of the system have the same value of \frac{d\ln\hat{\Omega}(E)}{dE}.

\Omega_{total} has a very sharp peak at E_1 = U_1, the most likely value.

Page 4: Section 3 - Statistical Mechanics

For large systems, E1 is almost always very close to U1

Parts 1 and 2 are in thermal eqm: they must have the same temperature, therefore we can define temperature by:

\frac{1}{k_B T} = \frac{d}{dE}\ln\hat{\Omega}(E, N) \qquad\text{or equivalently}\qquad \frac{1}{T} = k_B\,\frac{d}{dE}\ln\hat{\Omega}(E, N)

Note: we never used any property specific to harmonic oscillators, even though the system was made of QHO’s. This final result is a general definition of temperature in statistical mechanics.
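To see how sharp the peak in Ω_total is, here is a small numerical sketch (my own, with an assumed example of N1 = N2 = 50 oscillators sharing M = 100 quanta):

```python
from math import comb, log

def ln_omega(M, N):
    """ln of the number of microstates for M quanta in N oscillators."""
    return log(comb(M + N - 1, M))

N1, N2, M = 50, 50, 100          # assumed example sizes, not from the notes
ln_total = [ln_omega(M1, N1) + ln_omega(M - M1, N2) for M1 in range(M + 1)]
best = max(range(M + 1), key=lambda M1: ln_total[M1])
print("most likely M1 =", best)                      # expect M1 = 50 by symmetry
print("values of M1 within ln(Omega_max) - 1 of the peak:",
      sum(1 for v in ln_total if v > ln_total[best] - 1))
```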

1.4b) Other Partitions

System with total energy E, total volume V and N particles in total.

What is the most likely value of V1? We need to know \Omega(E, V, N): the number of microstates now depends on the volume, V.

Repeating the previous argument, at eqm we use PEAPP and find T_1 = T_2 and

\frac{\partial}{\partial V_1}\ln\Omega(E_1, V_1, N_1) = \frac{\partial}{\partial V_2}\ln\Omega(E_2, V_2, N_2)

Define: \frac{p}{k_B T} = \frac{\partial}{\partial V}\ln\Omega(E, V, N)

Most likely value of N1? Repeat the argument and define the chemical potential:

-\frac{\mu}{k_B T} = \frac{\partial}{\partial N}\ln\Omega(E, V, N)

As before: at mechanical eqm, V1 is almost always very close to its most likely value; at diffusive eqm, N1 is almost always very close to its most likely value.

Recall from thermodynamics:

\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N}; \qquad \frac{p}{T} = \left(\frac{\partial S}{\partial V}\right)_{U,N}; \qquad -\frac{\mu}{T} = \left(\frac{\partial S}{\partial N}\right)_{U,V}

Consistency of statistical mechanics with thermodynamics shows that:

S(U, V, N) = k_B \ln\Omega(U, V, N)

Page 5: Section 3 - Statistical Mechanics

Boltzmann’s formula for the entropy depends only on counting microstates.

Thermodynamic Derivatives

Recall:

dU = T\,dS - p\,dV + \mu\,dN \qquad\Rightarrow\qquad dS = \frac{1}{T}\,dU + \frac{p}{T}\,dV - \frac{\mu}{T}\,dN

Recognising that S(U, V, N) is a function of 3 variables:

dS = \left(\frac{\partial S}{\partial U}\right)_{V,N} dU + \left(\frac{\partial S}{\partial V}\right)_{U,N} dV + \left(\frac{\partial S}{\partial N}\right)_{U,V} dN

Also, for U(S, V, N):

dU = \left(\frac{\partial U}{\partial S}\right)_{V,N} dS + \left(\frac{\partial U}{\partial V}\right)_{S,N} dV + \left(\frac{\partial U}{\partial N}\right)_{S,V} dN

Chemical Potential

This is the intensive variable conjugate to the number of particles. It is defined by:

\mu = -k_B T\,\frac{\partial}{\partial N}\ln\Omega(E, V, N) = -T\,\frac{\partial S}{\partial N}

Just as heat flows from higher temperature to lower temperature, particles flow from high μ to low μ.

Page 6: Section 3 - Statistical Mechanics

1.5 Open System

Recall PEAPP applies to closed systems with fixed energy E. More often we know the temperature of

a system but not its energy (this is an open system).

How likely are different microstates in such a system?

Microstate “m” occurs with probability proportional to e^{-E_m/(k_B T)}, where E_m is the energy of the microstate.

To see why, treat the open system as a small system in contact with a large bath; the energy of the whole system (small system + bath) is E. If the small system has energy E_m, the bath must have energy E - E_m.

The number of microstates available to the bath is:

G(m) = \Omega_{bath}(E - E_m)

Using a Taylor expansion (E_m \ll E):

\ln\Omega_{bath}(E - E_m) = \ln\Omega_{bath}(E) - E_m\,\frac{d}{dE}\ln\Omega_{bath}(E) + \ldots

Using \frac{d}{dE}\ln\Omega_{bath}(E) = \frac{1}{k_B T}:

\ln\Omega_{bath}(E - E_m) = \ln\Omega_{bath}(E) - \frac{E_m}{k_B T}

\Rightarrow\quad G(m) = \Omega_{bath}(E - E_m) = \Omega_{bath}(E)\, e^{-E_m/(k_B T)}

Every microstate of the whole system is equally likely, So the probability of finding the small system

in microstate, m, is proportional to the number of microstates accessible to the bath, G(m):

P(m) \propto G(m) \propto e^{-E_m/(k_B T)}, \qquad\text{i.e.}\qquad P(m) = \frac{1}{Z}\, e^{-E_m/(k_B T)}

Page 7: Section 3 - Statistical Mechanics

Z is called the partition function. It depends on the temperature and the microstates of the system.

Note: For the bath we only assumed that Ωbath exists so the result is valid for any bath. Boltzmann

distribution holds for any open system at temperature T.

Ensemble Specifics

1. In a closed system all copies of the system have the same energy, E, therefore PEAPP applies

and all microstates are equally likely, this is known as a microcanonical ensemble.

2. Take an open system at temp, T (with a fixed number of particles). Each copy of the system

has a different energy in general therefore PEAPP does not apply, instead the microstate

probabilities are:

P(m) = \frac{1}{Z}\, e^{-E_m/(k_B T)} \qquad\text{(this is the canonical ensemble)}

3. An open system at temperature, T, exchanging particles with its environment (e.g. a droplet

of water) has microstate probabilities:

P(m) = \frac{1}{\Xi(T, \mu)}\, e^{-(E_m - \mu N_m)/(k_B T)} \qquad\text{where}\qquad \Xi(T, \mu) = \sum_n e^{-(E_n - \mu N_n)/(k_B T)}

\Xi(T, \mu) is the grand partition function. This is the grand canonical ensemble.

Recall: \mu = -k_B T\,\frac{\partial}{\partial N}\ln\Omega(E, V, N)

Example

Calculate the average energy <E> of a system in the canonical ensemble.

\langle E\rangle = \sum_m E_m P(m) = \frac{1}{Z}\sum_m E_m\, e^{-E_m/(k_B T)}

Writing \beta = 1/(k_B T) and noting that E_m e^{-\beta E_m} = -\frac{\partial}{\partial\beta} e^{-\beta E_m}:

\langle E\rangle = -\frac{1}{Z}\frac{\partial Z}{\partial\beta} = -\frac{\partial\ln Z}{\partial\beta} = k_B T^2\,\frac{\partial\ln Z}{\partial T}
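A quick numerical check of this identity (my own sketch, using an arbitrary made-up set of microstate energies):

```python
import numpy as np

kB = 1.0                                 # work in units where k_B = 1
E_levels = np.array([0.0, 1.0, 2.5])     # arbitrary example microstate energies
T = 2.0

def lnZ(T):
    return np.log(np.sum(np.exp(-E_levels / (kB * T))))

# direct ensemble average
P = np.exp(-E_levels / (kB * T))
P /= P.sum()
E_avg = np.sum(P * E_levels)

# k_B T^2 d(lnZ)/dT by central finite difference
h = 1e-5
E_from_Z = kB * T**2 * (lnZ(T + h) - lnZ(T - h)) / (2 * h)

print(E_avg, E_from_Z)                   # the two values should agree
```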

1.6 Connection of Statistical Mechanics with Thermodynamics

We have already seen the link for entropy: S = k_B \ln\Omega

Another important connection is the (Helmholtz) free energy, F(T, V, N):

F(T, V, N) = -k_B T \ln Z(T, V, N)

We can connect thermodynamics to statistical mechanics using either of these relations, for large

enough systems these two relations are approximately equivalent.

Page 8: Section 3 - Statistical Mechanics

Combining what we know so far we can find:

S = -k_B \sum_m P(m)\,\ln P(m)

This is the Gibbs definition of the entropy in open systems.

The Boltzmann-Gibbs definitions are equal in a microcanonical ensemble and equivalent for large

enough systems.
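A tiny sketch (my own) showing that the Gibbs formula reduces to the Boltzmann formula S = kB ln Ω when all Ω microstates are equally likely:

```python
import numpy as np

kB = 1.0
Omega = 1000
P_uniform = np.full(Omega, 1.0 / Omega)    # microcanonical ensemble: equal probabilities

S_gibbs = -kB * np.sum(P_uniform * np.log(P_uniform))
S_boltzmann = kB * np.log(Omega)
print(S_gibbs, S_boltzmann)                # identical up to rounding
```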

Laws of Thermodynamics

Zeroth Law: Equivalent to the partition arguments; systems with equal values of \frac{d\ln\Omega(E)}{dE} are all at eqm with each other.

First Law: In thermodynamics, energy is conserved and can be changed by heat or work. In statistical

mechanics, heat changes the occurrences of microstates and work changes the microstates

themselves.

Second Law: The total entropy change is greater than zero for spontaneous processes. The only exception is reversible processes, for which the total entropy change is zero. In statistical mechanics this means that systems move towards macrostates with more accessible microstates.

Third Law (NON EXAMINABLE): In thermodynamics it says that as the temperature tends to 0 the entropy approaches a constant. In statistical mechanics this would mean Ω(E) = 0 for E < E0, where E0 is the ground state energy.

Page 9: Section 3 - Statistical Mechanics

2. Some Examples and Practical Rules

What can we do with statistical mechanics?

For real systems we cannot usually count all the microstates or calculate the partition function of a

system exactly. Instead we consider “model systems” that we can do calculations with (e.g. an ideal

gas). We then compare our calculations with experiments and computer simulations to see if the

model system helps us to understand the real system.

2.1 Very Simple Model of a Rubber Band

We want to explain why rubber bands get stiffer on heating.

Take a chain of N links, each link points either left or right

Let the number of links pointing right be, N+, and pointing left, N-

Then it follows that the length is L = a(N+ - N-), where a is the length of one link.

For a given L, there are

\frac{N!}{N_+!\,N_-!}

microstates (choosing which N+ of the N links point right).

Entropy is then given by:

S = k_B \ln\frac{N!}{N_+!\,N_-!} \approx -k_B\left[N_+\ln\frac{N_+}{N} + N_-\ln\frac{N_-}{N}\right]

(using Stirling's approximation).

The first law for the rubber band states that:

dU = T\,dS + F\,dL \qquad\Rightarrow\qquad F = -T\left(\frac{\partial S}{\partial L}\right)_U

To find \partial S/\partial L we need expressions for N+ and N- in terms of L.

Page 10: Section 3 - Statistical Mechanics

It follows that:

N_+ = \frac{N}{2}\left(1 + \frac{L}{Na}\right) \qquad\text{and}\qquad N_- = \frac{N}{2}\left(1 - \frac{L}{Na}\right)

F = \frac{k_B T}{2a}\left[\ln\!\left(1 + \frac{L}{Na}\right) - \ln\!\left(1 - \frac{L}{Na}\right)\right]

For L \ll Na:

F \approx \frac{k_B T\, L}{N a^2}

This agrees with Hooke’s law: the tension is proportional to the extension, and it increases with temperature, so the band gets stiffer on heating (although this model is very simplified).
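A short numerical sketch (my own, with illustrative made-up values for a and N) of the exact force expression above, showing the tension rising with temperature at fixed extension:

```python
import numpy as np

kB = 1.38e-23      # J/K
a  = 1e-9          # assumed link length (m), illustrative only
N  = 1000          # number of links
L  = 0.2 * N * a   # fixed extension, 20% of full length

def force(T):
    """Entropic tension of the chain at temperature T."""
    x = L / (N * a)
    return kB * T / (2 * a) * (np.log(1 + x) - np.log(1 - x))

for T in (200, 300, 400):
    print(T, "K ->", force(T), "N")   # tension grows linearly with T
```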

Simple Magnetic Model (Paramagnet)

If we put atoms in magnetic fields, then unpaired electrons like to align their spins with the field:

Up spins have lower energy than down spins and are more likely for B > 0; μ is the magnetic moment of the electron.

Microstate: The state of each spin (total N spins)

Macrostate: The total energy or total magnetisation must be specified: E = \mu B (N_d - N_u), M = \mu (N_u - N_d)

Properties of the material can then be calculated by two routes, either by finding entropy and

proceeding as before or by calculation of the partition function.

Page 11: Section 3 - Statistical Mechanics

Calculating the Partition Function

If N=1 (one atom) there are two states, up and down:

Z_1 = \sum_{\text{states}} e^{-E/(k_B T)} = e^{\mu B/(k_B T)} + e^{-\mu B/(k_B T)}

For N atoms, each microstate is a list of the N spin states and the total energy is E_1 + E_2 + \ldots + E_N, so:

Z_N = \sum_{\text{all states}} e^{-(E_1 + E_2 + \ldots + E_N)/(k_B T)} = \left(e^{\mu B/(k_B T)} + e^{-\mu B/(k_B T)}\right)^N = Z_1^N

The general rule when combining independent systems: if E_total = E_1 + E_2 + … + E_N, then Z_total = Z_1 × Z_2 × … × Z_N (where Z_i is the partition function of subsystem i).

Now to calculate the free energy:

F = -k_B T \ln Z_{total} = -k_B T \ln(Z_1 Z_2 \cdots Z_N) = -k_B T \sum_i \ln Z_i = \sum_i F_i

For the paramagnet:

F = -N k_B T \ln\!\left(e^{\mu B/(k_B T)} + e^{-\mu B/(k_B T)}\right)

The average energy of 1 atom is given by:

\langle E_1\rangle = \frac{1}{Z_1}\sum_{\text{states}} E\, e^{-E/(k_B T)} = \frac{-\mu B\, e^{\mu B/(k_B T)} + \mu B\, e^{-\mu B/(k_B T)}}{e^{\mu B/(k_B T)} + e^{-\mu B/(k_B T)}} = -\mu B\,\tanh\!\left(\frac{\mu B}{k_B T}\right)

Similarly, the average magnetic moment of one atom is \langle M_1\rangle = \mu\tanh(\mu B/(k_B T)).

For N atoms: U = N\langle E_1\rangle and the magnetisation is M = N\langle M_1\rangle.

Page 12: Section 3 - Statistical Mechanics

As a result we find:

At low temperatures (or large B), all spins are up, the entropy is zero and the energy is minimised. However, at high temperatures (or B ≈ 0) the spins are half up and half down, the energy is at its largest (U → 0) and the entropy is maximised (S = N k_B ln 2).
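A minimal sketch (my own, in arbitrary units) evaluating U = N⟨E1⟩ and M = N⟨M1⟩ to check the two limits just described:

```python
import numpy as np

kB = 1.0
mu, B, N = 1.0, 1.0, 100          # arbitrary units

def U(T):
    return -N * mu * B * np.tanh(mu * B / (kB * T))

def M(T):
    return N * mu * np.tanh(mu * B / (kB * T))

for T in (0.01, 1.0, 100.0):
    print(f"T={T:>7}: U={U(T):8.2f}  M={M(T):8.2f}")
# low T:  U -> -N*mu*B, M -> N*mu   (all spins up)
# high T: U -> 0,       M -> 0      (spins half up, half down)
```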

Quantum Particle in a Box

Box has a volume of V = L x L x L

Quantum mechanics says that the allowed values of the energy are:

E = \frac{\hbar^2 k^2}{2m} \quad\text{(with m the particle mass)}, \qquad \mathbf{k} = \frac{\pi}{L}(n_x, n_y, n_z), \quad n_x, n_y, n_z \text{ positive integers (non-zero)}

How many states have |\mathbf{k}| = k \le k_0?

Let each state be a point in “k-space”

kz comes out of the screen…

The points form a grid, we can use this to approximate the amount of points within kb of the origin:

3

0

3

4 1 1

3 8

k

L

Page 13: Section 3 - Statistical Mechanics

To find the number of points with k between k_0 and k_0 + δk:

g(k_0)\,\delta k = \frac{1}{8}\cdot 4\pi k_0^2\,\delta k\cdot\left(\frac{L}{\pi}\right)^3 = \frac{k_0^2 L^3}{2\pi^2}\,\delta k

For entropy we want to find the points with energy between E & E + δE

In general:

g(k)\,\delta k = \text{number of states between } k \text{ and } k + \delta k

\hat{\Omega}(E)\,\delta E = \text{number of states between } E \text{ and } E + \delta E

so \hat{\Omega}(E) = g(k)\,\frac{dk}{dE}

In this case:

E = \frac{\hbar^2 k^2}{2m} \quad\text{so}\quad k = \frac{\sqrt{2mE}}{\hbar} \quad\text{and}\quad \frac{dE}{dk} = \frac{\hbar^2 k}{m}

\hat{\Omega}(E) = g(k)\,\frac{dk}{dE} = \frac{k^2 L^3}{2\pi^2}\cdot\frac{m}{\hbar^2 k} = \frac{m L^3 k}{2\pi^2\hbar^2}

Substituting in k:

\hat{\Omega}(E) = \frac{L^3}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2} E^{1/2}

Partition Function

Z_1 = \sum_{\text{states}} e^{-E/(k_B T)} \approx \int_0^\infty \hat{\Omega}(E)\, e^{-E/(k_B T)}\, dE

Letting x = E/(k_B T) and evaluating the integral:

Z_1 = L^3\left(\frac{m k_B T}{2\pi\hbar^2}\right)^{3/2}

Free Energy

F = -k_B T\ln Z_1 = -\frac{3}{2} k_B T \ln\!\left(\frac{m k_B T L^2}{2\pi\hbar^2}\right)

Average Energy

\langle E\rangle = k_B T^2\,\frac{\partial\ln Z_1}{\partial T} = \frac{3}{2} k_B T
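A numerical sketch (my own, in units with ℏ = m = kB = 1 and an assumed box size L) comparing the direct sum over quantum numbers with the formula for Z1; the sum over states factorises into three identical one-dimensional sums:

```python
import numpy as np

# Units with hbar = m = k_B = 1.
L, T = 100.0, 1.0
n = np.arange(1, 2000)                       # enough terms for convergence
a = np.pi**2 / (2 * L**2 * T)                # E = a*(nx^2 + ny^2 + nz^2) in these units
Z1_sum = np.exp(-a * n**2).sum() ** 3        # direct sum over (nx, ny, nz)
Z1_formula = L**3 * (T / (2 * np.pi))**1.5   # Z1 = V (m kB T / 2 pi hbar^2)^(3/2)
print(Z1_sum, Z1_formula)                    # should agree to within a few percent
```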

Page 14: Section 3 - Statistical Mechanics

Pressure

p = -\left(\frac{dF}{dV}\right)_T = \frac{k_B T}{V}

(the ideal gas law pV = k_B T for a single particle in the box).

A Note about Degeneracy

If more than one microstate in a system has the same energy we say that the energy level is

degenerate, in fact if there are g microstates then the level has a degeneracy, g. In general we can

write:

Z = \sum_{\text{allowed energies } E_i} g_i\, e^{-E_i/(k_B T)} \qquad\text{where } g_i \text{ is the degeneracy of level } i

Also for electrons, microstates come in pairs, the result of this is that the density of states gets

multiplied by 2 as long as the up and down spin states have the same energy.

Page 15: Section 3 - Statistical Mechanics

3. Ideal Quantum Systems with More Than One Particle

Realistic systems (solid, liquid, gas etc.) consist of many quantum particles (atoms, molecules etc.).

We will consider some examples of such systems, showing how to count microstates and how we

may assume a classical description.

3.1 Identical Particles Compared to Distinguishable Particles

Assume: (1) all particles are identical; (2) the energy of the system is the sum of the energies of each

individual particle. To solve the problem, solve the Schrödinger equation for just one particle, find

the energy levels, Ei, and their degeneracies, gi.

For many (N) particles we define the microstates of the N-particle system in terms of the 1 particle

system. For distinguishable particles, a microstate, M, might be: particle 1 in state m1, particle 2 in

state m2 … particle N in state mN.

mi refers to the microstate of the 1 particle system.

But for identical/indistinguishable particles a microstate, M, must be defined as (e.g.): state m1 has 1

particle, state m2 has no particles etc. defined for all m states.

When we apply PEAPP, there are two ways of defining the microstates that lead to different

conclusions. This means we MUST choose the correct definition for the situation (in the majority of

cases this will be the identical particle definition.)

3.2 Fermions and Bosons

Quantum particles are either fermions or bosons. For fermions, each state, m, contains at most one

particle (due to the Pauli Exclusion Principle). For bosons, each state, m, may contain any number of

particles. Usually in these systems it is convenient to use the grand canonical ensemble.

If M is a microstate of the many particle system with NM particles and total energy EM then the

probability of finding this microstate is:

P(M) = \frac{1}{\Xi}\, e^{-(E_M - \mu N_M)/(k_B T)} \qquad\text{where}\qquad \Xi = \sum_M e^{-(E_M - \mu N_M)/(k_B T)}

Note: N_M = \sum_m n_m and E_M = \sum_m n_m E_m, with n_m being the number of particles in state m.

Page 16: Section 3 - Statistical Mechanics

3.3 Fermionic Systems

Consider just one state, m, of the 1-particle system, in the N-particle system, state m may contain 0

or 1 fermions.

For this state, the grand partition function is:

\Xi_m = 1 + e^{-(E_m - \mu)/(k_B T)}

The probability that state m contains 1 fermion is then:

P(n_m = 1) = \frac{e^{-(E_m - \mu)/(k_B T)}}{1 + e^{-(E_m - \mu)/(k_B T)}} = \frac{1}{e^{(E_m - \mu)/(k_B T)} + 1} = n_{FD}(E_m)

n_FD(Em) is called the Fermi-Dirac distribution function; it gives the average number of fermions in a non-degenerate energy level with energy E.

Note: Fermions have ‘spin’, often the up and down states have the same energy in which case g = 2,

two fermions may have the same energy if they have different spins, or if there is some other

degeneracy in the system.

Luckily, for ideal systems the grand partition function factorises into a product of contributions from each microstate of the 1-particle system. This means that the state occupancies are given by nFD(E) even

if we have many microstates of the 1-particle system. For ideal systems each state can be treated

separately.

Page 17: Section 3 - Statistical Mechanics

3.4 Bosonic States

If state m contains n bosons (n = 0, 1, 2, …) then the grand partition function for this state is:

\Xi_m = \sum_{n=0}^{\infty} e^{-n(E_m - \mu)/(k_B T)}

This is a geometric progression (see any C1 book from A-level maths):

\Xi_m = \frac{1}{1 - e^{-(E_m - \mu)/(k_B T)}}

The average number of particles in state m is:

\langle n_m\rangle = \frac{1}{\Xi_m}\sum_{n=0}^{\infty} n\, e^{-n(E_m - \mu)/(k_B T)}

Notice that this is the same as:

\langle n_m\rangle = k_B T\,\frac{\partial\ln\Xi_m}{\partial\mu}

Taking the derivative:

\langle n_m\rangle = \frac{1}{e^{(E_m - \mu)/(k_B T)} - 1} = n_{BE}(E_m)

With nBE(E) being the Bose-Einstein distribution function, giving the average number of particles in a

bosonic energy level.
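A small sketch (my own) evaluating the two distribution functions and their common classical limit:

```python
import numpy as np

kB = 1.0

def n_FD(E, mu, T):
    return 1.0 / (np.exp((E - mu) / (kB * T)) + 1.0)

def n_BE(E, mu, T):
    return 1.0 / (np.exp((E - mu) / (kB * T)) - 1.0)

mu, T = 0.0, 1.0
for E in (0.5, 2.0, 10.0):
    print(E, n_FD(E, mu, T), n_BE(E, mu, T), np.exp(-(E - mu) / (kB * T)))
# for E - mu >> kB*T all three numbers coincide (the classical limit)
```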

Page 18: Section 3 - Statistical Mechanics

3.5 Combining nx(E) with the Density of States Ω̂(E)

Instead of one state m, consider many states, with density of states Ω̂(E).

We have the average number of particles and the average energy of the whole system given by:

\langle N\rangle = \int dE\,\hat{\Omega}(E)\, n_x(E), \qquad \langle E\rangle = \int dE\,\hat{\Omega}(E)\, E\, n_x(E)

There is an exception to this, bosons at low temperature.

In section 2.3 we showed that for an ideal gas:

\hat{\Omega}(E) = \frac{V}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2} E^{1/2}

(neglecting spin, if it’s an electron in a box then a factor of 2 is introduced)

So we can plug this into the formulae for <N> and <E> to get the number of particles as a function of

(m, V, T, μ) and hence average the energy as a function of (m, V, T, μ or density ρ)

3.6 Examples of gases of Fermion and Bosons

1. The Classical Limit: If E_m - \mu \gg k_B T then:

n_x(E_m) \approx e^{-(E_m - \mu)/(k_B T)}

for both fermions and bosons; this is known as the classical limit.

For an ideal gas this happens when the density N/V \ll n_Q, where n_Q is the quantum concentration:

n_Q = \left(\frac{m k_B T}{2\pi\hbar^2}\right)^{3/2}

The estimate of the quantum uncertainty in particle positions, λ, can be written as:

\lambda = \left(\frac{2\pi\hbar^2}{m k_B T}\right)^{1/2}

In the classical limit we can easily work with a fixed number of particles, N, but for identical

(indistinguishable) particles the partition function for N particles is:

Z_N = \frac{1}{N!}\, Z_1^N

where Z_1 = n_Q V is the partition function of one particle in a box of volume V.
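A short sketch (my own, with an assumed helium-like gas at room temperature) comparing the quantum concentration with the number density, to check whether the classical limit applies:

```python
import numpy as np

hbar = 1.055e-34   # J s
kB   = 1.381e-23   # J/K
m    = 6.6e-27     # kg, roughly a helium atom (assumed example)
T    = 300.0       # K

n_Q = (m * kB * T / (2 * np.pi * hbar**2))**1.5   # quantum concentration, m^-3
n   = 2.5e25                                      # typical gas number density at 1 atm, m^-3

print(f"n_Q = {n_Q:.2e} m^-3, n = {n:.2e} m^-3, n/n_Q = {n/n_Q:.1e}")
# n << n_Q, so the classical limit is a good approximation for this gas
```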

Page 19: Section 3 - Statistical Mechanics

2. Blackbody radiation (an ideal gas of photons)

Consider photons in a box, the microstates are the same as those of a quantum particle in a box.

States are labelled by a wave-vector:

\mathbf{k} = \frac{\pi}{L}(n_x, n_y, n_z)

The number of states with k between k_0 and k_0 + δk is g(k)δk, with:

g(k) = \frac{k^2 L^3}{2\pi^2}

For photons, E = \hbar\omega and \omega = ck. We want the density of states:

\hat{\Omega}(E) = g(k)\,\frac{dk}{dE} = \frac{g(k)}{\hbar c}

Each microstate has degeneracy 2, to take care of polarisation, so:

\hat{\Omega}(E) = \frac{E^2 L^3}{\pi^2\hbar^3 c^3}

This is known as the photon density of states.

Photons are bosons. They can be created and destroyed spontaneously, this means that μ = 0

So the total energy of the system is:

U = \langle E\rangle = \int_0^\infty dE\,\hat{\Omega}(E)\, E\, n_{BE}(E) = \int_0^\infty dE\,\frac{L^3 E^3}{\pi^2\hbar^3 c^3}\,\frac{1}{e^{E/(k_B T)} - 1}

Changing variables from E to ω (E = ℏω):

U = \int_0^\infty d\omega\,\frac{L^3\hbar\,\omega^3}{\pi^2 c^3}\,\frac{1}{e^{\hbar\omega/(k_B T)} - 1}

This is the Planck Distribution.
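A brief numerical sketch (my own) integrating the Planck spectrum above and comparing with the exact result U/V = π² kB⁴ T⁴ / (15 ℏ³ c³), which follows from doing the integral analytically:

```python
import numpy as np

hbar, kB, c = 1.055e-34, 1.381e-23, 3.0e8
T = 300.0   # K

def u_omega(w):
    """Spectral energy density per unit volume (the Planck formula above)."""
    return (hbar * w**3 / (np.pi**2 * c**3)) / np.expm1(hbar * w / (kB * T))

# simple Riemann sum; the spectrum is negligible beyond ~50 kB T / hbar
w = np.linspace(1e9, 50 * kB * T / hbar, 200_000)
U_per_V = np.sum(u_omega(w)) * (w[1] - w[0])

a = np.pi**2 * kB**4 / (15 * hbar**3 * c**3)   # radiation constant from the exact integral
print(U_per_V, a * T**4)                       # should agree closely
```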

Page 20: Section 3 - Statistical Mechanics

3.S Summary

1. Take care counting microstates if you have identical quantum particles, remember that

“particle A in state a, particle B in state b” is not a different state to “particle B in state a,

particle A in state b”

2. The average number of particles in an energy level with energy E is:

n_{FD}(E) = \frac{1}{e^{(E - \mu)/(k_B T)} + 1} \quad\text{for fermions, and}\quad n_{BE}(E) = \frac{1}{e^{(E - \mu)/(k_B T)} - 1} \quad\text{for bosons.}

3. The classical limit at high temperatures:

n_{FD}(E) \approx n_{BE}(E) \approx e^{-(E - \mu)/(k_B T)} \ll 1

The equivalent statement for an ideal gas is that the number density N/V \ll n_Q, where

n_Q = \left(\frac{m k_B T}{2\pi\hbar^2}\right)^{3/2}

Another equivalent statement is that the quantum uncertainty in particle positions <<

particle spacing.

Page 21: Section 3 - Statistical Mechanics

4. Statistical Mechanics in Practice

For the purposes of this course, there are three important kinds of statistical mechanics calculation

you need to be able to do. We will discuss these three methods in this section.

4.1 Microcanonical

Route 1: When you know the value of the energy of a system, but not its temperature, the usual

method is:

1. Calculate the number of microstates that are available (Ω(E)) Where E is the energy.

2. Calculate the entropy through S = k_B \ln\Omega

3. Use derivatives to calculate physical quantities (see below)

**DIAGRAM**

PEAPP states all states with energy E are equally likely in a closed system.

Examples (PS1)

The route can be modified. In PS1-Q4 we counted the number of states for a crystal with m

vacancies. In the second section, we considered a number of states of a rubber band with length L.

Generally we calculate entropy by counting states for these types of questions.

Derivatives of the Entropy

For a system of N particles in a volume V, with energy E, the number of states Ω and the entropy S

depend on (E, V, N). As discussed in the previous sections, we can obtain the temperature T,

pressure p and chemical potential μ using:

\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}, \qquad \frac{p}{T} = \left(\frac{\partial S}{\partial V}\right)_{E,N}, \qquad \frac{\mu}{T} = -\left(\frac{\partial S}{\partial N}\right)_{E,V}

So if you know Ω as a function of (E, V, N) you can calculate (T, p, μ) by taking derivatives of S.

You can remember the above formulae from:

dU = T\,dS - p\,dV + \mu\,dN \qquad\Rightarrow\qquad dS = \frac{1}{T}\,dU + \frac{p}{T}\,dV - \frac{\mu}{T}\,dN
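A sketch (my own, reusing the QHO-counting example of section 1) of Route 1 in practice: count Ω, take S = kB ln Ω, then estimate T from a finite-difference derivative of S with respect to energy:

```python
from math import comb, log

kB, hbar_omega = 1.0, 1.0     # convenient units
N = 200                       # oscillators

def S(M):
    """Entropy of N oscillators holding M quanta, S = kB ln Omega."""
    return kB * log(comb(M + N - 1, M))

M = 100
# 1/T = dS/dE with E = M*hbar_omega, estimated by a central difference
one_over_T = (S(M + 1) - S(M - 1)) / (2 * hbar_omega)
print("k_B T / (hbar omega) =", 1.0 / one_over_T)
# analytic result (Stirling approximation): kB T = hbar*omega / ln(1 + N/M)
print("analytic:", 1.0 / log(1 + N / M))
```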

Page 22: Section 3 - Statistical Mechanics

4.2 Boltzmann Distributions

Route 2: When you know the temperature of a system but not its energy you can calculate the

probability of finding the system in a particular microstate.

If the energy of the microstate is Em then this probability is:

P(m) = \frac{1}{Z}\, e^{-E_m/(k_B T)} \qquad\text{where}\qquad Z = \sum_{\text{all states } m} e^{-E_m/(k_B T)}

If there are gi microstates with energy Ei then it follows that the probability that the system has

energy Ei is:

All Energies E

1( ) where

i i

B B

i

E E

k T k T

i i iP E g e Z g eZ

Example (See PS2-Q2)

Calculate P(m2)/P(m1) where m2 state has energy E2 and m1 has E1.

\frac{P(m_2)}{P(m_1)} = \frac{\frac{1}{Z}e^{-E_2/(k_B T)}}{\frac{1}{Z}e^{-E_1/(k_B T)}} = e^{-(E_2 - E_1)/(k_B T)} \qquad (= 0.33 \text{ for PS2-Q2})

In this case there are two states with energy E1 and eight states with energy E2. If m1 is one of the states with energy E1 and m2 one of the states with energy E2, then P(m2)/P(m1) = 0.33, but the probability of having energy E2 is P(E2) = 8 P(m2), so:

\frac{P(E_2)}{P(E_1)} = \frac{8\, e^{-E_2/(k_B T)}}{2\, e^{-E_1/(k_B T)}} = 4\, e^{-(E_2 - E_1)/(k_B T)} \approx 1.3
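A minimal sketch (my own, with an assumed energy gap chosen so that the Boltzmann factor is about 0.33, and the degeneracies 2 and 8 from the example above):

```python
import numpy as np

kB_T = 1.0
E1, E2 = 0.0, 1.1 * kB_T        # assumed energies; E2 - E1 chosen so exp(-dE/kBT) ~ 0.33
g1, g2 = 2, 8                   # degeneracies from the example above

ratio_states = np.exp(-(E2 - E1) / kB_T)       # P(m2)/P(m1)
ratio_energies = (g2 / g1) * ratio_states      # P(E2)/P(E1)
print(ratio_states, ratio_energies)
```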

Page 23: Section 3 - Statistical Mechanics

4.3 Free Energies

Route 3: When you know the temperature of a system but not its energy, you can calculate the free

energy from the partition function Z. This lets you derive the macroscopic quantities as in route one,

the method is:

1. Write down the partition function as a sum

2. Calculate the sum and then use F = -k_B T \ln Z to calculate the free energy

3. Use derivatives of F or log Z to calculate pressure, energy, etc. (see below)

Examples

PS2-Q4 and PS3-Q1, also paramagnet, and particle in a box examples from previous sections.

Derivatives of F and log Z

We use derivatives of F in a similar way to the derivatives of S.

For a system of N particles in a volume V, at temperature T, the partition function Z and the free

energy F depend on (N, V, T). As discussed in the lectures, we obtain the entropy S, pressure p and

chemical potential μ using:

S = -\left(\frac{\partial F}{\partial T}\right)_{V,N}, \qquad p = -\left(\frac{\partial F}{\partial V}\right)_{T,N}, \qquad \mu = \left(\frac{\partial F}{\partial N}\right)_{T,V}

Also:

U = \langle E\rangle = k_B T^2\left(\frac{\partial\ln Z}{\partial T}\right)_{N,V}

By knowing Z we can calculate F, and thus all the other physical quantities.
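A sketch (my own) of Route 3 for the paramagnet of section 2: compute F(T) = −N kB T ln Z1 and obtain S and U by a finite-difference temperature derivative, checking U against the tanh result above:

```python
import numpy as np

kB, mu, B, N = 1.0, 1.0, 1.0, 100    # arbitrary units

def F(T):
    Z1 = np.exp(mu * B / (kB * T)) + np.exp(-mu * B / (kB * T))
    return -N * kB * T * np.log(Z1)

T, h = 2.0, 1e-5
S = -(F(T + h) - F(T - h)) / (2 * h)          # S = -(dF/dT)
U = F(T) + T * S                              # U = F + T S
print(U, -N * mu * B * np.tanh(mu * B / (kB * T)))   # the two should agree
```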

Page 24: Section 3 - Statistical Mechanics

4.S Other Helpful Hints

In addition to familiarity with the types of calculation given here, the most important features of

statistical mechanics that come up are:

Definitions: microstates, macrostates, PEAPP, density of states.

How to count microstates in systems with many identical (indistinguishable) particles

Physical interpretations of Fermi-Dirac and Bose-Einstein distribution functions nFD(E) and

nBE(E).

Properties of ideal gases of classical particles, fermions, photons and other bosons.

Extra Information on Counting Microstates of Identical Particles (PS4-Q2)

1 Particle (three levels E_0, E_1, E_2, as in PS4-Q2):

Z = e^{-E_0/(k_B T)} + e^{-E_1/(k_B T)} + e^{-E_2/(k_B T)}

2 Identical Fermions (same spin), so no level may hold more than one particle:

Z = e^{-(E_0 + E_1)/(k_B T)} + e^{-(E_0 + E_2)/(k_B T)} + e^{-(E_1 + E_2)/(k_B T)}

Probability of being in state with 1 fermion in E1 and 1 in E2:

P(m) = \frac{e^{-(E_1 + E_2)/(k_B T)}}{Z}

Page 25: Section 3 - Statistical Mechanics

2 Identical Bosons:

Z = \sum_{\text{all states}} e^{-(E_a + E_b)/(k_B T)}

where the sum runs over all pairs of 1-particle states a ≤ b (for bosons the two particles may also occupy the same state).
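Finally, a closing sketch (my own, using three assumed single-particle levels as in the example above) that enumerates the two-particle microstates and builds Z for distinguishable particles, identical fermions (same spin) and identical bosons:

```python
import numpy as np
from itertools import product, combinations, combinations_with_replacement

kB_T = 1.0
E = [0.0, 1.0, 2.0]            # assumed single-particle energies E0, E1, E2

def boltz(levels):
    return np.exp(-sum(levels) / kB_T)

Z_dist  = sum(boltz(p) for p in product(E, repeat=2))                    # labelled particles
Z_fermi = sum(boltz(p) for p in combinations(E, 2))                      # at most one per state
Z_bose  = sum(boltz(p) for p in combinations_with_replacement(E, 2))     # any occupation
print(Z_dist, Z_fermi, Z_bose)
# note Z_dist != 2 * Z_bose in general: the "classical" correction Z_dist/2!
# only works when double occupation is negligible
```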