Statistical Lecture1 2


Transcript of Statistical Lecture1 2

Page 1: Statistical Lecture1 2

7/23/2019 Statistical Lecture1 2

http://slidepdf.com/reader/full/statistical-lecture1-2 1/28

Scope of statistical mechanics

Review of Classical Thermodynamics

Equilibrium

The Ensemble Concept (Heuristic)

Ensembles

microcanonical ensemble

canonical ensemble

grand canonical ensemble

Boltzmann Distribution

nomenclature – sums and products, Stirling’s approximation

Distributions and arrangements in an ensemble

The most probable distribution in the canonical ensemble

Recommended Texts:

Atkins & de Paula, "Atkins' Physical Chemistry", 10th edition, Oxford.

Maczek, 'Statistical Thermodynamics', Oxford Chemistry Primers.

Widom, 'Statistical Mechanics: A Concise Introduction for Chemists', Cambridge University Press.


UCL DEPARTMENT OF CHEMISTRY
CHEM2301/2304: Physical Chemistry
Module 3: Statistical Mechanics
Professor Francesco Luigi Gervasio

Page 2: Statistical Lecture1 2


1. Scope of Statistical Mechanics

Statistical mechanics is a complete subject: it spans from classical thermodynamics, through spectroscopy, to quantum mechanics, and it provides the ability to calculate equilibrium constants from spectroscopic data using classical thermodynamics.

While the laws of thermodynamics are broad empirical generalizations which allow correlations to be made among the properties of macroscopic systems (10^20 or more atoms) without any reference to the underlying microscopic structure of matter, statistical mechanics relates the properties of macroscopic systems to their microscopic constitution.

Statistical thermodynamics (a branch of statistical mechanics) is devoted to calculating the thermodynamic functions of a system of given composition when the interactions between the system's components are known.

Page 3: Statistical Lecture1 2


1. Scope of Statistical Mechanics (2)

In classical mechanics, a precisely known initial state of a small system yields precise predictions of its final state. For large systems we have only reasonable, incomplete knowledge of the initial state, so the predictions of statistical mechanics involve probabilities.

In quantum mechanics, eigenvalues are exact and expectation values are precise,

⟨x⟩ = ∫ ψ* X ψ dτ

but the theory itself has a statistical basis.

Page 4: Statistical Lecture1 2


As N increases we might expect the complexity and intricacy of the properties to increase. But new regularities appear, because there are many degrees of freedom and new statistical laws apply.

Thermodynamics provides connections between many properties, but says nothing about the value of any one of them. It makes no reference to atoms and molecules, and many systems are too complicated to be characterized microscopically.

Page 5: Statistical Lecture1 2


2. Review of Classical Thermodynamics

The laws of thermodynamics consist of two parts:

(a) the equalities; for example, changes in the internal energy U:

dU = T dS − p dV

This is the 'fundamental equation'. It follows from a combination of the first law of thermodynamics,

dU = δq + δw

(the change in internal energy is given by the heat absorbed by the system plus the work done on the system), with

δw_rev = −p dV (expansion work)
δq_rev = T dS (from the definition of entropy)

dU is an EXACT differential: its value is independent of the path.

Page 6: Statistical Lecture1 2


(b) the inequalities for the system σ:

(∂S^σ/∂t)_{V^σ, U^σ, N^σ} ≥ 0

i.e. the entropy S increases over time. This is the fundamental inequality of the second law of thermodynamics.

1. Time is denoted by t and is treated like any other variable when you differentiate.
2. The superscript σ means the whole system, consisting of phases α, β, ..., so σ = α + β + ...
3. N is the total amount irrespective of species and phase: N = (nA + nB + ...)^σ

The system σ is closed.

Page 7: Statistical Lecture1 2


3. Equilibrium

All the conditions for equilibrium and stability can be derived from the fundamental inequality (S increases over time):

(∂S/∂t)_{V, U, N} ≥ 0

Entropy is a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work. It is often erroneously referred to as the 'state of disorder' of a system. Qualitatively, entropy is a measure of how evenly energy is distributed in a system.

We will soon establish its connection to W, the number of configurations (or microstates) corresponding to the macrostate, through the Boltzmann equation:

S = k ln W (please note that sometimes Ω is used instead of W)
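As a minimal numerical illustration (my example, not from the slides), S = k ln W for the W = 6 microstates of the 2-heads/2-tails macrostate of 4 coins:

```python
import math

k_B = 1.380649e-23           # Boltzmann constant, J/K

# Number of microstates W for the 2-heads/2-tails macrostate of 4 coins
W = math.comb(4, 2)          # 4!/(2! 2!) = 6
S = k_B * math.log(W)        # Boltzmann entropy S = k ln W

print(W)                     # 6
print(S)                     # ~2.47e-23 J/K
```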


Page 8: Statistical Lecture1 2


The variables held constant (V, U, N) in the fundamental inequality are experimentally inconvenient. Usually it is possible to keep the temperature, the pressure and/or the volume constant. Thus it is easier to use one of the other thermodynamic inequalities (which can be derived from the fundamental one using the properties of partial derivatives):

Enthalpy, H(S, p) = U + pV, is the preferred expression of system energy changes in many chemical, biological, and physical measurements at constant pressure, because it simplifies the description of energy transfer. Using the fundamental inequality and the properties of partial derivatives it can be shown that:

(∂H/∂t)_{S, p, N} ≤ 0

H decreases spontaneously as a function of time: it has a minimum. At constant pressure and entropy, the enthalpy change ΔH is equal to the heat flow into the system.

Page 9: Statistical Lecture1 2


Gibbs and Helmholtz free energy

G and A spontaneously decrease as a function of time: they have a minimum.

(∂A/∂t)_{T, V, N} ≤ 0        (∂G/∂t)_{T, p, N} ≤ 0

The Gibbs 'free energy' is a thermodynamic potential that measures the maximum or reversible work that may be performed by a thermodynamic system at constant temperature and pressure (the most common experimental conditions):

G(p, T) = H − TS ;  dG = −S dT + V dp + Σ_B µ_B dn_B

The Helmholtz free energy is a thermodynamic potential that measures the maximum or reversible work that may be performed by a thermodynamic system at constant temperature and volume:

A(T, V) = U − TS

Using the fundamental inequality, it can be shown that

dA = −S dT − p dV + Σ_B µ_B dn_B

Page 10: Statistical Lecture1 2


4. The Ensemble Concept (Heuristic definition)

For a typical macroscopic system, the total number of particles is N ~ 10^23. One way to calculate the properties of a system is to follow its evolution over time and then take an average. For example, suppose we measure the pressure of 1 dm^3 of a gas with a manometer and the measurement takes 10 s. From simple kinetic theory, the collision density for gaseous N2 at ambient conditions is about Z11 = 10^35 m^-3 s^-1, so the pressure measurement averages over

(10^35 m^-3 s^-1)(10^-3 m^3)(10 s) = 10^33 molecular events

How long would it take to calculate the pressure on this basis? With a 1 GHz computer it would take roughly

10^27 s ≈ 3 × 10^19 years

Compare this with:

the age of the earth: 4 billion years
the time since the big bang: (13.75 ± 0.17) billion years

Such an approach is not practical.


Page 11: Statistical Lecture1 2


Alternatively, surround the system with N replica systems and impose thermodynamic constraints so that all systems have identical macroscopic properties but are allowed different microscopic properties. We say we have an ensemble of systems.

Average over the systems in the ensemble:

X̄ = (1/N) Σ_{j=1}^{N} X_j

Provided N is sufficiently large, X̄ is the "experimental" value of X for the system.

A large number of observations made on a single system at N arbitrary instants of time have the same statistical properties as observing N arbitrarily chosen systems at the same time from an ensemble of similar systems: we have replaced a time average (many observations on a single system) by a number average (a single observation of many systems). This approach was developed independently by Gibbs and by Einstein.

Since from a macroscopic point of view precise microscopic details are largely unimportant, we use the ensemble concept to "wash out" the microscopic details.
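The equivalence of the two kinds of average can be illustrated with a toy sketch (my example, not the lecture's; the "system" here is just a biased coin):

```python
# Illustrative sketch: a time average over one system matches an
# ensemble average over many replica systems.
import random

random.seed(42)
p = 0.3  # assumed probability of observing a "1" in any one measurement

# Time average: one system observed at 100,000 instants of time
one_system = [1 if random.random() < p else 0 for _ in range(100_000)]
time_avg = sum(one_system) / len(one_system)

# Ensemble average: 100,000 replica systems, one observation each
ensemble = [1 if random.random() < p else 0 for _ in range(100_000)]
ensemble_avg = sum(ensemble) / len(ensemble)

print(time_avg, ensemble_avg)  # both close to 0.3
```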


Page 12: Statistical Lecture1 2


5.1 Ensembles

There are several different types of ensembles, depending on which thermodynamic variables are common to the systems in the ensemble.

5.1(a) Microcanonical Ensemble: common N, V, U

Walls: impermeable (no exchange of N), rigid (no exchange of V), adiabatic (no exchange of U). Each system is isolated.

Common densities: V/N, U/N. Common fields: none.

(Diagram: a grid of replica systems, each labelled N, U, V.)

The independent variables are N, V, U, so the microcanonical ensemble corresponds to the fundamental surface U(S, V, N).

Page 13: Statistical Lecture1 2


It is often difficult to work with the microcanonical ensemble because, for example, to find T we must study the variation in S as U changes.

5.1(b) Canonical Ensemble: common N, V, T

Walls: impermeable (no exchange of N), rigid (no exchange of V), diathermic (exchange of U to achieve uniform T). Each system is closed. The ensemble as a whole is thermally isolated by an adiabatic wall.

(Diagram: a grid of replica systems, each labelled N, V, T.)

The independent variables are N, V, and T, so the canonical ensemble corresponds to the Helmholtz surface A(T, V, N). We will soon find its link to the canonical partition function: A = −kT ln Q.

Page 14: Statistical Lecture1 2


5.1(c) Grand Canonical Ensemble: common µ, V, T

Walls: permeable (exchange of N to achieve uniform µ), rigid (no exchange of V), diathermic (exchange of E to achieve uniform T). Each system is open.

The ensemble is surrounded by a wall that is adiabatic and impermeable; hence the ensemble is thermally isolated and closed with respect to the surroundings.

(Diagram: a grid of replica systems, each labelled µ, V, T.)

Page 15: Statistical Lecture1 2


6. The Boltzmann distribution

The Boltzmann distribution is among the most important equations in chemistry. It is used to predict the population of states in systems at thermal equilibrium and provides an insight into the nature of 'temperature'.

We will adopt the principle of "equal a priori probabilities", i.e. the assumption that the system is equally likely to be found in any of its accessible states with a given energy (e.g. all those with the same energy and composition)*.

We note that the Schroedinger equation Hψ = Eψ can be applied both to isolated molecules and to macroscopic systems. In the latter case we refer to the accessible states Ψ_j of the macroscopic system as complexions of that system.

One very important conclusion is that the overwhelmingly most probable populations of the available states depend on a single parameter, the temperature. As we will see in the following, for the population of quantized energy states it takes the form:


n_i / n_j = e^(−(ε_i − ε_j)/kT)

or, for two macroscopic systems m_i and m_j with energies E_i and E_j:

m_i / m_j = e^(−(E_i − E_j)/kT)

*We have no reason to assume, for a collection of molecules at thermal equilibrium, that a vibrational state of a certain energy is any more or less probable than a rotational state of the same energy.
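As a numerical sketch (my numbers, not the lecture's), the population ratio of two levels separated by a typical vibrational spacing of ~2000 cm^-1 at room temperature:

```python
# Boltzmann population ratio n_i/n_j = exp(-(e_i - e_j)/kT) for an
# assumed level spacing of 2000 cm^-1 at 298 K.
import math

k_B = 1.380649e-23      # J/K
h = 6.62607015e-34      # J s
c = 2.99792458e10       # speed of light, cm/s

T = 298.0               # K
wavenumber = 2000.0     # assumed spacing, cm^-1
delta_e = h * c * wavenumber   # energy gap in J

ratio = math.exp(-delta_e / (k_B * T))
print(f"{ratio:.2e}")   # the upper level is almost unpopulated
```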

Page 16: Statistical Lecture1 2


6.1 Nomenclature: sums and products

You are familiar with the notation for summation,

Σ_{i=1}^{k} x_i = x_1 + x_2 + ... + x_k

which is simplified to Σ_i x_i, or Σ x when there is no ambiguity. By analogy, for a product we write

Π_{i=1}^{k} x_i = x_1 · x_2 · ... · x_k

which often can be simplified to Π_i x_i, or just Π x. Note in particular that

ln(Π x) = ln(x_1 x_2 ... x_k) = ln x_1 + ln x_2 + ... + ln x_k = Σ ln x
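A quick numerical check (my example) of the log-of-a-product identity:

```python
# ln(prod x_i) equals sum ln x_i, up to floating-point error.
import math

xs = [2.0, 3.5, 7.0, 11.0]   # arbitrary positive numbers

lhs = math.log(math.prod(xs))
rhs = sum(math.log(x) for x in xs)

print(abs(lhs - rhs) < 1e-12)  # True
```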

Stirling's approximation for large factorials

When x is large,

ln x! ≈ x ln x − x

For chemical systems, the number of particles is so large that this approximation is essentially exact.
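A short sketch (my example) showing how the relative error of Stirling's approximation shrinks as x grows:

```python
# Relative error of ln x! ~ x ln x - x for increasing x.
import math

rel_errs = []
for x in (10, 100, 1000):
    exact = math.lgamma(x + 1)           # ln(x!) computed without overflow
    stirling = x * math.log(x) - x
    rel_errs.append((exact - stirling) / exact)
    print(x, f"{rel_errs[-1]:.3%}")      # error falls from ~14% to <0.1%
```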


Page 17: Statistical Lecture1 2


6.2 Distributions and Arrangements in an Ensemble

The systems in the ensemble have the same macroscopic properties, but the molecular details are different. Suppose the ensemble has M systems in total; of these, m1 systems have one arrangement of molecular states, with energy E1; m2 systems have a second arrangement of molecular states, with energy E2; and so on. We can group the systems by their molecular states to give a distribution of m1, m2, ... systems.

The number of arrangements of the ensemble with this distribution is

W = M! / (m1! m2! ...)    where    M = m1 + m2 + ... = Σ_j m_j

Combinatory rule: with n1 particles in Box 1 and n2 particles in Box 2 (a total of N = n1 + n2 particles), we can arrange these particles in

W2 = N!/(n1! n2!)

non-equivalent arrangements.
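The combinatory rule is easy to check directly; here for N = 5 particles split 3 and 2 between two boxes (the same 3,2 configuration used in the next example):

```python
# W = N!/(n1! n2!) for a 3,2 split of 5 particles.
import math

n1, n2 = 3, 2
N = n1 + n2
W = math.factorial(N) // (math.factorial(n1) * math.factorial(n2))

print(W)  # 10 non-equivalent arrangements
```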


Page 18: Statistical Lecture1 2


From Atkins' Physical Chemistry 10e Chapter 15.

Example: the possible arrangements of a 3,2 configuration.

Page 19: Statistical Lecture1 2


Example 2: the possible arrangements of 4 coins

HHHH                4!/4! = 1

HTTT
THTT
TTHT                4!/(3!1!) = 4
TTTH

HHTT
TTHH
THTH                4!/(2!2!) = 6
HTHT
THHT
HTTH
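Brute-force enumeration confirms these counts:

```python
# Enumerate all 2^4 outcomes of 4 coins and group them by number of heads.
from itertools import product
from collections import Counter

counts = Counter(seq.count('H') for seq in product('HT', repeat=4))
print(counts[4], counts[3], counts[2])  # 1 4 6, matching 4!/4!, 4!/(3!1!), 4!/(2!2!)
```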

Page 20: Statistical Lecture1 2


Typically we are working with large numbers (M ≈ 10^23), so it is convenient to work with ln W rather than W:

ln W = ln M! − ln(Π m!) = ln M! − ln(m1! m2! ...)
     = ln M! − Σ ln m!
     = M ln M − M − {Σ m ln m − Σ m}
     = M ln M − Σ_j m_j ln m_j

where we have used Stirling's approximation, ln(n!) ≈ n ln n − n (valid for large values of n), for the factorials. Hence:

ln W ≈ M ln M − Σ_j m_j ln m_j     (1)
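A numerical check (with occupation numbers of my own choosing, not the lecture's) that the Stirling form of ln W tracks the exact value:

```python
# Compare the exact ln W = ln(M!/(prod m_j!)) with the Stirling form
# ln W ~ M ln M - sum m_j ln m_j, for assumed occupations m_j.
import math

m = [400, 350, 250]                 # assumed occupation numbers
M = sum(m)                          # M = 1000

exact = math.lgamma(M + 1) - sum(math.lgamma(mj + 1) for mj in m)
approx = M * math.log(M) - sum(mj * math.log(mj) for mj in m)

print(f"{exact:.2f} {approx:.2f}")  # within ~1% already at M = 1000
```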

Page 21: Statistical Lecture1 2


6.3 The most probable distribution in the canonical ensemble

We want to find the most probable distribution, the one for which W is a maximum, for an ensemble in which m1 systems have energy E1, m2 systems have energy E2, and so on.

Nature of the system considered:
1. The thermodynamic system is an assembly of M systems in a state defined by the volume V. The total internal energy of the assembly is Utot.
2. The energies of the individual systems are eigenvalues of the Schroedinger equation.
3. There is no restriction on which energies can be allocated to the individual systems (i.e. any number of systems can have any particular energy level).
4. All complexions associated with a given E and V are equally likely.

But before doing so, we ask the question: how dominant is the most probable distribution? We know from experience that in tossing 100 coins successively many times, the 50-50 configuration of heads and tails will stand above the rest. What would happen with a mole of coins? The distribution of configurations peaks so sharply that no other configuration gets so much as a look in! The same is true for the statistical weight of the most probable distribution, Wmax. If we compare Wmax for a mole of particles to the statistical weight of a distribution that differs by as little as 1 part in 10^10, we find that Wmax is more than 10^434 times more probable!

Page 22: Statistical Lecture1 2


Conservation of number and energy: these are rather obvious constraints that we need to place on our system. The appropriate constraints for the canonical ensemble are:

(i) the total number of systems M is constant:

M − Σ_j m_j = 0

(ii) the entire ensemble is surrounded by an adiabatic wall, so the total energy is constant:

Utot − Σ_i m_i E_i = 0

(I am using Utot instead of Etot for consistency with previous notation: Utot = Σ_i m_i E_i.)

Lagrange's method of undetermined multipliers allows us to find the distribution that maximises ln W subject to the constraints of constant M and constant Utot.

Page 23: Statistical Lecture1 2


The method of Lagrange multipliers is used to find the local maxima and minima of a function subject to equality constraints. For instance, suppose we want to maximize f(x, y) subject to g(x, y) = c. We introduce a new variable (λ), called a Lagrange multiplier, and study the Lagrange function (or Lagrangian) defined by:

F(x, y, λ) = f(x, y) + λ (g(x, y) − c)

Since we are imposing that our constraints "c" are 0, our F takes the form:

F(x, y, λ) = f(x, y) + λ g(x, y)

Form a new function F by adding to ln W each constraint, multiplied by a constant.
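A worked example of the method (mine, not the lecture's): maximize f(x, y) = xy subject to x + y = 4, using sympy to solve the stationarity conditions of the Lagrangian:

```python
# Lagrange-multiplier sketch: maximize f = x*y subject to g = x + y - 4 = 0.
import sympy as sp

x, y, lam = sp.symbols('x y lam')
f = x * y
g = x + y - 4                      # constraint already in "= 0" form

F = f + lam * g                    # the Lagrangian F = f + lam*g
stationary = sp.solve([sp.diff(F, v) for v in (x, y, lam)], (x, y, lam), dict=True)

print(stationary)                  # x = 2, y = 2 (with lam = -2): the constrained maximum
```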

Page 24: Statistical Lecture1 2


Form the new function

F = ln W + α (M − Σ_j m_j) + β (Utot − Σ_j m_j E_j)

and take the derivative with respect to the populations m_j to obtain the conditions that are necessary to make W (or ln W) a maximum:

∂F/∂m_j = ∂/∂m_j {ln W + α (M − Σ_j m_j) + β (Utot − Σ_j m_j E_j)} = 0

The algebra turns out to be easier than it looks, for three reasons:
(i) Utot and M are constant, so their derivatives vanish;
(ii) m_j and E_j occur only once in the sums, so ∂(Σ_j m_j)/∂m_j = 1 and ∂(Σ_j m_j E_j)/∂m_j = E_j;
(iii) α and β are constant.

Page 25: Statistical Lecture1 2


Written out term by term:

∂F/∂m_j = ∂ln W/∂m_j + α ∂(M − Σ_j m_j)/∂m_j + β ∂(Utot − Σ_j m_j E_j)/∂m_j = 0

Using (1),

ln W ≈ M ln M − Σ_j m_j ln m_j

we need

∂/∂m_j (M ln M − Σ_j m_j ln m_j)

(M is constant, so its derivative is 0; the constraint terms contribute −α and −β E_j.)

Page 26: Statistical Lecture1 2


Since M is constant,

∂/∂m_j (M ln M − Σ_j m_j ln m_j) = −ln m_j − 1

so the condition for a maximum becomes

−ln m_j − 1 − α − β E_j = 0

Hence

ln m_j = −1 − α − β E_j

m_j = e^(−1−α) e^(−β E_j)

Page 27: Statistical Lecture1 2


Now sum over all the systems:

m_j = e^(−1−α) e^(−β E_j)

M = Σ_j m_j = e^(−1−α) Σ_j e^(−β E_j)

Divide one equation by the other (the e^(−1−α) cancels) to obtain

m_j / M = e^(−β E_j) / Σ_j e^(−β E_j)

m_j / M is clearly the probability P_j that the system has energy E_j, and it can be shown (but we won't do it here) that

β = 1/kT, and T is uniform in the canonical ensemble.

Σ_j exp(−β E_j) is the canonical partition function Q and is the key quantity for the canonical ensemble.

Page 28: Statistical Lecture1 2


With the canonical partition function defined by

Q = Σ_j exp(−E_j / kT)

we can write

P_j = m_j / M = exp(−E_j / kT) / Q
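As a minimal sketch with assumed toy energy levels (my numbers, not the lecture's), computing Q and the probabilities P_j for a three-level system:

```python
# Canonical probabilities P_j = exp(-E_j/kT)/Q for assumed energy levels.
import math

k_B = 1.380649e-23                    # J/K
T = 300.0                             # K
E = [0.0, 1e-21, 5e-21]               # assumed level energies, J

Q = sum(math.exp(-Ej / (k_B * T)) for Ej in E)   # partition function
P = [math.exp(-Ej / (k_B * T)) / Q for Ej in E]  # level probabilities

print(f"Q = {Q:.3f}")
print([f"{p:.3f}" for p in P])        # probabilities sum to 1; lower levels dominate
```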

In the next lecture we will connect Q to thermodynamic variables.
