Entropy, probability and disorder

Page 1: Entropy, probability and disorder

Entropy, probability and disorder

Page 2: Entropy, probability and disorder

Thermal equilibrium

Experience tells us: two objects in thermal contact will attain the same temperature and keep that temperature

Why? More than just energy conservation!

Involves concept of entropy

Page 3: Entropy, probability and disorder

Entropy and disorder

It is often said that entropy is a measure of disorder, and hence every system in isolation evolves to the state with “most disorder”

Consider a box sliding on a floor:

- internal energy: due to the disorderly motion of the molecules
- kinetic energy (of the box): due to the collective, orderly motion of all the molecules

Page 4: Entropy, probability and disorder

Entropy and disorder II

Now the box comes to rest due to friction

The temperature rises in both floor and box, so the internal energy increases

No more collective motion: all K.E. has been transferred into internal energy

More disorder, so entropy has increased

Page 5: Entropy, probability and disorder

A vessel of two halves

Large number of identical molecules – distribution?

About 50% in the left half, 50% in the right half. Why?

Page 6: Entropy, probability and disorder

Definitions

Microstate: position and momentum of each molecule accurately specified

Macrostate: only overall features specified

Multiplicity: the number of microstates corresponding to the same macrostate

Page 7: Entropy, probability and disorder

Fundamental assumption

Statistical Mechanics is built around this one central assumption:

Every microstate is equally likely to occur

This is just like throwing dice:

Page 8: Entropy, probability and disorder

A throw of the dice

Roll one die: 1/2/3/4/5/6 all equally likely

Roll a pair of dice: for each 1/2/3/4/5/6 equally likely the sum 7 is most likely, then 6 and 8, etc.

Why? 6 combinations (microstates) give 7 (the macrostate): 1+6, 2+5, 3+4, 4+3, 5+2, 6+1. There are 5 combinations that give 6 or 8, etc.
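These multiplicities are easy to verify by brute force; a minimal Python sketch (mine, not part of the original slides) that enumerates all 36 microstates:

```python
from collections import Counter
from itertools import product

# All 36 (die1, die2) microstates are equally likely.
multiplicity = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in range(2, 13):
    print(f"sum {total:2d}: multiplicity {multiplicity[total]} -> "
          f"probability {multiplicity[total]}/36")
```

Running it shows the peak of 6 microstates at a sum of 7, with 5 each for 6 and 8.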

Page 9: Entropy, probability and disorder

Four identical molecules

4 molecules A, B, C, D; 5 macrostates:

Page 10: Entropy, probability and disorder

Four identical molecules (2)

left        right
A&B&C&D     –           multiplicity: 1

A&B&C       D
A&B&D       C
A&C&D       B
B&C&D       A           multiplicity: 4

Page 11: Entropy, probability and disorder

Four identical molecules (3)

left        right
A&B         C&D
A&C         B&D
A&D         B&C
B&C         A&D
B&D         A&C
C&D         A&B         multiplicity: 6

Page 12: Entropy, probability and disorder

Four identical molecules (4)

#left   #right   multiplicity   probability
4       0        1              1/16
3       1        4              4/16
2       2        6              6/16
1       3        4              4/16
0       4        1              1/16
total            16             1
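The same kind of enumeration reproduces this table; a short sketch (my own check, not from the slides):

```python
from collections import Counter
from itertools import product

# Each of the 4 molecules (A, B, C, D) independently sits in the
# left ('L') or right ('R') half: 2**4 = 16 equally likely microstates.
counts = Counter(sum(side == "L" for side in state)
                 for state in product("LR", repeat=4))

for n_left in range(4, -1, -1):
    print(f"#left = {n_left}: multiplicity {counts[n_left]}, "
          f"probability {counts[n_left]}/16")
```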

Page 13: Entropy, probability and disorder

Ten identical molecules

Multiplicity to find 10 – 9 – … – 0 molecules on the left:
1 – 10 – 45 – 120 – 210 – 252 – 210 – 120 – 45 – 10 – 1

Probability of finding #left = 4, 5 or 6: (210 + 252 + 210)/1024 ≈ 0.66

For large N: extremely likely that #left is very close to N/2
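These multiplicities are binomial coefficients, so they can be checked with math.comb; a minimal sketch (mine, not from the slides):

```python
from math import comb

N = 10
mult = [comb(N, k) for k in range(N + 1)]
print(mult)   # [1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1]

# Probability that 4, 5 or 6 molecules are on the left:
p = sum(comb(N, k) for k in (4, 5, 6)) / 2 ** N
print(p)      # (210 + 252 + 210)/1024 = 0.65625, i.e. about 0.66
```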

Page 14: Entropy, probability and disorder

Generalisation

Look at a gas of N molecules in a vessel with two “halves”.

The total number of microstates is 2^N: two possible locations for each molecule. We've just seen the N = 4 example.

Page 15: Entropy, probability and disorder

Binomial distribution I

A gas contains N molecules, N1 in the left half (“state 1”) and N2 = N – N1 in state 2 (the right half). How many microstates correspond to this situation?

[Figure: vessel divided into two parts holding N1 and N2 molecules.]

Page 16: Entropy, probability and disorder

Binomial distribution II

Pick the molecules one by one and place them in the left-hand side:

- choose from N molecules for the first molecule
- from N – 1 for the second
- from N – 2 for the third, …
- from N – N1 + 1 for the N1-th molecule

Page 17: Entropy, probability and disorder

Binomial distribution III

Number of ways of getting N1 molecules into the left half:

$$N(N-1)(N-2)\cdots(N-N_1+1) = \frac{N!}{(N-N_1)!}$$

The macrostate doesn't depend on the order of picking these molecules; there are N1! ways of picking them. The multiplicity is the mathematical "combination":

$$\Omega = {}^{N}C_{N_1} = \frac{N!}{N_1!\,(N-N_1)!}$$

Page 18: Entropy, probability and disorder

Verification

Look at a gas with molecules A,B,C,D,E.

Look at the number of ways of putting 2 molecules into the left half of the vessel.

So: N = 5, N1 = 2, N – N1 = 3

Page 19: Entropy, probability and disorder

Verification II

The first molecule is A, B, C, D, or E.

Pick the second molecule. If I first picked A then I can now pick B, C, D or E, etc:

AB BA CA DA EA

AC BC CB DB EB

AD BD CD DC EC

AE BE CE DE ED

That is $5 \times 4 = \frac{5\cdot 4\cdot 3\cdot 2\cdot 1}{3\cdot 2\cdot 1} = \frac{5!}{3!} = 20$ possibilities

Page 20: Entropy, probability and disorder

Verification III

In the end I don’t care which molecule went in first. So all pairs AB and BA, AC and CA, etc, really correspond to the same situation. We must divide by 2!=2.

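In code, the divide-by-2! step is exactly the difference between permutations and combinations; a sketch:

```python
from itertools import combinations, permutations

molecules = "ABCDE"
ordered = list(permutations(molecules, 2))    # 5 * 4 = 20 ordered picks
unordered = list(combinations(molecules, 2))  # 20 / 2! = 10 distinct pairs

print(len(ordered), len(unordered))           # 20 10
print(unordered)  # the 10 ways of choosing 2 of the 5 molecules
```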

Page 21: Entropy, probability and disorder

Binomial distribution plotted

Look at N=4, 10, 1000:

[Plots: P(N1) against N1 for N = 4, 10, 1000.]
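The sharpening of the peak can also be quantified without a plot; for instance, with a ±5% window of my own choosing:

```python
from math import comb

# Probability that the left-half count N1 is within 5% of N/2,
# i.e. |N1/N - 0.5| <= 0.05, for increasing N.
for N in (10, 100, 1000, 10000):
    ks = [k for k in range(N + 1) if abs(k / N - 0.5) <= 0.05]
    p = sum(comb(N, k) for k in ks) / 2 ** N
    print(f"N = {N:5d}: P = {p:.4f}")  # creeps up towards 1
```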

Page 22: Entropy, probability and disorder

Probability and equilibrium

As time elapses, the molecules will wander all over the vessel

After a certain length of time any molecule could be in either half with equal probability

Given this situation it is overwhelmingly probable that very nearly half of them are in the left half of the vessel

Page 23: Entropy, probability and disorder

Second Law of Thermodynamics

Microscopic version:

If a system with many molecules is permitted to change in isolation, the system will evolve to the macrostate with largest multiplicity and will then remain in that macrostate

Spot the “arrow of time”!

Page 24: Entropy, probability and disorder

Boltzmann’s Epitaph: S = k logW

Boltzmann linked heat, temperature, and multiplicity (!)

Entropy is defined by S = k ln W

W: multiplicity; k: Boltzmann's constant

σ = "dimensionless entropy" = ln W, so that S = kσ

Page 25: Entropy, probability and disorder

Second Law of Thermodynamics

Macroscopic version:

A system evolves to attain the state with maximum entropy

Spot the “arrow of time”!

Page 26: Entropy, probability and disorder

Question 1

Is entropy a state variable?

a) Yes

b) No

c) Depends on the system

Page 27: Entropy, probability and disorder

Question 2

The total entropy of two systems, with respective entropies S1 and S2, is given by

a) S = S1 + S2

b) S = S1 · S2

c) S = S1 – S2

d) S = S1 / S2

Page 28: Entropy, probability and disorder

Entropy and multiplicity

The motion of each molecule of a gas in a vessel can be specified by its location and velocity, so the multiplicity has a location part and a velocity part

Ignore the velocity part for the time being and look at the multiplicity due to location only

Page 29: Entropy, probability and disorder

Multiplicity due to location I

Divide the available space up into c small cells. Put N particles inside the space: Ω = c^N.

For c = 3, N = 2: Ω = 3² = 9

[Figure: the 9 ways of placing molecules A and B in 3 cells.]

Page 30: Entropy, probability and disorder

Multiplicity due to location II

Increasing the available space is equivalent to increasing the number of cells c.

The volume is proportional to the number of cells c.

Hence Ω ∝ V^N

Page 31: Entropy, probability and disorder

“Slow” and “fast” processes

Slow processes are reversible: we’re always very close to equilibrium so we can run things backwards

Fast processes are irreversible: we really upset the system, get it out of equilibrium so we cannot run things backwards (without expending extra energy)

Page 32: Entropy, probability and disorder

Slow isothermal expansion

Slow isothermal expansion of an ideal gas; small volume change ΔV

The "velocity part" of the multiplicity doesn't change since T is constant; the "location part" gives

$$\frac{\Omega_{\text{final}}}{\Omega_{\text{initial}}} = \left(\frac{V+\Delta V}{V}\right)^{N} = \left(1+\frac{\Delta V}{V}\right)^{N}$$

Page 33: Entropy, probability and disorder

Slow isothermal expansion (2)

Use the First Law: T is constant, so ΔU = 0 and Q = W = p ΔV. With the ideal gas law p = NkT/V:

$$Q = p\,\Delta V = \frac{NkT}{V}\,\Delta V \quad\Rightarrow\quad \frac{\Delta V}{V} = \frac{Q}{NkT}$$

$$\frac{\Omega_{\text{final}}}{\Omega_{\text{initial}}} = \left(1+\frac{\Delta V}{V}\right)^{N} = \left(1+\frac{Q}{NkT}\right)^{N}$$

Big numbers: take the logarithm

Page 34: Entropy, probability and disorder

Slow isothermal expansion (3)

Manipulation:

$$\ln\Omega_{\text{final}} - \ln\Omega_{\text{initial}} = \ln\left(1+\frac{Q}{NkT}\right)^{N} = N\ln\left(1+\frac{Q}{NkT}\right) \approx N\,\frac{Q}{NkT} = \frac{Q}{kT}$$

(using ln(1 + x) ≈ x for small x), or

$$k\ln\Omega_{\text{final}} - k\ln\Omega_{\text{initial}} = \frac{Q}{T}$$
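The step ln(1 + x) ≈ x is what turns the multiplicity ratio into Q/(kT); a numeric sanity check, with made-up but representative values:

```python
from math import log

k = 1.38e-23   # J/K, Boltzmann's constant
N = 1e24       # number of molecules (illustrative value)
T = 300.0      # K (illustrative)
Q = 1.0        # J of heat added slowly (illustrative)

x = Q / (N * k * T)               # the small quantity Q/(NkT)
dS_exact = k * N * log(1.0 + x)   # k * ln[(1 + Q/NkT)^N]
dS_approx = Q / T                 # the ln(1+x) ~ x result

print(dS_exact, dS_approx)  # agree to about 1 part in 10^4 here
```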

Page 35: Entropy, probability and disorder

Slow isothermal expansion (4)

Use the definition of entropy:

$$\Delta S = S_{\text{final}} - S_{\text{initial}} = k\ln\Omega_{\text{final}} - k\ln\Omega_{\text{initial}} = \frac{Q}{T}$$

valid for slow isothermal expansion

Page 36: Entropy, probability and disorder

Example

To melt an ice cube of 20 g at 0 °C we slowly add 6700 J of heat. What is the change in entropy? In multiplicity?

ΔS = Q/T = 6700 J / 273 K ≈ 24.5 J K⁻¹

$$\frac{\Omega_{\text{final}}}{\Omega_{\text{initial}}} = e^{\Delta S/k} \approx 10^{770{,}000{,}000{,}000{,}000{,}000{,}000{,}000}$$
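The arithmetic behind these numbers, as a sketch (heat and temperature taken from the slide):

```python
from math import log

Q = 6700.0     # J, heat needed to melt the 20 g ice cube (from the slide)
T = 273.15     # K, melting point of ice
k = 1.38e-23   # J/K, Boltzmann's constant

dS = Q / T
print(f"dS = {dS:.1f} J/K")        # about 24.5 J/K

# S = k ln(Omega), so Omega_final/Omega_initial = e^(dS/k) = 10^(dS/(k ln 10))
exponent = dS / (k * log(10.0))
print(f"Omega_final/Omega_initial = 10^({exponent:.2g})")  # 10^(7.7e23)
```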

Page 37: Entropy, probability and disorder

Very fast adiabatic expansion

Expand very rapidly into the same total volume V + ΔV; the extra volume is initially empty

Isothermal: same #collisions, #molecules, etc., so T is unchanged

Entropy change: ΔS = Q/T = 0 ???

NO! Entropy is a state variable and therefore

ΔS = same as for slow isothermal expansion

Page 38: Entropy, probability and disorder

Slow adiabatic expansion

Same volume change, but need to push air out of the way so temperature drops

Again we ask: ΔS = Q/T = 0 ??? YES!

The "location part" of the multiplicity increases as with slow isothermal expansion

The "velocity part" decreases as the temperature drops. The two exactly cancel.

Page 39: Entropy, probability and disorder

Constant volume process

Heat is added to any (ideal or non-ideal) gas whose volume is kept constant. What is the change in entropy?

$$dQ = nC_V\,dT; \qquad dS = \frac{dQ}{T} = nC_V\,\frac{dT}{T}$$

Integrate (assuming C_V is constant):

$$S_2 - S_1 = \int_{T_1}^{T_2} nC_V\,\frac{dT}{T} = nC_V\ln\frac{T_2}{T_1}$$
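A numeric instance of this result; all values here are my own, for illustration (C_V = 3R/2 assumes a monatomic ideal gas):

```python
from math import log

R = 8.314              # J/(mol K), gas constant
n = 1.0                # mol (illustrative value)
C_V = 1.5 * R          # monatomic ideal gas (assumption)
T1, T2 = 300.0, 600.0  # K (illustrative values)

dS = n * C_V * log(T2 / T1)   # S2 - S1 = n C_V ln(T2/T1)
print(f"dS = {dS:.2f} J/K")   # about 8.64 J/K
```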

Page 40: Entropy, probability and disorder

Constant pressure processes

Heat is added to an ideal gas under constant pressure. What is the change in entropy?

a) ΔS = nC_V ln(T2/T1)

b) ΔS = nC_p ln(V2/V1)

c) ΔS = nC_p ln(p2/p1)

d) ΔS = 0

Page 41: Entropy, probability and disorder

Entropy and isothermal processes

An ideal gas expands isothermally. What is the change in entropy?

Constant temperature, so ΔS = Q/T

First Law (done previously): Q = W = nRT ln(V2/V1)

Therefore

$$\Delta S = nR\ln\frac{V_2}{V_1} = nR\ln\frac{p_1}{p_2}$$
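A sketch checking that the volume and pressure forms give the same number (example values are mine):

```python
from math import log

R = 8.314          # J/(mol K)
n = 1.0            # mol (illustrative)
T = 300.0          # K (illustrative; cancels out of dS)
V1, V2 = 1.0, 2.0  # m^3 (illustrative; only the ratio matters)

p1, p2 = n * R * T / V1, n * R * T / V2   # ideal gas law

dS_from_V = n * R * log(V2 / V1)   # n R ln(V2/V1)
dS_from_p = n * R * log(p1 / p2)   # n R ln(p1/p2): the same number
print(dS_from_V, dS_from_p)        # both about 5.76 J/K
```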

Page 42: Entropy, probability and disorder

Entropy and equilibrium

We have established a link between multiplicity and thermodynamic properties such as heat and temperature

Now we see how maximum entropy corresponds to maximum probability and hence to equilibrium

Page 43: Entropy, probability and disorder

Equilibrium volume

In general the number of microstates depends on both the volume available and the momentum (velocity) of the molecules

Let’s ignore the momentum part and look at the spatial microstates only.

Page 44: Entropy, probability and disorder

Equilibrium volume II

Say we have 3 molecules in a vessel which we split up into 6 equal parts. A partition can be placed anywhere between the cells. One molecule is on the left-hand side, the other two on the right-hand side. What is the equilibrium volume?

Look for maximum entropy!

Page 45: Entropy, probability and disorder

Equilibrium volume III

Number of cells on the left c1, on the right c2.

We'll look at c1 = 4, c2 = 2:

[Figure: molecule A in one of the 4 left-hand cells; molecules B and C in the 2 right-hand cells.]

Page 46: Entropy, probability and disorder

Equilibrium volume IV

Left: Ω1 = c1 = 4.

Right: Ω2 = (c2)² = 4.

σ = ln Ω1 + ln Ω2 = ln 4 + ln 4 = ln 16 ≈ 2.77

Page 47: Entropy, probability and disorder

Question

The dimensionless entropy of this system of 6 cells and one partition dividing it into c1 and c2 cells is

a) σ = ln(c1 + c2)

b) σ = ln(c1 + c2²)

c) σ = ln c1 + ln 2c2

d) σ = ln c1 + 2·ln c2

Page 48: Entropy, probability and disorder

Equilibrium volume V

c1      Ω       σ       P(Ω)
1       25      3.22    0.25
2       32      3.47    0.31
3       24      3.18    0.24
4       16      2.77    0.16
5       5       1.61    0.05
total   102             1

Page 49: Entropy, probability and disorder

Maximum entropy and probability

[Plots: dimensionless entropy σ = σ1 + σ2 (left) and probability (right) against c1; both peak at c1 = 2.]

Page 50: Entropy, probability and disorder

Maximum probability

Probability maximum coincides with entropy maximum

Volume V1 = c1·ΔV where ΔV is the cell size

Most likely situation when

$$\frac{d\sigma}{dV_1} = 0$$

Same density on both sides:

$$\frac{N_1}{V_1} = \frac{N_2}{V_2}; \qquad \frac{1}{2} = \frac{2}{4}$$

Page 51: Entropy, probability and disorder

Question

Which relationship holds for the probabilities of finding the system in a microstate corresponding to c1 = 2, 3, 4?

a) P(2) < P(3) < P(4)

b) P(2) = P(3) = P(4)

c) P(2) > P(3) > P(4)

Page 52: Entropy, probability and disorder

Entropy and mixing

Suppose we remove the partition. What is the entropy of this system?

Answer: σ = ln 6³ = ln 216 ≈ 5.38

The additional entropy of 5.38 − 3.47 = 1.91 is called "the entropy of mixing"

Page 53: Entropy, probability and disorder

Generalisation I

Look at N1 particles occupying c1 cells on the left, N2 particles occupying c2 cells on the right. Volume of each cell = ΔV.

Multiplicity:

$$\Omega = \Omega_1\Omega_2 = c_1^{N_1}\,c_2^{N_2} = \left(\frac{V_1}{\Delta V}\right)^{N_1}\left(\frac{V_2}{\Delta V}\right)^{N_2} = \frac{V_1^{N_1}(V-V_1)^{N_2}}{(\Delta V)^{N_1+N_2}}$$

[Figure: vessel with volumes V1, V2 holding N1, N2 particles.]

Page 54: Entropy, probability and disorder

Generalisation II

Entropy:

$$\sigma = \ln\Omega = N_1\ln V_1 + N_2\ln(V - V_1) - (N_1+N_2)\ln\Delta V$$

Maximum entropy for equal densities:

$$\frac{d\sigma}{dV_1} = \frac{N_1}{V_1} - \frac{N_2}{V - V_1} = 0 \quad\Rightarrow\quad \frac{N_1}{V_1} = \frac{N_2}{V_2}$$
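The equal-density condition can be verified numerically by scanning σ(V1); a sketch with assumed numbers (N1, N2 and V are my own choices):

```python
from math import log

N1, N2 = 2.0, 6.0   # particle numbers (illustrative values)
V = 8.0             # total volume (illustrative units)

def sigma(V1):
    # Location part of the dimensionless entropy, up to a constant.
    return N1 * log(V1) + N2 * log(V - V1)

# Scan V1 and locate the maximum.
best_sigma, best_V1 = max((sigma(i / 100.0), i / 100.0)
                          for i in range(1, 800))
print(best_V1)  # 2.0: there N1/V1 = N2/(V - V1) = 1, equal densities
```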

Page 55: Entropy, probability and disorder

Multiplicity and energy

According to quantum mechanics, atoms in a crystal have energies 0, ε, 2ε, … (This is called the Einstein model of solids)

Say we have three atoms with total energy 3ε

Microstates are distinguished by the different energies E1, E2, E3.

Page 56: Entropy, probability and disorder

The microstates

[Figure: energy-level diagrams (levels 0, ε, 2ε, 3ε) of the microstates (E1, E2, E3), grouped by E1 = 3ε, 2ε, ε, 0.]
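A sketch that enumerates these microstates and tallies the energy of atom 1 (this also bears on the question that follows):

```python
from itertools import product

# Energies in units of epsilon; each atom can hold 0..3 quanta.
microstates = [(e1, e2, e3)
               for e1, e2, e3 in product(range(4), repeat=3)
               if e1 + e2 + e3 == 3]

print(len(microstates))  # 10 microstates in total

# Count how often atom 1 has energy 0, 1, 2, 3 (in units of epsilon):
for e in range(4):
    n = sum(1 for s in microstates if s[0] == e)
    print(f"E1 = {e}*eps in {n}/10 of the microstates")
```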

Page 57: Entropy, probability and disorder

Question

What is the probability for any of these three atoms to have energy 0? ε? 2ε? 3ε?

a) 4/10,3/10,2/10,1/10

b) 1/4,1/4,1/4,1/4

c) 1/10,2/10,3/10,4/10

d) not sure

Page 58: Entropy, probability and disorder

Generalisation

If there are n atoms and the total energy is qε, then the number of microstates is given by

$$\Omega(q,n) = \frac{(q+n-1)!}{q!\,(n-1)!}$$

Works for the previous example (n = 3, q = 3):

$$\Omega(3,3) = \frac{5!}{3!\,2!} = 10$$
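As a sketch, the formula in code, checked against the three-atom enumeration above:

```python
from math import factorial

def omega(q, n):
    """Number of microstates for q quanta shared among n atoms."""
    return factorial(q + n - 1) // (factorial(q) * factorial(n - 1))

print(omega(3, 3))    # 10, matching the direct enumeration
print(omega(20, 10))  # 10015005: 20 quanta among 10 atoms
```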

Page 59: Entropy, probability and disorder

Partition I

Look at 10 particles, with total energy 20ε.

n1 = 3 particles on the left-hand side, n2 = 7 on the right-hand side

What is the most likely energy distribution? Plot Ω as a function of q1.
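The plots on the next two slides can be reproduced with a short sketch that tabulates Ω = Ω1(q1) · Ω2(q − q1):

```python
from math import comb

def omega(q, n):
    # Number of ways to share q quanta among n atoms: C(q+n-1, q).
    return comb(q + n - 1, q)

n1, n2, q = 3, 7, 20
for q1 in range(q + 1):
    w = omega(q1, n1) * omega(q - q1, n2)
    print(q1, w)
# The product peaks at q1 = 5, close to the continuous estimate
# q * n1/(n1 + n2) = 6 used on the "Partition IV" slide.
```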

Page 60: Entropy, probability and disorder

Partition II

[Plot: multiplicity Ω1·Ω2 against q1, the energy on the left-hand side; values up to about 10⁶.]

Page 61: Entropy, probability and disorder

Partition III

[Plot: entropy σ1 + σ2 against q1, the energy on the left-hand side.]

Page 62: Entropy, probability and disorder

Partition IV

You expect the atoms on the left to have the same energy on average as the atoms on the right

Calculation/plot shows this: maximum for

$$\frac{q_1}{n_1} = \frac{q_2}{n_2} \quad\Rightarrow\quad q_1 = \frac{n_1}{n_1+n_2}\,(q_1+q_2)$$

Page 63: Entropy, probability and disorder

Entropy, energy, temperature I

The internal energy on the left: U1 = q1ε.

Equilibrium/entropy maximum when

$$\frac{d\sigma}{dU_1} = 0$$

Use σ = σ1 + σ2:

$$\frac{d\sigma_1}{dU_1} + \frac{d\sigma_2}{dU_1} = 0$$

Use U2 = U − U1, so dU2 = −dU1:

$$\frac{d\sigma_2}{dU_1} = -\frac{d\sigma_2}{dU_2} \quad\Rightarrow\quad \frac{d\sigma_1}{dU_1} = \frac{d\sigma_2}{dU_2}$$

Page 64: Entropy, probability and disorder

Entropy, energy, temperature II

It follows that the most likely distribution of energy corresponds to a situation where

$$\frac{d\sigma_1}{dU_1} = \frac{d\sigma_2}{dU_2} \qquad\text{or}\qquad \frac{dS_1}{dU_1} = \frac{dS_2}{dU_2}$$

We know that in this situation T1 = T2

Clearly the two are linked!

Page 65: Entropy, probability and disorder

Entropy, energy, temperature III

Remember: we kept V, N constant so the only way in which energy could be exchanged was through heat transfer

Remember:

$$\Delta S = \frac{Q}{T} = \left.\frac{\Delta U}{T}\right|_{\text{due to heating only}}$$

so

$$\frac{1}{T} = \left.\frac{\Delta S}{\Delta U}\right|_{\text{due to heating only}} = \left.\frac{\partial S}{\partial U}\right|_{\text{fixed external parameters}}$$

Page 66: Entropy, probability and disorder

Entropy, energy, temperature IV

In our example the only external parameter was the volume

In general, gravitational, electric or magnetic fields, elastic energy etc. could all change

The definition of temperature only holds if all of these are held fixed

Page 67: Entropy, probability and disorder

PS225 – Thermal Physics topics

The atomic hypothesis
Heat and heat transfer
Kinetic theory
The Boltzmann factor
The First Law of Thermodynamics
Specific heat
Entropy
Heat engines
Phase transitions