
Thermodynamics and the Gibbs Paradox

Presented by: Chua Hui Ying Grace, Goh Ying Ying, Ng Gek Puey Yvonne

Overview

- The three laws of thermodynamics
- The Gibbs Paradox
- The resolution of the paradox
  - Gibbs / Jaynes
  - Von Neumann
  - Shu Kun Lin’s revolutionary idea
- Conclusion

The Three Laws of Thermodynamics

- 1st Law: Energy is always conserved.
- 2nd Law: The entropy of the Universe always increases.
- 3rd Law: The entropy of a perfect crystalline substance is taken as zero at the absolute temperature of 0 K.

Unravelling the Mystery of the Gibbs Paradox

- The mixing of non-identical gases shows an obvious increase in entropy (disorder).
- The mixing of identical gases shows zero increase in entropy, as the action is reversible.

Compare the two scenarios of mixing and we realize that the entropy of mixing is the same for any two different gases, however similar, yet it drops abruptly to zero the moment the gases become identical. This discontinuity is the Gibbs Paradox.
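To see the paradox numerically: a minimal sketch (our illustration, not from the slides) of the classical entropy of mixing, which depends only on the mole fractions, not on which gases are mixed:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Classical ideal-gas entropy of mixing, dS = -n*R*sum(x_i ln x_i).
    The result depends only on the mole fractions x_i, not on which
    gases are mixed or how similar they are."""
    n = sum(moles)
    x = np.array(moles) / n
    return -n * R * np.sum(x * np.log(x))

# One mole each of two different gases: dS = 2R ln 2, about 11.5 J/K.
print(entropy_of_mixing([1.0, 1.0]))
```

The same formula would give the same positive answer for two samples of one gas, yet thermodynamics requires dS = 0 in that case; that jump is the paradox.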

To Resolve the Contradiction

Let us look at how three camps approached it:

1. Gibbs / Jaynes
2. Von Neumann
3. Lin Shu Kun

Gibbs’ opinion

- When two non-identical gases mix and the entropy increases, we imply that the gases can be separated and returned to their original states.
- When two identical gases mix, it is impossible to separate the two gases into their original states, as there is no recognizable difference between them.

Gibbs’ opinion (2)

- Thus, these two cases stand on different footings and should not be compared with each other.
- The entropy change produced by mixing gases of different kinds is independent of the nature of the gases, and hence independent of the degree of similarity between them.

[Graph: entropy S versus similarity Z, from Z = 0 to Z = 1, with S ranging from 0 to S_max. On Gibbs’ view, S stays at S_max for all Z < 1 and drops discontinuously to S = 0 at Z = 1.]

Jaynes’ explanation

The entropy of a macrostate is given as

    S(X) = k log W(C)

where S(X) is the entropy associated with a chosen set X of macroscopic quantities, and W(C) is the phase volume occupied by all the microstates in a chosen reference class C.
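A quick sketch (ours, not from the slides) of how this formula yields the classical mixing entropy when the reference class distinguishes the two gases:

```latex
% Sketch: the classical mixing entropy from S(X) = k log W(C).
% Removing the partition doubles the volume available to each of the
% 2N molecules, so the phase volume of the reference class grows by 2^{2N}:
\[
  \Delta S = k \log \frac{W(C_{\mathrm{mixed}})}{W(C_{\mathrm{separated}})}
           = k \log 2^{2N} = 2Nk \log 2 .
\]
% If C does not distinguish the two gases, W is unchanged and \Delta S = 0:
% the entropy change depends on the chosen reference class, not on the
% microstate alone.
```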

Jaynes’ explanation (2)

- This thermodynamic entropy S(X) is not a property of a microstate, but of a certain reference class C(X) of microstates.
- For entropy to always increase, we need to specify the variables we want to control and those we allow to change.
- Any manipulation of variables outside this chosen set may cause us to see an apparent violation of the second law.

Von Neumann’s Resolution

- Makes use of the quantum mechanical approach to the problem.
- He derives an expression for the entropy of mixing in terms of the overlap Z of the internal quantum states of the two gases, where Z measures the degree of orthogonality, which is the degree of similarity between the gases.

Von Neumann’s Resolution (2)

- Hence when Z = 0 the entropy is at its highest, and when Z = 1 the entropy is at its lowest.
- Therefore entropy decreases continuously with increasing similarity.
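The equation itself did not survive in this text version, but the behaviour it describes follows from the von Neumann entropy S = -Tr(ρ ln ρ) of an equal mixture of two pure states; a minimal sketch (our illustration, assuming the overlap ⟨ψ₁|ψ₂⟩ = Z plays the role of the similarity parameter):

```python
import numpy as np

def mixing_entropy(z):
    """Von Neumann entropy S = -Tr(rho ln rho) of an equal mixture of
    two pure states with real overlap <psi1|psi2> = z, 0 <= z <= 1."""
    psi1 = np.array([1.0, 0.0])
    psi2 = np.array([z, np.sqrt(1.0 - z**2)])  # unit vector at overlap z
    rho = 0.5 * (np.outer(psi1, psi1) + np.outer(psi2, psi2))
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zeros, since 0 ln 0 = 0
    return -np.sum(evals * np.log(evals))

for z in (0.0, 0.5, 0.9, 1.0):
    print(f"Z = {z:.1f} -> S = {mixing_entropy(z):.4f}")
# Z = 0 (orthogonal states) gives S = ln 2, the maximum; S falls
# continuously to 0 at Z = 1 (identical gases), as the slide states.
```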

[Graph: entropy S versus similarity Z. On von Neumann’s account, S decreases continuously from S_max at Z = 0 to S = 0 at Z = 1.]

Resolving the Gibbs Paradox: using entropy and its revised relation with similarity, proposed by Lin Shu Kun

- Draws a connection between information theory and entropy.
- Proposes that entropy increases continuously with the similarity of the gases.

Analyse 3 concepts:

(1) high symmetry = high similarity,
(2) entropy = information loss, and
(3) similarity = information loss.


Why “entropy increases with similarity”?

Due to Lin’s proposition that

- entropy is the degree of symmetry, and
- information is the degree of non-symmetry.


(1) high symmetry = high similarity

- Symmetry is a measure of indistinguishability.
- High symmetry contributes to high indistinguishability.
- Similarity can be described as a continuous measure of imperfect symmetry.

High symmetry -> high indistinguishability -> high similarity


(2) entropy = information loss

- An increase in entropy means an increase in disorder.
- A decrease in entropy reflects an increase in order.
- A more ordered system is more highly organized, and thus possesses greater information content.
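Lin’s reading of entropy as information loss parallels Shannon’s measure; a small sketch (our example, not from the slides) comparing an ordered and a disordered system:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum(p ln p) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 ln 0 = 0
    return -np.sum(p * np.log(p))

ordered = [1.0, 0.0, 0.0, 0.0]         # certainly in one configuration
disordered = [0.25, 0.25, 0.25, 0.25]  # all configurations equally likely

print(shannon_entropy(ordered))     # 0.0: no uncertainty, information kept
print(shannon_entropy(disordered))  # ln 4: maximal uncertainty, i.e. loss
```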

Do you have any idea what the picture is all about?

From the previous example,

- greater entropy results in less information registered;
- higher entropy, higher information loss.

Thus if the system is more ordered, this means lower entropy and thus less information loss.


(3) similarity = information loss

[Diagram: one particle singled out from the other (n-1) particles.]

For a system with distinguishable particles:

Information on N particles = different information for each particle = N pieces of information.

High similarity (high symmetry) -> there is greater information loss.



For a system with indistinguishable particles:

Information on N particles = information of 1 particle = 1 piece of information.
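One way to make the “N pieces versus 1 piece” counting concrete (our illustration, not from the slides): erasing the labels of N otherwise identical particles merges the N! permutations that were distinct descriptions into a single one, losing ln N! units of information, which is the N! correction usually invoked for the Gibbs Paradox:

```python
import math

def label_information_loss(n):
    """Information (in nats) lost when the labels of n otherwise
    identical particles are erased: n! formerly distinct descriptions
    collapse into one."""
    return math.log(math.factorial(n))

for n in (2, 10, 100):
    print(n, round(label_information_loss(n), 2))
# Grows as ln n! ~ n ln n - n (Stirling): the more particles that become
# indistinguishable, the greater the information loss, as Lin argues.
```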

Concepts explained:

(1) high symmetry = high similarity
(2) entropy = information loss and
(3) similarity = information loss

After establishing the links between the various concepts, if a system is highly symmetrical -> high similarity -> greater information loss -> higher entropy.


The mixing of identical gases (revisited): Lin’s Resolution of the Gibbs Paradox

- Compared to the non-identical gases, we have less information about the identical gases.
- According to his theory, less information = higher entropy.
- Therefore, the mixing of identical gases should result in an increase in entropy.


Comparing the 3 graphs

[Three graphs of entropy S versus similarity Z, each with S from 0 to S_max and Z from 0 to 1:
- Gibbs: S stays at S_max for Z < 1 and drops discontinuously to 0 at Z = 1.
- Von Neumann: S decreases continuously from S_max at Z = 0 to 0 at Z = 1.
- Lin: S increases continuously from 0 at Z = 0 to S_max at Z = 1.]

Why are there different ways of resolving the paradox?

Different ways of considering entropy:

- Lin (static entropy): consideration of configurations of fixed particles in a system.
- Gibbs and von Neumann (dynamic entropy): dependent on the changes in the dispersal of energy in the microstates of atoms and molecules.



We cannot compare the two ways of resolving the paradox!

Since Lin’s definition of entropy is essentially different from that of Gibbs and von Neumann, it is unjustified to compare the two ways of resolving the paradox.

Conclusion

- The Gibbs Paradox poses a problem for the second law due to an inadequate understanding of the system involved.
- Lin’s novel idea sheds new light on entropy and information theory, but also leaves conflicting grey areas for further exploration.


Acknowledgements

We would like to thank:

- Dr. Chin Wee Shong, for her support and guidance throughout the semester
- Dr. Kuldip Singh, for his kind support
- and all who have helped in one way or another