ENTROPY AND INTERDISCIPLINARITY



Nada Orlić, Ivana Jelovica Badovinac



Faculty of Arts and Sciences, University of Rijeka, Croatia


Corresponding author’s e-mail: norlic@ffri.hr



1 Introduction

Lectures on thermodynamics in introductory physics usually begin with the concepts of temperature, heating, internal energy and the first law of thermodynamics. Then follow the second law of thermodynamics and the idea of entropy. Entropy is the most important term in thermodynamics, but at the same time its concept is rather difficult to understand; a 1971 Scientific American article quoted John von Neumann's witty remark about this term: "No one knows what (thermodynamic) entropy is, so in a debate you will always have the advantage."


Students meet this concept in the higher classes of secondary school, and later at university. In the frame of seminars they can discuss, for example, highly ordered biological systems in connection with the concept of entropy. Moreover, there are problems in which students can observe the relationship between information and entropy. Other problems they will treat using the maximum entropy principle as a general method applicable to the determination of probability distributions based on partial information.


There is no straightforward thermodynamic (macroscopic) definition of entropy; generally, the entropy in thermodynamics is an extensive quantity associated with a system in equilibrium, introduced through the entropy change. Although such a definition is not very concrete, it is very useful for explaining different phenomena connected with the second law of thermodynamics. The microscopic conception of entropy, where the number of microstates is associated with a macrostate, is clearly brought home with the use of a statistical analysis of the molecular state of a system. Entropy as a measure of molecular disorder offers information about the direction of time. An increase in entropy indicates an increase in disorder. During spontaneous processes in an isolated system the entropy increases and reaches its maximum value at equilibrium; the entropy of an isolated system can never decrease.


Such a concept points out the interdisciplinary character of entropy, and it will be used to explain different problems in various fields of interest.



2 Entropy and photosynthesis

Because the compounds formed in photosynthesis are complex and highly energetic compared to the starting materials of carbon dioxide and water, the conclusion is often drawn that there is a decrease in entropy in that process. This looks at only one half of the participants in the reaction: the atoms from carbon dioxide and water that have been reorganized. But the energy change is dispersed over the whole system of both the plant and its surroundings, and that determines the change of entropy.





Fig. 1. Schematic presentation of photosynthesis


The chemical reaction for the process is given by the following expression:

$$6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{light energy} \longrightarrow \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2} \qquad (1)$$



It is necessary to notice that students, in discussions with each other and with their teacher, can conclude that photosynthesis yields a net increase in entropy for the overall process that includes both the plant and the incident sunlight.
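
To make this reasoning more concrete, a rough estimate (not part of the original text; the temperatures below are standard illustrative values) treats the absorbed sunlight as heat $Q$ received from the solar surface at $T_{\mathrm{sun}} \approx 5800\ \mathrm{K}$ and eventually dispersed into the plant and its surroundings at $T_{\mathrm{earth}} \approx 300\ \mathrm{K}$:

$$\Delta S \approx \frac{Q}{T_{\mathrm{earth}}} - \frac{Q}{T_{\mathrm{sun}}} = Q\left(\frac{1}{300\ \mathrm{K}} - \frac{1}{5800\ \mathrm{K}}\right) > 0 ,$$

so even if the ordering of atoms into sugar lowers the plant's own entropy slightly, the overall process still gives a net increase in entropy.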



3 Entropy and information

The statistical interpretation of entropy, as well as information, is quantitatively defined in terms of probability. Since information is a measure of order and the concept of entropy is a measure of disorder, it is logical to conclude that there should exist a relationship between these terms.


The Shannon information

$$I = -\sum_i p_i \log_2 p_i \qquad (2)$$

and the Gibbs entropy

$$S = -k_B \sum_i p_i \ln p_i \qquad (3)$$


are formally the same, except for a constant factor. Consequently, the entropy is a measure of our lack of information about the microstate of a system.
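
To make the constant factor explicit, the following minimal Python sketch (not from the original text) evaluates both expressions for an arbitrary distribution and checks that they differ only by the factor k_B·ln 2:

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def shannon_information(probs):
    # Shannon information of eq. (2): I = -sum_i p_i log2(p_i), in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    # Gibbs entropy of eq. (3): S = -k_B sum_i p_i ln(p_i), in J/K
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.7, 0.2, 0.1]  # any probability distribution
print(shannon_information(p))                  # in bits
print(gibbs_entropy(p) / (K_B * math.log(2)))  # the same number: S = (k_B ln 2) I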


As an example, students consider a table of climate data for two Croatian cities: Rijeka and Hvar. Suppose that we have recorded only whether it has rained or not on a given day, for 8000 days. A zero signifies that there was no rain, and a one signifies that it rained. For simplicity, we will assume that every day is independent of the previous days, and that there is a 50% probability that it will rain on each day. A typical record for Rijeka might look like

00111011001011100011001100001111011010111101000001001000100101100...

The information content of this data set is 8000 bits, 1 bit per day. Since 8 bits make 1 byte, there would have to be 1 kilobyte of space on a hard disk to store the data. Since the data is random and without pattern, there is almost certainly no way to compress it to less than 1 kilobyte.

A typical record for Hvar could be

0000000000010000000000000000000000000010000000000000000000.........

We will again suppose that every day is independent, but that it rains only 1 day out of 31 on average. So, we would need the same 1 kilobyte for 8000 days of weather, but this record will be dominated by zeroes. If we use the information that rain is rare, there are more compact ways of storing this data.
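
As an illustration of this point (a sketch not contained in the original text, using a general-purpose compressor rather than an optimal code), one can simulate both records, pack them into 1000 bytes, and compress them with zlib; the random Rijeka record stays close to 1000 bytes, while the sparse Hvar record shrinks considerably:

import random
import zlib

def packed_record(p_rain, days=8000, seed=0):
    # Simulate a 0/1 rain record and pack 8 days per byte (1000 bytes total).
    rng = random.Random(seed)
    bits = [1 if rng.random() < p_rain else 0 for _ in range(days)]
    data = bytearray()
    for i in range(0, days, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        data.append(byte)
    return bytes(data)

for city, p in [("Rijeka", 0.5), ("Hvar", 1 / 31)]:
    raw = packed_record(p)
    compressed = zlib.compress(raw, 9)
    print(city, len(raw), "bytes raw,", len(compressed), "bytes compressed")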


The Shannon information has an additive property. For the example of weather data, for one day there are two possible messages: 0 for no rain and 1 for some rain, with probabilities p_0 and p_1. For Rijeka, p_0 = p_1 = 1/2, so according to the Shannon formula I = 1 bit per day. For Hvar, p_0 = 30/31 and p_1 = 1/31, so I = 0.21 bits per day.
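
These per-day values can be checked with a short Python calculation (a sketch, not part of the original text):

import math

def info_per_day(p0, p1):
    # Shannon information of one day's message, eq. (2), in bits
    return -sum(p * math.log2(p) for p in (p0, p1) if p > 0)

print(info_per_day(0.5, 0.5))                # Rijeka: 1.0 bit per day
print(info_per_day(30 / 31, 1 / 31))         # Hvar: about 0.21 bits per day
print(8000 * info_per_day(30 / 31, 1 / 31))  # about 1645 bits for the whole Hvar record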

A discussion of the relationship between information and entropy gives students an interdisciplinary perspective by showing that concepts central to statistical physics also appear in fields such as electrical engineering, computer science, statistics and other related areas of investigation.



4 Maximum entropy principle

The maximum entropy principle is equivalent to the principle of minimum energy, and it plays a similar role in thermodynamics to that of extremum principles in mechanics. The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information.


Two groups of students participated in testing the maximum entropy principle.

a) Partial information about a final physics test for all secondary schools is given to the physics students in the first group: there are 100 points; 25 for thermodynamics, 30 for mechanics and 45 for all other parts of physics. The corresponding probabilities are p(T), p(M) and p(O), respectively. Moreover, they have the information that the whole average test score is 55.

They determined the number of points in each part of the physics test, as well as the probability distribution that 28 points are obtained from a certain part of the given test. The following constraints have been used:



(4)

(5)


together with the equations relevant for “or-or” or “and-and” probabilities, to express one of p(T), p(M) and p(O) by the remaining two of them.


Generally, the entropy is given as

$$S = -\sum_i p(x_i)\,\ln p(x_i), \quad \text{where } x_i = T, M \text{ and } O. \qquad (6)$$


This expression is needed to obtain three equations with three variables. Thus they found the relevant probability for which the entropy has its maximum value. The other probabilities are then easy to find.
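
As a numerical illustration of this procedure (a sketch only; the constraint values below are illustrative placeholders, not the actual constraints (4) and (5)), one can maximize the entropy of eq. (6) subject to normalization and one linear constraint with a standard constrained optimizer:

import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    # Negative of eq. (6); minimizing this maximizes the entropy.
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

# Placeholder linear constraint a . p = b (illustrative values, not from the paper)
a = np.array([0.25, 0.30, 0.45])
b = 0.35

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},   # probabilities sum to 1
    {"type": "eq", "fun": lambda p: float(a @ p) - b},  # assumed partial information
]
result = minimize(neg_entropy, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
                  method="SLSQP", constraints=constraints)
print(result.x)  # p(T), p(M), p(O) with maximum entropy under these constraints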

b) The other group, consisting of chemistry, physics and history of art students, will use the maximum entropy principle to determine the origin of numismatic samples, comparing spectra analysed by different spectroscopic methods with the corresponding parameters of known samples.


5 Conclusion

The concepts of entropy and the maximum entropy principle are applicable to many different and interesting examples from natural, social and other fields of human life. According to our experience, this can be a good motivation for students, or a nice final paper of interdisciplinary character.



