Entropy
Figure: Ice melting is a common example of "entropy increasing",[1] described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.[2]
Entropy is a concept applied across physics, information theory, mathematics and other branches of science and engineering. The following definition is shared across all these fields:

$$S = -k \sum_i p_i \ln p_i$$

where S is the conventional symbol for entropy. The sum runs over all microstates consistent with the given macrostate, and $p_i$ is the probability of the ith microstate. The constant of proportionality k depends on what units are chosen to measure S. When SI units are chosen, we have k = k_B = Boltzmann's constant = 1.38066×10⁻²³ J K⁻¹. If units of bits are chosen, then k = 1/ln(2), so that

$$S = -\sum_i p_i \log_2 p_i.$$
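As a minimal sketch of how the choice of k sets the units, the following Python snippet (the function and variable names are illustrative, not from any particular library) evaluates the sum above for a small distribution, once with k = k_B and once with k = 1/ln(2):

    import math

    K_B = 1.38066e-23  # Boltzmann's constant in J/K, value as quoted above

    def entropy(probs, k):
        # S = -k * sum over i of p_i * ln(p_i); terms with p_i = 0 contribute nothing
        return -k * sum(p * math.log(p) for p in probs if p > 0)

    p = [0.25, 0.25, 0.25, 0.25]        # four equally likely microstates

    print(entropy(p, K_B))              # ≈ 1.914e-23 J/K, i.e. k_B ln 4
    print(entropy(p, 1 / math.log(2)))  # 2.0 bits, i.e. log2(4)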
Entropy is central to the second law of thermodynamics. The second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.[3][4]
The second law can also be used to predict whether a physical process will proceed spontaneously. Spontaneous changes in isolated systems occur with an increase in entropy.
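As a worked illustration of the sign of such a change, consider the melting ice of the opening figure. Using the classical relation ΔS = Q_rev/T and the standard molar enthalpy of fusion of water (about 6.01 kJ/mol, a textbook value assumed here rather than one given in this article), the entropy of the water increases on melting at 273.15 K:

$$\Delta S = \frac{Q_{\mathrm{rev}}}{T} \approx \frac{6010\ \mathrm{J\,mol^{-1}}}{273.15\ \mathrm{K}} \approx 22\ \mathrm{J\,K^{-1}\,mol^{-1}} > 0.$$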
The word "entropy" is derived from the
Greek εντροπία "a turning towards" (εν- "in"
+ τροπή "a turning").[5]
Definitions and descriptions
In science, the term "entropy" is generally interpreted in three distinct but semi-related ways: from a macroscopic viewpoint (classical thermodynamics), a microscopic viewpoint (statistical thermodynamics), and an information viewpoint (information theory).
The statistical definition of entropy (see below) is the fundamental definition, because the other two can be mathematically derived from it, but not vice versa. All properties of entropy (including the second law of thermodynamics) follow from this definition.
Microscopic definition of entropy (statistical mechanics)
In statistical thermodynamics, entropy is defined as

$$S = -k_{\mathrm{B}} \sum_i p_i \ln p_i,$$

as previously discussed.
For almost all practical purposes, this can be taken as the fundamental definition of entropy.
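One standard consequence, noted here as a well-known result rather than one derived in this article: when all Ω accessible microstates are equally probable, p_i = 1/Ω and the definition reduces to Boltzmann's S = k_B ln Ω. A short Python check of that reduction (illustrative code, not from any library):

    import math

    K_B = 1.38066e-23  # Boltzmann's constant in J/K

    def gibbs_entropy(probs):
        # S = -k_B * sum over i of p_i * ln(p_i)
        return -K_B * sum(p * math.log(p) for p in probs if p > 0)

    omega = 1000                   # number of equally likely microstates
    uniform = [1 / omega] * omega

    print(gibbs_entropy(uniform))  # equals K_B * ln(omega) up to rounding
    print(K_B * math.log(omega))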