Entropy in thermodynamics and information theory


There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as H, introduced by Ralph Hartley in the 1920s and developed by Claude Shannon in the 1940s. Shannon, although not initially aware of this similarity, commented on it when he publicized information theory in A Mathematical Theory of Communication (1948).

This article explores what links there are between the two concepts, and how far they can be regarded as connected.

The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s is of the form:

    S = -k_B \sum_i p_i \ln p_i

where p_i is the probability of the microstate i taken from an equilibrium ensemble, and k_B is the Boltzmann constant.
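
As a concrete illustration, here is a minimal Python sketch of this formula; the two-state probability list is a made-up example, and K_B is the exact CODATA value of the Boltzmann constant:

    import math

    # Boltzmann constant in J/K (exact by the 2019 SI redefinition)
    K_B = 1.380649e-23

    def gibbs_entropy(probabilities):
        """Gibbs entropy S = -k_B * sum_i p_i * ln(p_i).

        probabilities: microstate probabilities summing to 1; terms with
        p == 0 are skipped, since p * ln(p) -> 0 as p -> 0.
        """
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

    # Hypothetical two-microstate system: the uniform distribution maximizes S.
    print(gibbs_entropy([0.5, 0.5]))  # k_B * ln 2, about 9.57e-24 J/K
    print(gibbs_entropy([0.9, 0.1]))  # smaller: less uncertainty about the microstate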

The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form:

    H = -\sum_i p_i \log_b p_i

where p_i is the probability of the message m_i taken from the message space M, and b is the base of the logarithm used (commonly 2, giving entropy in bits).

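A matching Python sketch for Shannon's expression, using the same hypothetical two-state distribution as above; with base-2 logarithms H comes out in bits, and the two defining expressions then differ only by the constant factor k_B ln 2:

    import math

    def shannon_entropy(probabilities, base=2):
        """Shannon entropy H = -sum_i p_i * log_b(p_i); in bits for base=2."""
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    p = [0.5, 0.5]
    h_bits = shannon_entropy(p)
    print(h_bits)  # 1.0 bit: one fair binary choice

    # Units aside, the expressions coincide: S = k_B * ln(2) * H for H in bits.
    K_B = 1.380649e-23
    print(K_B * math.log(2) * h_bits)  # matches gibbs_entropy([0.5, 0.5]) above
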
...