## Entropy

In statistical thermodynamics, entropy (usual symbol S) (Greek: Εντροπία, εν + τρέπω) is a measure of the number of microscopic configurations Ω that correspond to a thermodynamic system in a state specified by certain macroscopic variables. Specifically, assuming that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant kB (which provides consistency with the original thermodynamic concept of entropy discussed below, and gives entropy the dimension of energy divided by temperature). Formally,

${\displaystyle S=k_{\mathrm {B} }\ln \Omega .}$

For example, gas in a container with known volume, pressure, and temperature could have an enormous number of possible configurations of the individual gas molecules, and which configuration the gas is actually in may be regarded as random. Hence, entropy can be understood as a measure of molecular disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that decrement. Since entropy is a state function, the change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

${\displaystyle \Delta S=\int {\frac {\delta Q_{\text{rev}}}{T}}}$,
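The Boltzmann relation above can be made concrete with a short numeric sketch. A minimal example, assuming SI units; the microstate count `omega` is an illustrative, hypothetical value:

```python
import math

# Boltzmann constant in J/K (CODATA value)
K_B = 1.380649e-23

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k_B * ln(Omega) for Omega equally probable microstates."""
    return K_B * math.log(omega)

# Hypothetical microstate count for illustration:
omega = 1e25
s = boltzmann_entropy(omega)  # entropy in J/K
```

Because the logarithm turns products into sums, entropy computed this way is extensive: combining two independent subsystems multiplies their microstate counts, which adds their entropies.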

For a Carnot heat engine operating between a hot reservoir at temperature T_H and a cold reservoir at T_C, the work obtained from heat Q_H absorbed at the hot reservoir is

${\displaystyle W=\left({\frac {T_{H}-T_{C}}{T_{H}}}\right)Q_{H}=\left(1-{\frac {T_{C}}{T_{H}}}\right)Q_{H}}$

while conservation of energy requires

${\displaystyle W=Q_{H}-Q_{C}}$

where Q_C is the heat rejected to the cold reservoir. Combining these for the reversible cycle gives

${\displaystyle {\frac {Q_{H}}{T_{H}}}-{\frac {Q_{C}}{T_{C}}}=0}$
or
${\displaystyle {\frac {Q_{H}}{T_{H}}}={\frac {Q_{C}}{T_{C}}}}$

An irreversible engine produces less work than the Carnot limit,

${\displaystyle W<\left(1-{\frac {T_{C}}{T_{H}}}\right)Q_{H}}$

so that

${\displaystyle Q_{H}-Q_{C}<\left(1-{\frac {T_{C}}{T_{H}}}\right)Q_{H}}$
or
${\displaystyle Q_{C}>{\frac {T_{C}}{T_{H}}}Q_{H}}$

Writing S_H = Q_H/T_H for the entropy drawn from the hot reservoir and S_C = Q_C/T_C for the entropy delivered to the cold one, this means

${\displaystyle S_{H}-S_{C}<0}$
or
${\displaystyle S_{H}<S_{C}}$
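The Carnot inequalities above can be checked numerically. A minimal sketch, assuming SI units; the reservoir temperatures, heat input, and the 80% figure for the irreversible engine are hypothetical example values:

```python
def carnot_work(q_h: float, t_h: float, t_c: float) -> float:
    """Maximum (reversible) work from heat q_h drawn between reservoirs t_h > t_c."""
    return (1.0 - t_c / t_h) * q_h

# Hypothetical engine: 1000 J drawn from a 500 K reservoir, rejected at 300 K.
q_h, t_h, t_c = 1000.0, 500.0, 300.0
w_max = carnot_work(q_h, t_h, t_c)   # reversible (Carnot) work
q_c_rev = q_h - w_max                # heat rejected by the reversible engine

# For the reversible cycle, entropy in balances entropy out: Q_H/T_H == Q_C/T_C.
s_h = q_h / t_h
s_c_rev = q_c_rev / t_c

# An irreversible engine does less work, rejects more heat, and so S_H < S_C.
w_irrev = 0.8 * w_max                # assumed: 80% of the Carnot work
q_c_irrev = q_h - w_irrev
s_c_irrev = q_c_irrev / t_c
```

The extra entropy delivered to the cold reservoir (`s_c_irrev - s_h`) is exactly the entropy produced by the irreversibility.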
In statistical mechanics, the Gibbs entropy generalizes the Boltzmann formula to microstates of unequal probability p_i:

${\displaystyle S=-k_{\mathrm {B} }\sum _{i}p_{i}\log p_{i},}$

or, written as an expectation value over the microstates,

${\displaystyle S=-k_{\mathrm {B} }\mathbb {E} _{i}(\log p_{i}).}$

For a quantum system described by a density matrix ρ̂, the corresponding von Neumann entropy is

${\displaystyle S=-k_{\mathrm {B} }{\text{Tr}}\,({\widehat {\rho }}\log({\widehat {\rho }})),}$

and when all Ω microstates are equally probable these expressions reduce to the Boltzmann entropy

${\displaystyle S=k_{\mathrm {B} }\log \Omega .}$
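The reduction of the Gibbs formula to the Boltzmann form for equiprobable microstates is easy to verify directly. A minimal sketch, assuming SI units; the microstate counts are illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs) -> float:
    """Gibbs entropy S = -k_B * sum_i p_i * ln(p_i) over microstate probabilities."""
    # Terms with p_i == 0 contribute nothing (p ln p -> 0 as p -> 0).
    return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

# For Omega equally probable microstates, this reduces to S = k_B * ln(Omega):
omega = 8
uniform = [1.0 / omega] * omega
```

A non-uniform distribution over the same microstates always gives a smaller entropy, since the uniform distribution maximizes the Gibbs sum.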
For a closed system undergoing a reversible change, the fundamental thermodynamic relation connects entropy to the internal energy:

${\displaystyle dU=TdS-PdV}$

For an open system, an entropy balance accounts for entropy carried by mass and heat flows as well as for internal production:

${\displaystyle {\frac {dS}{dt}}=\sum _{k=1}^{K}{\dot {M}}_{k}{\hat {S}}_{k}+{\frac {\dot {Q}}{T}}+{\dot {S}}_{gen}}$

where
${\displaystyle \sum _{k=1}^{K}{\dot {M}}_{k}{\hat {S}}_{k}}$ = the net rate of entropy flow due to the flows of mass into and out of the system (where ${\displaystyle {\hat {S}}}$ = entropy per unit mass).
${\displaystyle {\frac {\dot {Q}}{T}}}$ = the rate of entropy flow due to the flow of heat across the system boundary.
${\displaystyle {\dot {S}}_{gen}}$ = the rate of entropy production within the system. This entropy production arises from processes within the system, including chemical reactions, internal matter diffusion, internal heat transfer, and frictional effects such as viscosity occurring within the system from mechanical work transfer to or from the system.
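The entropy balance above translates directly into a rate calculation. A minimal numeric sketch, assuming SI units; the function name and the steady-heater example values are hypothetical:

```python
def entropy_rate(mass_flows, s_per_mass, q_dot, t_boundary, s_gen):
    """dS/dt = sum_k Mdot_k * Shat_k + Qdot/T + Sdot_gen.

    mass_flows: signed mass flow rates Mdot_k (kg/s, positive into the system)
    s_per_mass: specific entropies Shat_k (J/(kg*K)) matching mass_flows
    q_dot:      heat flow rate across the boundary (W)
    t_boundary: temperature at which that heat crosses the boundary (K)
    s_gen:      entropy production rate inside the system (W/K), always >= 0
    """
    flow_term = sum(m * s for m, s in zip(mass_flows, s_per_mass))
    return flow_term + q_dot / t_boundary + s_gen

# Hypothetical heater: 0.5 kg/s enters at 1.0 kJ/(kg K) and leaves at 1.2 kJ/(kg K),
# with 100 kW of heat supplied across a 400 K boundary and 10 W/K generated inside.
ds_dt = entropy_rate([0.5, -0.5], [1000.0, 1200.0], 100e3, 400.0, 10.0)
```

At steady state `ds_dt` would be zero; a positive value here means the system's entropy inventory is growing.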
For n moles of an ideal gas, the entropy change in an isothermal expansion from (V_0, P_0) to (V, P) is

${\displaystyle \Delta S=nR\ln {\frac {V}{V_{0}}}=-nR\ln {\frac {P}{P_{0}}}.}$

For heating at constant pressure, with molar heat capacity C_P,

${\displaystyle \Delta S=nC_{P}\ln {\frac {T}{T_{0}}}}$,

and for heating at constant volume, with molar heat capacity C_v,

${\displaystyle \Delta S=nC_{v}\ln {\frac {T}{T_{0}}}}$.

For a general change of state these combine to

${\displaystyle \Delta S=nC_{v}\ln {\frac {T}{T_{0}}}+nR\ln {\frac {V}{V_{0}}}=nC_{P}\ln {\frac {T}{T_{0}}}-nR\ln {\frac {P}{P_{0}}}.}$
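Because ΔS is a state function, the (T, V) and (T, P) forms must agree for the same initial and final states. A minimal check, assuming a monatomic ideal gas (C_v = 3R/2, C_P = 5R/2) and hypothetical example states:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def delta_s_tv(n, cv, t0, t, v0, v):
    """Ideal-gas entropy change in (T, V) variables: n*Cv*ln(T/T0) + n*R*ln(V/V0)."""
    return n * cv * math.log(t / t0) + n * R * math.log(v / v0)

def delta_s_tp(n, cp, t0, t, p0, p):
    """Ideal-gas entropy change in (T, P) variables: n*Cp*ln(T/T0) - n*R*ln(P/P0)."""
    return n * cp * math.log(t / t0) - n * R * math.log(p / p0)

# One mole heated from 300 K to 600 K at constant pressure:
# the volume doubles (V/T fixed), and both forms give the same answer.
n, cv, cp = 1.0, 1.5 * R, 2.5 * R
ds_tv = delta_s_tv(n, cv, 300.0, 600.0, 1.0, 2.0)  # V doubles at fixed P
ds_tp = delta_s_tp(n, cp, 300.0, 600.0, 1.0, 1.0)  # P unchanged
```

Both evaluate to n·C_P·ln 2, as they must, since C_P = C_v + R for an ideal gas.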
At a phase transition the entropy change follows from the latent heat: for fusion at the melting point T_m,

${\displaystyle \Delta S_{\text{fus}}={\frac {\Delta H_{\text{fus}}}{T_{\text{m}}}},}$

and for vaporization at the boiling point T_b,

${\displaystyle \Delta S_{\text{vap}}={\frac {\Delta H_{\text{vap}}}{T_{\text{b}}}}.}$
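These ratios are simple to evaluate. A minimal sketch using standard textbook values for water (ΔH_vap ≈ 40.7 kJ/mol at 373.15 K, ΔH_fus ≈ 6.01 kJ/mol at 273.15 K):

```python
def phase_change_entropy(delta_h: float, t_transition: float) -> float:
    """Delta_S = Delta_H / T for a phase change at constant temperature T (K)."""
    return delta_h / t_transition

# Water, textbook values:
ds_vap = phase_change_entropy(40.7e3, 373.15)  # ~109 J/(mol K)
ds_fus = phase_change_entropy(6.01e3, 273.15)  # ~22 J/(mol K)
```

Vaporization produces far more entropy than fusion, reflecting the much larger gain in molecular disorder on going from liquid to gas than from solid to liquid.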
Entropy has also been used to define quantitative measures of "disorder" and "order", in terms of a system's disorder capacity C_D, order capacity C_O, and information capacity C_I:

${\displaystyle {\mbox{Disorder}}={C_{D} \over C_{I}}.\,}$
${\displaystyle {\mbox{Order}}=1-{C_{O} \over C_{I}}.\,}$
In information theory, the Shannon entropy of a random variable X taking values x_i with probability p(x_i) has the same mathematical form:

${\displaystyle H(X)=-\sum _{i=1}^{n}p(x_{i})\log p(x_{i}).}$

For W equally likely outcomes (p = 1/W) this reduces to

${\displaystyle H=-\sum _{i=1}^{W}p\log(p)=\log(W),}$

which matches Boltzmann's

${\displaystyle H=k\,\log(W)}$

up to the choice of the constant k.
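The correspondence between Shannon and Boltzmann entropy is just a choice of logarithm base and constant. A minimal sketch; the outcome count is an illustrative value:

```python
import math

def shannon_entropy(probs, base=math.e) -> float:
    """H(X) = -sum_i p(x_i) * log(p(x_i)); base 2 gives bits, base e gives nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# For W equally likely outcomes, H = log(W), mirroring Boltzmann's H = k log(W)
# with k = 1 (or k = k_B to recover thermodynamic units).
w = 16
h_bits = shannon_entropy([1.0 / w] * w, base=2)  # log2(16) = 4 bits
```

Multiplying the natural-log (nats) value by the Boltzmann constant converts this dimensionless information measure into a thermodynamic entropy in J/K.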
Entropy has variously been described as:

• a measure of energy dispersal at a specific temperature;
• a measure of disorder in the universe, or of the availability of the energy in a system to do work;
• a measure of a system's thermal energy per unit temperature that is unavailable for doing useful work.
• Entropy unit – a non-SI unit of thermodynamic entropy, usually denoted "e.u." and equal to one calorie per kelvin per mole, or 4.184 joules per kelvin per mole.
• Gibbs entropy – the usual statistical mechanical entropy of a thermodynamic system.
• Boltzmann entropy – a type of Gibbs entropy, which neglects internal statistical correlations in the overall particle distribution.
• Tsallis entropy – a generalization of the standard Boltzmann-Gibbs entropy.
• Standard molar entropy – the entropy content of one mole of substance under conditions of standard temperature and pressure.
• Residual entropy – the entropy present after a substance is cooled arbitrarily close to absolute zero.
• Entropy of mixing – the change in the entropy when two different chemical substances or components are mixed.
• Loop entropy – the entropy lost upon bringing together two residues of a polymer within a prescribed distance.
• Conformational entropy – the entropy associated with the physical arrangement of a polymer chain that assumes a compact or globular state in solution.
• Entropic force – a microscopic force or reaction tendency related to system organization changes, molecular frictional considerations, and statistical variations.
• Free entropy – an entropic thermodynamic potential analogous to the free energy.
• Entropic explosion – an explosion in which the reactants undergo a large change in volume without releasing a large amount of heat.
• Entropy change – a change in entropy dS between two equilibrium states is given by the heat transferred dQrev divided by the absolute temperature T of the system in this interval.
• Sackur-Tetrode entropy – the entropy of a monatomic classical ideal gas determined via quantum considerations.