Entropy (general concept)

Entropy
    Common symbols:    S
    SI unit:           joules per kelvin (J K⁻¹)
    In SI base units:  kg m² s⁻² K⁻¹

In statistical mechanics, entropy (usual symbol S) is related to the number of microscopic configurations Ω (microstates) that a thermodynamic system can have when in a state as specified by some macroscopic variables. Specifically, assuming for simplicity that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant kB. Formally,

    S = kB ln Ω

This is consistent with 19th-century formulas for entropy in terms of heat and temperature, as discussed below. Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature.
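
As a numerical illustration, here is a minimal Python sketch of this relation; the helper name boltzmann_entropy is illustrative, not from any library, and Ω is supplied by the caller under the equal-probability assumption:

```python
import math

# Boltzmann constant in J/K (exact since the 2019 SI redefinition).
K_B = 1.380649e-23

def boltzmann_entropy(num_microstates: float) -> float:
    """Entropy S = k_B * ln(Omega) for Omega equally probable microstates."""
    if num_microstates < 1:
        raise ValueError("Omega must be at least 1")
    return K_B * math.log(num_microstates)

# Example: a system of 100 independent two-state particles has
# Omega = 2**100 microstates.
print(boltzmann_entropy(2**100))  # ~9.57e-22 J/K
```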

For example, a gas in a container with known volume, pressure, and energy has an enormous number of possible configurations of its individual molecules. At equilibrium, each instantaneous configuration of the gas may be regarded as random. Entropy may be understood as a measure of disorder within a macroscopic system. The second law of thermodynamics states that the entropy of an isolated system never decreases spontaneously: such systems evolve towards thermodynamic equilibrium, the state of maximum entropy. Non-isolated systems may lose entropy, provided the entropy of their environment increases by at least that amount. Since entropy is a function of the state of the system, the change in entropy between two states is determined by those states alone, whether the process connecting them is reversible or irreversible. Irreversible processes, however, increase the combined entropy of the system and its environment.
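
The container example can be made concrete with a toy counting model. In the sketch below, the lattice-gas setup and the helper lattice_gas_entropy are illustrative assumptions: N indistinguishable particles each occupy at most one of M cells, so Ω = C(M, N), and doubling the number of accessible cells (a free expansion into twice the volume) raises S, consistent with the second law for an isolated system:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def lattice_gas_entropy(num_cells: int, num_particles: int) -> float:
    """S = k_B * ln(Omega) with Omega = C(num_cells, num_particles):
    the number of ways to place indistinguishable particles into cells,
    at most one per cell (a crude hard-core lattice gas)."""
    omega = math.comb(num_cells, num_particles)
    return K_B * math.log(omega)

n = 50                                  # particles
s_small = lattice_gas_entropy(100, n)   # initial volume: 100 cells
s_large = lattice_gas_entropy(200, n)   # doubled volume: 200 cells
print(s_large > s_small)                # True: expansion increases entropy
# Delta-S is fixed by the initial and final states alone, regardless of
# how the expansion was carried out (entropy is a state function).
print(s_large - s_small)
```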

