Entropy


In statistical thermodynamics, entropy (usual symbol S) (Greek: Εντροπία, εν 'in' + τρέπω 'turn') is a measure of the number of microscopic configurations Ω that correspond to a thermodynamic system in a state specified by certain macroscopic variables. Specifically, assuming that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant kB (which provides consistency with the original thermodynamic concept of entropy discussed below, and gives entropy the dimension of energy divided by temperature). Formally,

    S = kB ln Ω
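
As a concrete illustration, consider a toy model of N independent two-state particles with n of them in the excited state, so that Ω is the binomial coefficient C(N, n). The following Python sketch (the model and the numbers N = 100, n = 30 are assumed purely for illustration) evaluates S = kB ln Ω directly:

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(omega: float) -> float:
        """Entropy S = kB * ln(omega) for omega equally probable microstates."""
        return K_B * math.log(omega)

    # Assumed toy system: 100 two-state particles, 30 of them excited,
    # so the number of microstates is the binomial coefficient C(100, 30).
    omega = math.comb(100, 30)
    print(f"Omega = {omega:.3e}, S = {boltzmann_entropy(omega):.3e} J/K")
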
For example, gas in a container with known volume, pressure, and temperature could have an enormous number of possible configurations of the individual gas molecules, and the particular configuration the gas actually occupies may be regarded as random. Hence, entropy can be understood as a measure of molecular disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided the entropy of their environment increases by at least as much. Since entropy is a state function, the change in entropy of a system is determined by its initial and final states, whether the process connecting them is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.
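
The state-function property can be made concrete with a standard textbook case: an ideal gas that doubles its volume by irreversible free expansion undergoes the same entropy change as it would along a reversible isothermal path between the same two states, namely ΔS = nR ln(V2/V1) > 0. A minimal Python sketch, assuming one mole of gas and a doubling of volume (values chosen only for illustration):

    import math

    R = 8.314  # ideal gas constant, J/(mol*K)

    def delta_s_isothermal(n_moles: float, v_initial: float, v_final: float) -> float:
        """Entropy change of an ideal gas between two states at the same temperature,
        evaluated along a reversible isothermal path: dS = n R ln(V2 / V1)."""
        return n_moles * R * math.log(v_final / v_initial)

    # Free expansion doubling the volume gives the same result, because entropy
    # depends only on the initial and final states, not on the path taken.
    print(f"Delta S = {delta_s_isothermal(1.0, 1.0, 2.0):.3f} J/K")  # ~5.763 J/K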

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

    ΔS = ∫ dQrev / T

where dQrev is an infinitesimal amount of heat transferred reversibly to the system and T is the absolute temperature at which the transfer takes place.
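
For a reversible isothermal process the integral reduces to ΔS = Qrev / T. A minimal numerical sketch, assuming the approximate textbook value of 334 kJ/kg for the latent heat of fusion of water, estimates the entropy change of melting one kilogram of ice at 273.15 K:

    LATENT_HEAT_FUSION = 334e3  # J/kg, approximate latent heat of fusion of water
    T_MELT = 273.15             # K, melting point of ice at atmospheric pressure

    mass = 1.0                          # kg of ice melted
    q_rev = mass * LATENT_HEAT_FUSION   # heat absorbed reversibly at constant T
    delta_s = q_rev / T_MELT            # entropy change, J/K
    print(f"Delta S = {delta_s:.0f} J/K")  # about 1223 J/K
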
...