Entropy
Second Law of Thermodynamics
 
 
In thermodynamics, entropy (usual symbol S) is a measure of the number of microscopic configurations that correspond to a thermodynamic system in a state specified by certain macroscopic variables. For example, gas in a container with known volume, pressure, and temperature could have an enormous number of possible configurations of the individual gas molecules, and which configuration the gas is actually in may be regarded as random. Hence, entropy can be understood as a measure of molecular disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that increment. Since entropy is a state function, the change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.
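This counting picture is made quantitative by Boltzmann's formula, S = k_B ln W, where W is the number of microscopic configurations consistent with the macroscopic state and k_B is the Boltzmann constant. As a rough illustration, if an ideal gas of N molecules freely expands to twice its volume, each molecule has roughly twice as many accessible positions, so W grows by a factor of about 2^N and the entropy increases by ΔS = N k_B ln 2.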

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

ΔS = ∫ δQ_rev / T,

where T is the absolute temperature of the system and δQ_rev is an infinitesimal amount of heat transferred reversibly to the system.
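As a worked example, consider one mole of an ideal gas expanding reversibly and isothermally from volume V₁ to V₂ at temperature T. The heat absorbed is Q_rev = nRT ln(V₂/V₁), so ΔS = Q_rev / T = nR ln(V₂/V₁); for a doubling of volume this gives ΔS = R ln 2 ≈ 5.76 J/K, in agreement with the statistical result N k_B ln 2 for one mole of molecules.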