Entropy is a fundamental concept in thermodynamics that serves as a measure of the amount of disorder, randomness, or uncertainty in a system. It also quantifies the amount of thermal energy in a system that is not available to do useful work. The concept is central to the Second Law of Thermodynamics, which defines the "arrow of time" and explains why natural processes are irreversible.
In classical thermodynamics, the change in entropy (ΔS) of a system undergoing a reversible process is defined as the amount of heat (Q) added to or removed from the system, divided by the absolute temperature (T) at which the transfer occurs.
Mathematical Formulation:
ΔS = Q_rev / T
where ΔS is the change in entropy (in J/K), Q_rev is the heat transferred reversibly (in J), and T is the absolute temperature (in K).
Entropy is a state function, meaning its value depends only on the current state of the system, not on the path taken to reach that state.
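The definition ΔS = Q_rev / T can be turned into a small calculation. The sketch below is illustrative only; the numbers (the latent heat of fusion of ice, ~334,000 J/kg, at 273.15 K) are standard textbook values, and the function name is our own:

```python
# Entropy change for a reversible process at constant temperature:
# ΔS = Q_rev / T. Numbers below are illustrative textbook values.

def entropy_change(q_joules, temp_kelvin):
    """Return ΔS (J/K) for heat q_joules transferred reversibly at temp_kelvin."""
    if temp_kelvin <= 0:
        raise ValueError("Absolute temperature must be positive")
    return q_joules / temp_kelvin

# Melting 1 kg of ice at 273.15 K absorbs about 334,000 J (latent heat of fusion):
delta_s = entropy_change(334_000, 273.15)
print(f"ΔS = {delta_s:.1f} J/K")  # about 1222.8 J/K
```

Because melting occurs at a single fixed temperature, the simple ratio Q/T applies directly; heating across a temperature range would require integrating dQ/T instead.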
Entropy is often described as a measure of disorder. A system with a high degree of randomness and many possible microscopic arrangements has high entropy.
The natural tendency of systems is to move from states of lower probability (order) to states of higher probability (disorder), which corresponds to an increase in entropy.

An increase in temperature increases the average kinetic energy of the molecules, causing them to move more rapidly and randomly. This greater molecular agitation increases the disorder of the system and therefore increases its entropy. Conversely, cooling a substance reduces molecular motion and decreases entropy (e.g., liquid water freezing into ice becomes more ordered).
Key point (SLO P-11-C-18): Higher temperature → greater molecular motion → greater disorder → higher entropy.
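The temperature-entropy link above can be made quantitative. For heating a substance with no phase change, integrating dS = m·c·dT / T gives ΔS = m·c·ln(T2/T1). The sketch below assumes water's specific heat (≈ 4186 J/(kg·K)) and temperatures chosen purely for illustration:

```python
import math

# Entropy change when heating a substance (no phase change), from
# integrating dS = m*c*dT / T:  ΔS = m * c * ln(T2 / T1).
# Specific heat of water, c ≈ 4186 J/(kg·K), used as an example.

def heating_entropy_change(mass_kg, specific_heat, t1_kelvin, t2_kelvin):
    return mass_kg * specific_heat * math.log(t2_kelvin / t1_kelvin)

# Heating 1 kg of water from 293 K (20 °C) to 353 K (80 °C):
ds = heating_entropy_change(1.0, 4186, 293, 353)
print(f"ΔS = {ds:.0f} J/K")  # positive: the hotter water is more disordered
```

Note that ΔS is positive whenever T2 > T1 and negative when the substance cools, matching the key point above.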
The Second Law of Thermodynamics can be stated in terms of entropy:
Statement: The total entropy of an isolated system can never decrease over time; it either stays constant or increases.
This can be expressed mathematically as:
ΔS_universe ≥ 0
where equality holds only for idealized reversible processes; all real (irreversible) processes produce a strict increase.
This law dictates the direction of natural events. Heat spontaneously flows from hot to cold because this process increases the total entropy of the universe.
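This direction can be checked with arithmetic. Moving heat Q from a hot reservoir at T_hot to a cold one at T_cold changes the total entropy by ΔS_total = Q/T_cold − Q/T_hot, which is positive whenever T_hot > T_cold. The numbers below are illustrative:

```python
# Why heat flows from hot to cold: transferring heat Q from a hot
# reservoir (T_hot) to a cold one (T_cold) changes total entropy by
#   ΔS_total = Q/T_cold - Q/T_hot,
# which is positive whenever T_hot > T_cold.

def total_entropy_change(q, t_hot, t_cold):
    ds_hot = -q / t_hot      # hot reservoir loses heat, entropy decreases
    ds_cold = q / t_cold     # cold reservoir gains heat, entropy increases
    return ds_hot + ds_cold

ds = total_entropy_change(1000, 400, 300)   # 1000 J flowing from 400 K to 300 K
print(f"ΔS_total = {ds:.2f} J/K")  # 0.83 J/K > 0, so the flow is spontaneous
```

Running the same transfer in reverse (cold to hot) would give a negative total, which is why it never happens spontaneously.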
All isolated systems naturally evolve toward states of greater disorder. This is because disordered (high-entropy) states are statistically far more probable than ordered (low-entropy) states. For example, a gas released into one corner of a room quickly spreads to fill it, because the spread-out arrangements vastly outnumber the confined ones.
This tendency is captured by the inequality ΔS_universe ≥ 0.
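The statistical argument can be sketched by counting microstates. For N two-state particles (say, gas molecules in the left or right half of a box), the number of microstates with k particles on the left is the binomial coefficient C(N, k). The setup below is a toy model, not a full statistical-mechanics treatment:

```python
import math

# Counting microstates for N particles, each in the left or right half
# of a box. "Mixed" macrostates have vastly more microstates than
# "ordered" ones, so they are overwhelmingly more probable.

N = 100
ordered = math.comb(N, 0)        # all 100 particles on one side: 1 way
mixed = math.comb(N, N // 2)     # 50 on each side: C(100, 50) ways

print(f"ordered microstates: {ordered}")
print(f"mixed microstates:   {mixed}")
print(f"ratio: {mixed / ordered:.3e}")  # on the order of 1e29
```

Even at only 100 particles the mixed state is about 10^29 times more probable; for a mole of gas (~10^23 particles) the disparity is so extreme that spontaneous un-mixing is never observed.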
As entropy increases, energy becomes less available to do useful work. This is called the degradation of energy.
Q: What is the "heat death" of the universe?
A: The "heat death" is a hypothetical end-state of the universe where it has reached maximum entropy. In this state, all energy would be uniformly distributed, and there would be no temperature differences. Consequently, heat could no longer flow, no work could be done, and all thermodynamic processes would cease.
Q: Can the entropy of a system ever decrease?
A: Yes, the entropy of a system can decrease, but only if the entropy of its surroundings increases by an equal or greater amount. For example, when water freezes into ice, the water itself becomes more ordered (entropy decreases), but this process releases heat into the surroundings, increasing the surroundings' disorder. The total entropy of the universe still increases.
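The freezing example in the answer above can be checked numerically. The sketch below assumes standard values (latent heat of fusion ≈ 334,000 J/kg, freezing at 273.15 K) and an illustrative surroundings temperature of 263.15 K (a freezer at −10 °C); the function name is our own:

```python
# Freezing water: the system's entropy drops, but the released latent
# heat raises the surroundings' entropy by more, so the total increases.
# Latent heat of fusion ~334,000 J/kg; surroundings at 263.15 K (-10 °C).

def freezing_entropy_budget(mass_kg, t_freeze=273.15, t_surr=263.15,
                            latent_heat=334_000):
    q = mass_kg * latent_heat            # heat released by the freezing water
    ds_system = -q / t_freeze            # water becomes more ordered
    ds_surroundings = q / t_surr         # surroundings absorb the heat
    return ds_system, ds_surroundings, ds_system + ds_surroundings

ds_sys, ds_surr, ds_total = freezing_entropy_budget(1.0)
print(f"system: {ds_sys:.1f} J/K, surroundings: {ds_surr:+.1f} J/K, "
      f"total: {ds_total:+.1f} J/K")  # total comes out positive
```

The surroundings gain more entropy than the water loses precisely because they are colder: the same Q divided by a smaller T yields a larger ΔS, so the total for the universe still increases.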