How would you define the entropy of a system?
We can look at it from another perspective:
Entropy can be defined as the capacity for a system to acquire motion due to a temperature change.
(The smaller the value, the more ordered the system.)
Here is what I mean. Let's say the temperature is close to absolute zero. Then almost all the particles in the system are nearly still. So, when you inject some heat into the system, it is easy to affect many particles and get them moving, even if just a little bit (thus generating many new microstates, each drastically different from the motionless one).
But what if the temperature was something like #"273 K"#? Then most of the particles are already in motion, so injecting the same amount of heat changes their motion, and the number of accessible microstates, comparatively little.
(There is a proof establishing that the average of macroscopic observations represents the underlying microscopic events; McQuarrie works it out in his Statistical Mechanics text.)
Mathematically, we can show this "capacity for a system to acquire motion due to a temperature change" like so:
#DeltaS = int_(T_1)^(T_2) C_P/T dT#

#= color(blue)(C_P ln(T_2/T_1))#

(The second step assumes #C_P# is constant over the temperature range.)
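As a quick sanity check, here is a minimal Python sketch (the function names are my own, and #C_P# is taken as constant, per the assumption above) that integrates #C_P//T# numerically and compares the result against the closed form:

```python
import math

def delta_S_numeric(C_P, T1, T2, n=100_000):
    """Trapezoid-rule integral of C_P/T dT from T1 to T2,
    assuming C_P is constant over the range."""
    dT = (T2 - T1) / n
    total = 0.0
    for i in range(n):
        Ta = T1 + i * dT
        Tb = Ta + dT
        total += 0.5 * (C_P / Ta + C_P / Tb) * dT
    return total

def delta_S_closed(C_P, T1, T2):
    """Closed form C_P * ln(T2/T1), valid only for constant C_P."""
    return C_P * math.log(T2 / T1)

C_P = 1.0  # J/K, an illustrative value
print(delta_S_numeric(C_P, 273.0, 274.0))  # ~0.0036630
print(delta_S_closed(C_P, 273.0, 274.0))   # ~0.0036630
```

The two results agree, which is just the statement that #int C_P/T dT = C_P ln(T_2/T_1)# when #C_P# does not vary with temperature.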
So, let's say the heat capacity was #C_P = "1 J/K"# (chosen so the entropy change is numerically just the logarithm). Compare heating by #"1 K"# near absolute zero with heating by #"1 K"# near #"273 K"#:
#DeltaS_("1 K")^("2 K") = ln(2/1) ~~ "0.693 J/K"#

#DeltaS_("273 K")^("274 K") = ln(274/273) ~~ "0.00366 J/K"#
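To put a number on how different these are, here is a short Python sketch (again assuming the illustrative #C_P = "1 J/K"# from above):

```python
import math

C_P = 1.0  # J/K -- the illustrative heat capacity assumed in the text

# The same 1 K of heating, applied near absolute zero vs. near 273 K
dS_cold = C_P * math.log(2 / 1)      # 1 K -> 2 K
dS_warm = C_P * math.log(274 / 273)  # 273 K -> 274 K

print(f"1 K -> 2 K:     dS = {dS_cold:.5f} J/K")  # ~0.69315
print(f"273 K -> 274 K: dS = {dS_warm:.5f} J/K")  # ~0.00366
print(f"ratio: {dS_cold / dS_warm:.0f}x")         # ~189x larger near 0 K
```

The same #"1 K"# of heating produces roughly 189 times the entropy change near absolute zero as it does near #"273 K"#.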
So, the entropy changes less per unit temperature at higher temperatures. Macroscopically speaking, the less the entropy changes per unit temperature, the less capacity the system has to acquire new motion, because it is already more disordered.