How would you define entropy of a system?

1 Answer
Dec 15, 2015

We can look at it from another perspective:

Entropy can be defined as the capacity for a system to acquire motion due to a temperature change.

(The smaller the value, the more ordered the system.)

Here is what I mean. Let's say the temperature were close to absolute zero. Then almost all the particles in the system would be nearly motionless. So, when you inject some heat into the system, it is easy to affect many particles and get them moving, even if just a little bit (thus generating many new microstates, each drastically different from the nearly motionless starting microstate).

But what if the temperature were something like #0^@ "C"# (#"273 K"#)? Many of the particles are already moving fairly quickly. So, adding the same amount of heat as in the near-absolute-zero example would change the motion of each particle by an amount that is, macroscopically speaking, quite small (the system already occupies many similar microstates, all vibrating similarly, so the negligible heat added per particle changes those microstates only negligibly).

(One can prove that such macroscopic observations correspond to averages over microscopic states; see, for example, McQuarrie's treatment of statistical mechanics.)

Mathematically, we can show this "capacity for a system to acquire motion due to a temperature change" like so:

#DeltaS = int_(T_1)^(T_2) (C_P)/TdT#

#= color(blue)(C_P ln((T_2)/(T_1)))#

(assuming #C_P# is constant over the temperature interval)
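If it helps, here is a quick symbolic check of that step (a minimal sketch in Python using sympy, not part of the original answer): it confirms that integrating #C_P//T# with #C_P# held constant gives #C_P ln((T_2)/(T_1))#.

```python
# Symbolic check that int_(T1)^(T2) C_P/T dT = C_P ln(T2/T1)
# when C_P is treated as a constant (the assumption noted above).
import sympy as sp

T, T1, T2, Cp = sp.symbols("T T_1 T_2 C_P", positive=True)

dS = sp.integrate(Cp / T, (T, T1, T2))
print(dS)  # C_P*log(T_2) - C_P*log(T_1), i.e. C_P ln(T_2/T_1)
```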

So, let's say the specific heat capacity is #"1 J/g"cdot"K"#. Then, let's compare going from #"1 K"# to #"2 K"# with going from #"273 K"# to #"274 K"#.

#DeltaS_("1 K")^("2 K") = ln\frac(2)(1) ~~ 0.693#

#DeltaS_("273 K")^("274 K") = ln\frac(274)(273) ~~ 0.00366#

So, for the same temperature increase, the entropy changes far less at higher temperatures. Macroscopically speaking, the smaller the entropy change per unit temperature, the less readily the system acquires new motion, because it is already more disordered.