What is the definition of entropy?

2 Answers
May 13, 2016


Entropy by definition is the degree of randomness or disorder (chaos) in a system.



Here is a complete lesson about entropy; I hope you find it helpful:
Thermodynamics | Spontaneous Process & Entropy.

May 13, 2016

Entropy (#S#) is a measure of the number of ways the microstates in a system can arrange themselves to form a single observable macrostate.

That is, it is a quantity that describes the number of microscopic arrangement possibilities for a given system.


#\mathbf(S = k_B ln Omega)#


  • #Omega# is the number of microstates that collectively generate the same macrostate (observable).
  • #k_B = 1.3806xx10^(-23) "J/K"# is the Boltzmann constant.
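The Boltzmann formula above can be sketched in a few lines of Python (the function name and the sample #Omega# values are illustrative, not from the original answer):

```python
import math

k_B = 1.3806e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """Entropy from the number of microstates: S = k_B * ln(omega)."""
    return k_B * math.log(omega)

# A system with exactly one accessible microstate has zero entropy:
print(boltzmann_entropy(1))   # 0.0

# More microstates -> greater entropy:
print(boltzmann_entropy(10) < boltzmann_entropy(100))  # True
```

Note that because the logarithm grows slowly, even astronomically large #Omega# gives modest values of #S# in J/K.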

Take an ensemble (loosely-speaking, a group) of molecules that can be arranged in multiple ways.


The more ways you can arrange them, the more "disordered" they are. This corresponds with a greater #Omega# giving a greater entropy #S#.
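One way to make "more ways to arrange them" concrete is a toy lattice model: place #N# identical molecules on #M# distinct sites, so that #Omega = ((M),(N))#, the binomial coefficient. This model and the numbers below are an illustrative assumption, not part of the original answer:

```python
import math

k_B = 1.3806e-23  # Boltzmann constant, J/K

def lattice_entropy(sites, molecules):
    """Entropy of placing identical molecules on distinct lattice sites.

    Omega = C(sites, molecules), the number of distinct arrangements.
    """
    omega = math.comb(sites, molecules)
    return k_B * math.log(omega)

# Spreading 5 molecules over more sites allows more arrangements,
# hence a greater Omega and a greater entropy:
print(lattice_entropy(10, 5) < lattice_entropy(100, 5))  # True

# If every site is filled there is only one arrangement, so S = 0:
print(lattice_entropy(5, 5))  # 0.0
```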


A consistent thermodynamic definition of entropy is also:

#\mathbf(DeltaS >= (q)/T)#

(where #q_"rev"# is reversible, i.e. maximally efficient, heat flow and #q_"irr"# is irreversible, less efficient heat flow, with #q_"irr" < q_"rev"#. The generic #q# covers both cases; the equality #DeltaS = q_"rev"/T# holds only for reversible heat flow.)

So another way you can think about it is that for a given temperature:

The more the heat that you put into the system affects the microscopic arrangement of molecules, the more "disordered" the system is.
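As a worked example of #DeltaS = q_"rev"/T#, consider melting one mole of ice at its melting point. The heat of fusion used here (about #"6010 J/mol"# at #"273.15 K"#) is an assumed textbook value, not from the original answer:

```python
def entropy_change(q_rev, T):
    """DeltaS = q_rev / T for reversible heat flow at constant T (J/K)."""
    return q_rev / T

# Melting 1 mol of ice reversibly at 273.15 K absorbs roughly 6010 J
# (assumed enthalpy of fusion of water):
dS = entropy_change(6010, 273.15)
print(round(dS, 1))  # about 22.0 J/(mol K)
```

At the same #q#, a lower temperature gives a larger #DeltaS#: the same heat input "disorders" a cold system more than a hot one.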


This "disorder" is the definition of entropy you were introduced to in general chemistry, and as a rule of thumb it is greater for gases than for liquids, for instance.

Gases are more freely-moving than liquids, so gases can assume more microstates than the liquid phase of the same substance can. Thus, gases are more "disordered", and they have a higher entropy.