What is the definition of entropy?
Entropy by definition is the degree of randomness or disorder (chaos) in a system.
Here is a complete lesson about entropy that I hope you find helpful: Thermodynamics | Spontaneous Process & Entropy.
That is, it is a quantity that describes the number of microscopic arrangement possibilities for a given system.
STATISTICAL MECHANICS DEFINITION
#\mathbf(S = k_B ln Omega)#
#Omega# is the number of microstates that collectively generate the same macrostate (observable).
#k_B = 1.3806xx10^(-23) "J/K"# is the Boltzmann constant.
Take an ensemble (loosely-speaking, a group) of molecules that can be arranged in multiple ways.
The more ways you can arrange them, the more "disordered" they are. This corresponds to a greater number of microstates #Omega#, and thus a greater entropy #S#.
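As a rough numerical sketch of #S = k_B ln Omega# (the lattice-counting model and the numbers here are illustrative assumptions, not part of the definition itself), you can count arrangements combinatorially and see that more arrangements means more entropy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(omega: int) -> float:
    """Statistical entropy S = k_B * ln(Omega) for Omega microstates."""
    return K_B * math.log(omega)

# Hypothetical toy ensemble: 4 molecules spread over 10 lattice sites.
# Omega counts the ways to choose which sites are occupied.
omega = math.comb(10, 4)          # 210 arrangements
print(boltzmann_entropy(omega))   # more arrangements -> larger S
```

Note that a single arrangement (#Omega = 1#) gives #S = k_B ln 1 = 0#, consistent with a perfectly ordered system.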
A consistent thermodynamic definition of entropy is the Clausius inequality:
#\mathbf(DeltaS >= q/T)#
where the equality holds for a reversible process, i.e. #DeltaS = q_"rev"/T#.
So another way you can think about it is that for a given temperature:
The more the heat you put into the system changes the microscopic arrangement of its molecules, the more "disordered" the system becomes, and the larger the entropy change.
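For a concrete reversible case (#DeltaS = q_"rev"/T#), consider melting one mole of ice at its melting point; the heat of fusion of about #6010 "J/mol"# used here is a standard textbook value, not from the answer above:

```python
def entropy_change(q_rev: float, T: float) -> float:
    """Reversible entropy change: DeltaS = q_rev / T (equality case of DeltaS >= q/T)."""
    return q_rev / T

# Melting 1 mol of ice: q_rev ~ 6010 J absorbed at a constant T = 273.15 K
dS = entropy_change(6010.0, 273.15)
print(round(dS, 1))  # ~22.0 J/(mol*K)
```

The entropy increases because the absorbed heat lets the molecules access the more numerous arrangements of the liquid phase.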
GENERAL CHEMISTRY DEFINITION
This "disorder" is a definition of entropy you were introduced to in general chemistry, and is generalized to be greater for gases than for liquids, for instance.
Gases are more freely-moving than liquids, so gases can assume more microstates than the liquid phase of the same substance can. Thus, gases are more "disordered", and they have a higher entropy.