Give a brief description of entropy in physics?

1 Answer
Feb 11, 2018

Originally, entropy was used to describe the waste heat, or loss of usable energy, from heat engines and other mechanical devices, which convert energy into work and can never run at 100% efficiency.
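
For reference, this classical (Clausius) picture defines the change in entropy of a system in terms of heat exchanged reversibly at temperature T:

```latex
\Delta S = \int \frac{\delta Q_{\text{rev}}}{T}
```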

Entropy is a measure of the disorder or randomness of a system.

The term "disorder" was first used in this context in the late 19th century by Boltzmann, who applied probability theory to molecular motion at the microscopic level and developed the statistical view of entropy. The higher the disorder, the higher the entropy of the system.
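
Boltzmann's statistical definition makes this precise: the entropy of a system is proportional to the logarithm of the number of microstates W consistent with its macroscopic state, with k_B being Boltzmann's constant:

```latex
S = k_B \ln W
```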

More recently, chemistry and physics textbooks have described entropy as energy dispersal: the spreading of energy among, and by, energetic particles.

The concept of entropy has also been applied in information theory as Shannon entropy, a measure of the unpredictability, or information content, of a message source.
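
As a minimal illustrative sketch (the function name and example distributions are my own, not from the answer above), Shannon entropy of a discrete probability distribution is H = -Σ p_i log2(p_i), measured in bits:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally unpredictable (1 bit); a biased coin carries less entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```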

Other applications of the concept of entropy, and of the mathematics developed in statistical thermodynamics, include:

  1. Entropy in encoding.
  2. Entropy in computing.
  3. Entropy in astrophysics.
  4. Entropy in anesthesiology.
  5. Entropy of proteins.
  6. Entropy in ecology.
  7. Social entropy.
  8. Entropy in evolution of life.