# How does entropy relate to chaos theory?

##### 1 Answer

**Entropy** is in general a measure of "disorder". That's not a rigorous definition per se, but it's how entropy is generally described. A more concrete definition would be:

#color(blue)(DeltaS = int 1/T delq_"rev")#

where:

- #q_"rev"# is the reversible (i.e. most efficient) heat flow
- #T# is the temperature
- #S# is the entropy

Heat flow #q# is a **path**(-dependent) **function**. Entropy, however, is a path-independent function.
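To make that definition concrete, here is a small Python sketch (illustrative, not from the original answer) that numerically integrates #delq_"rev"//T# for reversibly heating a substance with an assumed constant heat capacity #C_p#, so that #delq_"rev" = C_p dT#, and compares it to the closed-form result #C_p ln(T_2//T_1)#:

```python
import math

# Sketch: entropy change for reversibly heating a substance from T1 to T2,
# where delq_rev = Cp dT and Cp is assumed constant over the range.
Cp = 75.3   # J/(mol*K), roughly liquid water (illustrative value)
T1, T2 = 300.0, 350.0
n_steps = 100_000

# Trapezoidal integration of dS = (Cp/T) dT from T1 to T2
dT = (T2 - T1) / n_steps
dS_numeric = sum(
    0.5 * (Cp / (T1 + i * dT) + Cp / (T1 + (i + 1) * dT)) * dT
    for i in range(n_steps)
)

dS_exact = Cp * math.log(T2 / T1)  # closed form: Cp ln(T2/T1)
print(dS_numeric, dS_exact)        # both ~11.6 J/(mol*K)
```

The numerical and analytical results agree, which is just the integral definition at work.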

**CHAOS THEORY**

**Chaos theory** basically states that a system in which *no randomness* is involved in generating future states can **still** be *unpredictable*. We do not need to get into the formal definition of what makes a system chaotic, because that is way outside the scope of the question.
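A standard toy illustration of "deterministic yet unpredictable" (not from the original answer) is the logistic map: a simple rule with no randomness at all, where two nearly identical starting points rapidly diverge. A minimal Python sketch:

```python
# Logistic map x_{n+1} = r*x*(1 - x): fully deterministic, no randomness,
# yet sensitive to initial conditions when r = 4 (the chaotic regime).
r = 4.0
x, y = 0.2, 0.2 + 1e-10   # two starting points differing by only 10^-10

max_gap = 0.0
for n in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if n >= 40:                    # by now the 1e-10 difference has exploded
        max_gap = max(max_gap, abs(x - y))

print(max_gap)  # order ~0.1-1: the two trajectories bear no resemblance
```

Knowing the rule exactly does not help you predict far ahead unless you also know the starting point to impossible precision.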

An example of a chaotic system is when you work with numbers in computer programming that are near **machine precision** (just borderline too small, basically); they will be extremely difficult to keep **entirely** unchanged, even if you are just trying to print out a specific small number (say, near #10^(-16)#).

So if you try to print

#2.7634757416249547xx10^(-16)#
#9.6239678259758971xx10^(-16)#
#7.2345079403769486xx10^(-16)#

...etc., the digits you get back may not be exactly the digits you put in. That makes this chaotic system unpredictable; you expect one exact value and can get something slightly different.
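The numbers above sit near double-precision machine epsilon (about #2.2xx10^(-16)#), which is why they are so fragile. A small Python illustration (not from the original answer) of how values at that scale get silently absorbed, depending on the order of operations:

```python
import sys

eps = sys.float_info.epsilon          # ~2.220446049250313e-16
print(eps)

# Adding half of machine epsilon to 1.0 is invisible: it rounds away.
print(1.0 + eps / 2 == 1.0)           # True

# The same tiny value survives or vanishes depending on operation order,
# even though the math is "identical" on paper.
tiny = 1e-16
print((tiny + 1.0) - 1.0)             # 0.0   -- tiny was absorbed
print(tiny + (1.0 - 1.0))             # 1e-16 -- tiny survived
```

So two algebraically equivalent computations can give different answers at this scale, which is the "hard to keep unchanged" behavior described above.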

**CHAOS THEORY VS. ENTROPY**

Essentially, the basic tenet of chaos theory that relates to entropy is the idea that **the system leans towards "disorder"**, i.e. something that is unpredictable. (This is NOT the same as the second law of thermodynamics.)

**This implies that the universe is a chaotic system.**

If you drop a bunch of non-sticky balls on the ground, you cannot guarantee that they will stay together AND fall onto the same exact spot each time, AND stay in place after falling. It is entropically favorable for them to separate from each other and scatter upon hitting the ground.

That is, you cannot predict *exactly* how they will fall.

Even if you made them stick to each other, the *balls system* **decreased** in entropy simply from falling and becoming a system *separate* from the human system, and the *human system* **decreased** in entropy when the balls left the person's hands.

Fewer microstates available to the system = smaller entropy for the system.

Additionally, the *universe* has now **increased** in entropy because the number of systems considered has *doubled* (you + balls). It's always accounted for in some way, somehow.
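The microstate counting above can be sketched with Boltzmann's formula #S = k_B ln W# (a standard relation, not stated in the original answer): fewer microstates #W# means smaller #S#, and combining two independent systems *multiplies* their microstate counts, which *adds* their entropies. The microstate counts below are purely illustrative:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def S(W):
    """Boltzmann entropy S = kB ln W for a system with W microstates."""
    return kB * math.log(W)

W_balls, W_person = 1e20, 1e24          # hypothetical microstate counts
S_balls, S_person = S(W_balls), S(W_person)

# Independent systems: microstate counts multiply, entropies add.
S_together = S(W_balls * W_person)
print(math.isclose(S_together, S_balls + S_person))  # True

# Fewer microstates => smaller entropy.
print(S(1e10) < S(1e20))  # True
```

This additivity is why the entropy bookkeeping still balances when the number of systems you consider doubles.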

**SO THEN HOW CAN ENTROPY BE A STATE FUNCTION, IF IT FOLLOWS CHAOS THEORY?**

**It has been proven before that entropy is a state function.**

That is, we can determine the initial and final state without worrying about the path used to get there. This is comforting because in a **chaotic system**, we cannot *necessarily* predict the final state.
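For an ideal gas, this path independence can be checked directly: between the same two states, #DeltaS = nC_v ln(T_2//T_1) + nR ln(V_2//V_1)# no matter the route. A Python sketch (with illustrative values, not from the original answer) comparing two different reversible paths between the same initial and final states:

```python
import math

R = 8.314          # gas constant, J/(mol*K)
Cv = 1.5 * R       # monatomic ideal gas heat capacity at constant volume
n = 1.0            # mol

T1, V1 = 300.0, 0.010   # initial state (K, m^3)
T2, V2 = 400.0, 0.020   # final state

# Path A: isothermal expansion at T1 (V1 -> V2), then isochoric heating (T1 -> T2)
dS_A = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

# Path B: isochoric heating at V1 (T1 -> T2), then isothermal expansion at T2
dS_B = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

print(math.isclose(dS_A, dS_B))  # True: same states, same DeltaS, either path
```

Both paths give the same #DeltaS# because entropy depends only on the endpoints, not on how you got between them.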

But if we **already know** the final state we want to get to (that is, we choose it ourselves), the state-function property of entropy allows us to assume that whatever path we used *doesn't matter*, **so long as it generates the *exact* final state we want.**

**Knowing the final state ahead of time overcomes the basic tenets of chaos theory.**