Wednesday, September 17, 2014

On Entropy

               One of the troubles I find with science education in society is that we think we understand things because we've oversimplified them. This is one of my biggest qualms with most things I hear from non-scientists, or even from a good number of physical science teachers. One of the biggest oversimplifications I hear is the idea that entropy = disorder. I can't remember how many dozens of times I have heard this and then seen people take it and run with it. I've even seen some chemistry teachers post videos on YouTube preaching this idea in completely the wrong context. I have seen quite a few videos lately, as well as this gem that has been floating around Facebook. While entropy does have to do with disorder in a sense of the word, the common understanding of disorder (like a messy room) has nothing to do with entropy. So I've decided to clear up the misconception and attempt to stop people from associating the word entropy with disorder. If you get lost in the middle you can go ahead and skip to the bottom few paragraphs, since I'm sure no one is actually dedicated enough to read this whole post (if you understand entropy you have no need; if you don't, you probably have no desire).
                First let me explain the simple example of disorder that supposedly relates to entropy (the one that I've heard most). Think of a room full of nicely organized things; over time you use the stuff without putting it back. You wear your socks and throw them on the floor, pens and pencils end up lying randomly all over the desk, and your bed is no longer nicely made. Many people would call this a high state of entropy because of the disorder of the things in the room: since the stuff is not neatly organized, it must have high entropy. This actually has nothing to do with the real definition of entropy. Mathematically (which is really the best way to describe it), the entropy of a system is a measure of how many microstates are possible within a single macrostate. For those of us who don't speak geek, I will explain. A macrostate describes the overall contents of the system, and a microstate describes how those contents are arranged. Let me give an example so it makes sense. If I have 6 coins on a table there are 7 possible macrostates: all 6 coins heads up, 5 coins heads up and 1 tails up, 4 heads up and 2 tails up, 3 heads up and 3 tails up, and so on down to all 6 tails up. The microstate then describes exactly which of the coins are heads up and which are tails up. Now that we have that all covered, let's talk about the entropy of this system.
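If you want to check those counts yourself, here is a quick Python sketch (my own addition; any language would do) that enumerates every flip pattern for the 6 coins and tallies them by macrostate:

    from collections import Counter
    from itertools import product

    # Enumerate all 2**6 = 64 microstates (full heads/tails patterns) of
    # 6 distinct coins, grouping them by macrostate (total number of heads).
    counts = Counter(pattern.count("H") for pattern in product("HT", repeat=6))

    for heads in sorted(counts):
        print(f"{heads} heads up: {counts[heads]} microstates")
    # 7 macrostates in all; 3 heads up has the most microstates (20 of them).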
                Like I said before, entropy is a measure of how many possible microstates there are. High entropy corresponds to a high number of microstates, and low entropy to a low number of microstates. To make things easier to follow I'll take the number of coins down to four, and let's say they're all different (a penny, a nickel, a dime, and a quarter). Let's say I'm OCD and I like to have all my coins facing heads up on the table. I have created a system with very low entropy, statistically. That is because there is only one way to realize the macrostate of all coins heads up, so the number of possible microstates in this macrostate is one. Now let's say I have a really noisy neighbor with a huge subwoofer that shakes my table while I'm gone, so the coins keep flipping on the table all day. If I were to bet on how the coins would end up arranged, I would bet on two coins heads up and two coins tails up. Why? Because two heads up is the macrostate with the greatest number of possible microstates, which means it has the highest level of entropy in the statistical sense of the word. Let's look at it closer; I've made a table to make it easier to follow. You should note that there are 6 different ways of arranging these coins with two coins heads up and two coins tails up. That's 5 more microstates, or ways of arranging the coins, than having them all heads up or all tails up! So while I would know which macrostate to bet on, I would also know not to bet on any particular microstate, since those odds would be much lower.

    Penny   Nickel   Dime    Quarter
    Heads   Heads    Tails   Tails
    Heads   Tails    Heads   Tails
    Heads   Tails    Tails   Heads
    Tails   Heads    Heads   Tails
    Tails   Heads    Tails   Heads
    Tails   Tails    Heads   Heads
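Here is a quick way to double-check that table (the same kind of Python sketch as before), listing every microstate of the two-heads-up macrostate:

    from itertools import product

    coins = ("penny", "nickel", "dime", "quarter")

    # Print every arrangement of the four coins with exactly two heads up.
    for pattern in product("HT", repeat=4):
        if pattern.count("H") == 2:
            print(dict(zip(coins, pattern)))
    # Prints the same 6 arrangements as the table above.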

Just to convince you that this really is the macrostate with the highest statistical entropy, let's look at the number of ways you can arrange the coins with three coins heads up. This macrostate of the system has 4 possible microstates. That's still more ways to arrange the coins than all heads up, but fewer than two heads up and two tails up.

    Penny   Dime    Nickel   Quarter
    Heads   Heads   Heads    Tails
    Heads   Tails   Heads    Heads
    Heads   Heads   Tails    Heads
    Tails   Heads   Heads    Heads
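As an aside, this microstate count W is exactly what appears in Boltzmann's entropy formula, S = k·ln(W). A short sketch computing it for each four-coin macrostate:

    from math import comb, log

    K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

    # S = k_B * ln(W), where W is the number of microstates in a macrostate.
    for heads in range(5):
        w = comb(4, heads)  # ways to choose which coins land heads up
        print(f"{heads} heads up: W = {w}, S = {K_B * log(w):.2e} J/K")
    # W peaks at 6 for two heads up, so that macrostate has the highest entropy.

The numbers come out absurdly small because the Boltzmann constant is built for systems of ~10^23 particles, not four coins, but the ranking of the macrostates is the point.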

This phenomenon also shows up in the statistics of flipping a single coin many times. If you flip a coin fifty times, you should find that on average you get 25 heads and 25 tails. The macrostate here is flipping a coin 50 times and getting 25 heads and 25 tails; the microstate is the exact order in which those heads and tails came up. Flipping 25 heads in a row wouldn't defy statistics, because each flip has a 50% chance of landing either way and the previous flip has no effect on the next. It would be a rare thing to see, however, because 25 heads followed by 25 tails is just one of about 126 trillion different orderings that produce a 50/50 split over 50 flips. It is true that 50 flips won't come out exactly half heads every time, but if one were to average all the unbiased coin flips in the world, it should come out so close to 50% as to be, in practical language, exactly 50%.
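Here is a quick check of those numbers (a Python sketch, my addition):

    from math import comb

    flips, heads = 50, 25

    orderings = comb(flips, heads)      # microstates of the 25/25 macrostate
    print(f"{orderings:,} orderings")   # 126,410,606,437,752 (~126 trillion)

    # Any one specific ordering, including 25 heads in a row then 25 tails,
    # is equally likely:
    print(f"P(one specific ordering) = {0.5 ** flips:.2e}")

    # But taken together, the 25/25 macrostate is the single most likely one:
    print(f"P(25 heads in 50 flips)  = {orderings * 0.5 ** flips:.3f}")  # ~0.112

Now, that is just a statistical explanation of entropy; how does this apply in the real world?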
Well, it is easiest to see in an ideal gas system. This is a system made of gas particles that act like billiard balls in a container: no gravity, no air resistance, no friction, and purely elastic collisions (the system does not lose any energy in the collisions). Let's say I have my billiard balls arranged so that there are slow-moving balls on one side of the container and fast-moving balls on the other side, with some kind of magical barrier in between. Before we continue with this box example, I can tell you that within each side the balls will settle toward one common average speed. How do I know? Because that is the state with the highest entropy, and the second law of thermodynamics says the entropy of an isolated system never decreases; it keeps increasing until the system reaches maximum entropy. This maximum-entropy state is often referred to as thermodynamic equilibrium. When one ball collides with another it transfers some of its kinetic energy to the other ball. After a long time all of the balls will have collided with each other many times, and each collision essentially shares kinetic energy between balls, so eventually they reach a point where they all have the same average kinetic energy. If that is true, then there are many ways of arranging this system, since any ball could be swapped into any other ball's place. If only one ball had all the kinetic energy, then that ball would be the only one I could rearrange, which limits my number of microstates.
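Here's a toy simulation of this energy sharing (my own sketch in Python, not a real physics engine): I start half the balls slow and half fast, as if the barrier we're about to discuss had just been removed, and let randomly chosen pairs share their kinetic energy.

    import random

    def collide(energies, i, j):
        # Split the pair's total kinetic energy randomly between the two
        # balls; the total is conserved, as in an elastic collision.
        total = energies[i] + energies[j]
        share = random.random()
        energies[i] = share * total
        energies[j] = (1 - share) * total

    random.seed(0)
    n = 1000
    # Left half slow (1 unit of energy each), right half fast (9 units each).
    energies = [1.0] * (n // 2) + [9.0] * (n // 2)

    for _ in range(200_000):
        i, j = random.sample(range(n), 2)  # any pair of balls may collide
        collide(energies, i, j)

    left = sum(energies[: n // 2]) / (n // 2)
    right = sum(energies[n // 2:]) / (n // 2)
    print(f"left-half average:  {left:.2f}")   # ~5, the same as...
    print(f"right-half average: {right:.2f}")  # ...~5: one shared average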
So now we go back to the box with the magical barrier. Let's say there are x possible microstates for each side of the box. Since each side rearranges independently, the two sides together give x times x arrangements, and on top of that I can swap which side holds the fast balls and which holds the slow balls, for 2·x·x total microstates. Now I take the barrier out; what happens? The fast balls start colliding with the slow balls, and after a certain amount of time all the balls have the same average speed again. That average speed is faster than the original slow balls' average, but slower than the original fast balls' average. Now any ball could be rearranged to replace any other ball in the box, which corresponds to high entropy. And why would we call this disorder? Because we know very little about this system: all I can tell you is the average speed of the balls in the box. Before I took the barrier away I could tell you a bit more, namely the two different average speeds (the slow and the fast).
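You can put rough numbers on that claim with a toy count: treat the kinetic energy as indivisible "quanta" spread over the balls (an Einstein-solid-style count; every number below is invented purely for illustration):

    from math import comb

    def microstates(quanta, balls):
        # ways to distribute identical quanta over distinguishable balls
        return comb(quanta + balls - 1, balls - 1)

    n_side = 10                 # balls per side of the barrier
    slow_q, fast_q = 10, 40     # energy quanta on the slow and fast sides

    # Barrier in: each side rearranges independently (x times x), and the
    # fast and slow halves could also be swapped side-for-side (times 2).
    with_barrier = 2 * microstates(slow_q, n_side) * microstates(fast_q, n_side)

    # Barrier out: all 50 quanta roam freely over all 20 balls.
    without_barrier = microstates(slow_q + fast_q, 2 * n_side)

    print(f"{with_barrier:,}")     # the smaller count
    print(f"{without_barrier:,}")  # over a hundred times larger: entropy went up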
Now let's rewind time even further and say that initially only one ball on each side of the barrier was moving. Then I can tell you that exactly two balls are moving at some exact speed, and the rest of the balls are at rest. You see, initially I had a highly "ordered" system, because only two balls had any kinetic energy and I knew exactly what their energy was. After some time the balls bounced around and collided with other balls, making my information about the system vaguer. I no longer know anything about any one particular ball, but I can split the box in two and know that the balls in each half have some average speed. Once I take away the barrier I know even less about the system: only that all the balls share one average speed. One could conclude, then, that as entropy increases, the knowledge one has about a particular system decreases.
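One way to make that loss of knowledge concrete: log2(W) is the number of yes/no questions you would need to pin down the exact microstate, so the more microstates, the more you would have to ask. Reusing the toy quanta count from above (numbers still invented):

    from math import comb, log2

    def microstates(quanta, balls):
        # ways to distribute identical quanta over distinguishable balls
        return comb(quanta + balls - 1, balls - 1)

    # Three snapshots of the box, in order of increasing entropy:
    stages = {
        "two known balls hold all the energy": 1,
        "each half has its own average": 2 * microstates(10, 10) * microstates(40, 10),
        "one average for the whole box": microstates(50, 20),
    }
    for label, w in stages.items():
        print(f"{label}: {log2(w):5.1f} bits to pin down the microstate")
    # The bit count climbs at every stage: higher entropy, less knowledge.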
None of this means that 1010 is a low-entropy arrangement of ones and zeros just because it repeats a pattern (and therefore probably contains some good information). It merely means that if I had four binary digits that were changing randomly, then after some time I would most likely end up with some arrangement of two 1s and two 0s (which includes 1010 as well as 0101, 0011, 0110, 1100, and 1001). That means I now have to guess between 6 options, whereas if the digits were all 1s there would be exactly one possibility. So when you are trying to get rid of information on a hard drive, a repeated pattern of 10 is just as worthless to the NSA as all 1s followed by all 0s, which is just as worthless as a random assortment of 1s and 0s that is half 1s and half 0s. The information isn't lost in the actual arrangement; it is lost in the number of possible arrangements. If there are random 1s and 0s left on your hard drive but there are more 1s than 0s, that makes the NSA's job easier, because it limits the possible combinations of 1s and 0s that previously contained your actual information.
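A quick count makes the point (Python again, my sketch):

    from math import comb

    # How many bit patterns have exactly k ones out of n bits?
    for n, k in [(4, 2), (16, 8), (16, 12)]:
        print(f"{n} bits with {k} ones: {comb(n, k)} possible arrangements")
    # 4 bits with 2 ones: 6      (1100, 1010, 1001, 0110, 0101, 0011)
    # 16 bits with 8 ones: 12870 (the most patterns to search through)
    # 16 bits with 12 ones: 1820 (a skew toward 1s narrows the search)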
But I digress; now I will get to the whole point of this explanation. Many people say that evolution violates the second law of thermodynamics since our bodies are highly ordered organisms. That would be kind of like saying that because you are using your computer right now, you are cheating the laws of physics. To get this point across I'm going to talk in terms of mass-energy. Special relativity tells us that mass can be converted into energy and vice versa (E = mc²). This happens all the time, not just in fusion in the sun and fission in power plants but in chemical reactions. You see, our bodies are highly ordered mass-energy systems, but in order to build our bodies (and keep them running) we actually cause more disorder. For example, we eat things which are highly ordered (plant and animal matter, or even synthesized chemicals) and we use up the energy stored in the chemical bonds of this "fuel". Over time we get hungry again. That is because we used that energy, and a good portion of it left our bodies as heat: partially radiated away as electromagnetic radiation, and partially lost to convection in the air or conduction into your clothes/blankets/couch/whatever you are touching. The energy that left you is now "simpler", or "less ordered". It is just like our nuclear fuels, which go from high energy density ("high order") to lower energy density ("low order"). It's the same with fossil fuels and basically anything that can be rearranged into a lower energy state. This is what some theorize will be the "heat death of the universe", when the entire universe reaches thermodynamic equilibrium: all the highly ordered mass-energy of the universe could eventually become the same low-ordered mass-energy in some form, as if everything in the universe evaporated into radio waves. While no one knows how the universe will end (if it ever "ends"), this goes to show that evolution has no quarrel with the second law. Yes, it creates "higher order" out of "lower order", but more "high order" is destroyed than created along the way, so the books still balance with a net increase in entropy.

In conclusion: entropy isn't what most people think it is; it is a very abstract thing that is hard to understand. The one thing I've learned about physics: if you think you know what you're talking about, you probably don't.
