Natural Science #10: Entropy

Back in the 19th century, at the height of the industrial revolution, engineers were pondering questions about the efficiency of steam engines. From those questions, arose the science of thermodynamics. You might be familiar with the laws they came up with:

  1. Energy cannot be created or destroyed in an isolated system.
  2. The entropy of any isolated system always increases.
  3. The entropy of a system approaches a constant value as the temperature approaches absolute zero.

Today’s post will explore that second law of thermodynamics, which also happens to be the most important for understanding time and the evolution of the universe.

Cracking Eggs

Imagine you drop an egg on the floor; as you would expect, the egg shatters into a messy pool of yolk and shell. Now imagine that pool spontaneously recombines into its original structure and flies back into your hand. That would be unexpected, but nothing in the laws of physics prohibits it.

Now imagine looking at the same event at the microscopic level. When the egg shatters, atoms flow in one direction. When the egg reassembles, they simply flow in the other direction. At that scale it would look perfectly normal. But at the macroscopic scale, it would still look like witchcraft.

Consider another example: imagine a sand castle in the desert. Every grain of sand is arranged in a delicate pattern to form the structure of a medieval fortress. Over the next few hours, the wind blows the castle into an uninteresting pile of sand. There is no law of physics that says the wind couldn't have blown an uninteresting pile of sand into an intricate medieval fortress.

Why, then, do eggs stay cracked, and why don't medieval castles spontaneously form in the desert? Why does something perfectly reasonable at the microscopic scale seem unreasonable at the macroscopic scale? To understand this, we need to understand the often misrepresented idea of entropy.


At some point in your life, someone might have tried to sell you on the idea that entropy is a measure of disorder or randomness in a system. A deck of cards arranged from smallest to largest, the story goes, has less entropy than a shuffled deck. In reality, both states are just two arbitrary arrangements out of the 52! equally possible arrangements.

What entropy is really concerned with is the number of possible arrangements of a system that maintain the structure of that system. In the deck of cards, that number of arrangements, and thus the entropy, stays constant whether the deck is sorted or shuffled. Now consider the sand castle again. How many ways can we rearrange every grain of sand and still maintain the structure of the castle? Probably a lot, but consider the pile of sand the wind turns it into. The number of ways you can rearrange those grains and maintain that structure is vastly greater than for the castle.

The same is true for the egg. There are more ways to arrange egg atoms in a blob and retain the blob shape than there are to arrange egg atoms in a shell and maintain the shell shape; move one shell atom to the inside of the egg and the structure falls apart. The blob has high entropy (many arrangements) and the whole egg has low entropy (fewer arrangements).

Now we have a working definition of entropy, but we still haven’t answered why broken eggs don’t spontaneously rearrange themselves.


Let's get more sciencey and consider a solid object made up of 3 atoms, to which we will add heat. When you heat a solid, you are just adding energy to it. At the quantum level, energy is distributed in discrete chunks, or packets. Let's say we add 4 packets of energy to our solid. How many ways can those energy packets be distributed across our 3 atoms? The first atom could get all four packets, it could get none, it could get one, and so on for the rest. By the stars-and-bars formula, distributing n packets across r atoms gives (n+r-1) C (r-1) arrangements, which for n = 4 and r = 3 works out to 15.
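The stars-and-bars count above is easy to sketch in Python (the helper name `arrangements` is mine, not standard terminology):

```python
from math import comb

def arrangements(packets, atoms):
    """Ways to distribute indistinguishable energy packets among
    distinguishable atoms (stars and bars): C(n + r - 1, r - 1)."""
    return comb(packets + atoms - 1, atoms - 1)

print(arrangements(4, 3))  # → 15
```

Each arrangement corresponds to one way of placing 2 "bars" among 4 "stars", splitting the packets into 3 groups, one per atom.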

Now let's place a second solid, also made up of 3 atoms, next to the original solid we heated. Energy can be freely exchanged between the two solids. If energy flows randomly between the two solids, and the 6 atoms making them up, how many arrangements are there now? There are 126, but the really interesting thing is how those arrangements are distributed:

Ways to arrange 4 energy packets between two 3-atom solids:

# of arrangements where solid A has 0 energy packets: 15
# of arrangements where solid A has 1 energy packet: 30
# of arrangements where solid A has 2 energy packets: 36
# of arrangements where solid A has 3 energy packets: 30
# of arrangements where solid A has 4 energy packets: 15

The most frequent distribution is the one where the energy is split evenly between the two solids. If all four packets sit in solid A, only 15 of the 126 arrangements apply (a lower-entropy state). If the packets are shared between solid A and solid B, 96 arrangements apply (a higher-entropy state). And probability demands that the higher-entropy arrangements happen more often.
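The table above can be reproduced by counting the arrangements within each solid for every possible split and multiplying; a quick sketch in Python, reusing the stars-and-bars count:

```python
from math import comb

def arrangements(packets, atoms):
    # Stars and bars: C(n + r - 1, r - 1)
    return comb(packets + atoms - 1, atoms - 1)

TOTAL_PACKETS = 4
counts = {}
for k in range(TOTAL_PACKETS + 1):
    # k packets in solid A (3 atoms), the rest in solid B (3 atoms)
    counts[k] = arrangements(k, 3) * arrangements(TOTAL_PACKETS - k, 3)

print(counts)                # → {0: 15, 1: 30, 2: 36, 3: 30, 4: 15}
print(sum(counts.values()))  # → 126
```

Note that the five split counts sum to 126, the same total we get from distributing 4 packets across all 6 atoms at once.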

However, at this small scale, there is still a 15/126 chance that the system will randomly settle into the lower-entropy state with all the energy in solid A. In our egg example, this would be like looking at the atoms under a microscope and seeing them flow backwards. Entropy doesn't always increase at the microscopic scale, and experiments have backed this up. But when we scale up the example, the power of probability becomes overwhelming.

Consider two objects with more atoms, say 50 in each. If we share 50 packets of energy between them, the odds of finding all the energy in one solid are about 1 in 133 billion. As you can imagine, an egg has vastly more than 50 atoms.

Law of statistics

So there you have it: the egg stays broken not because of any law of physics that says it must, but because of the sheer power of statistics, which says there are many more ways for the egg to be spread out than for it to be contained.

In other words, there are more ways for energy to spread around than there are for it to stay confined. It is perfectly possible for a system to move in the direction of decreasing entropy, but at large scales the odds of it happening are so small that it can be considered impossible. It's just math.

Heat death of the Universe

Taking this idea and extrapolating it into the future, all low-entropy states (think stars) will converge toward a high-entropy state (think void). Life is sustained by taking low-entropy energy from the sun and converting it into work and heat. When peak entropy arrives, that process will end and everything will cease to be. Cosmologists estimate this will happen in about 10^100 years, so we have time to cope.


We set out to explain why a cracked egg rearranging itself seems possible at the micro scale but impossible at the macro scale. The answer is that there is no law saying it can't happen, but as objects get larger, there are overwhelmingly more ways to spread energy out than to keep it contained. There are more ways to arrange the atoms of a broken egg than there are to arrange the atoms of a whole egg. Probability wins out.



Author: David Shahrestani

"I have the strength of a bear, that has the strength of TWO bears."

