Another Look at Entropy

As of 23 May 2022 this website is archived and will receive no further updates. It was produced by the Winton programme for the public understanding of risk, based in the Statistical Laboratory at the University of Cambridge. The aim was to help improve the way that uncertainty and risk are discussed in society, and to show how probability and statistics can be both useful and entertaining.

Many of the animations were produced using Flash and will no longer work.

Entropy is a term that draws both fear and reverence from the greatest physicists and mathematicians. How do you describe it? What does it even mean? Who in their right mind would want to quantify a phantom concept that's impossible to see or touch?

As kids we're usually stunned when we learn that Santa Claus doesn't really exist or that you can't actually bounce around on fluffy white clouds. In college, scientists and engineers deal with a similar epiphany when they learn that entropy is the reason that reactions happen, the reason ice melts, the reason that people have to rake up leaves every autumn. Entropy is even said to be responsible for human life.

Every fall when the leaves change colour and spill from the trees, they do so randomly. Leaves don't fall into neat piles or stack nicely into towers; they just fall. Similarly, when you drop a deck of cards onto the floor they don't arrange themselves by suit or by number (though that would be a nifty trick). You can't throw a broken egg at the wall and cause it to come back together into its original form, just like your office desk is bound to get messier and messier if you never clean it up. So what gives? Why does your desk always get dirty?

Entropy is both a tendency for systems to move towards disorder and a quantification of that disorder. The reason a deck of cards doesn't reorganize itself when you drop it is that it's naturally easier for it to remain unordered. Think about the energy it takes to arrange cards by value and suit: you've got to look at a card, compare it to others, classify it and then arrange it. You've got to repeat this process over and over until all 52 of the cards have been compared and arranged, and that demands a lot of energy. In fact, the cards fall to the floor in the first place because they would naturally rather go with gravity than oppose it. Imagine if you dropped your cards and they went flying at the ceiling -- that just doesn't make any sense.

A long time ago, scientists (or one in particular, Rudolf Clausius) began to recognize this natural tendency towards disorder and sought to quantify it, sparking the idea of entropy. Entropy explained why heat flows from warm objects to cold ones. It explained why balloons pop when filled with too much air (at least qualitatively), and it paved the way towards a more sophisticated understanding of everything from balancing a needle upright to describing why proteins fold in the very specific ways they do. Entropy gave all of science's processes a veritable direction.

The modern understanding of entropy is twofold. On one hand, it's a macroscopic idea that describes things like falling leaves. On the microscopic level, however, entropy is highly statistical and is rooted in the principles of uncertainty. Gaseous substances, for instance, have atoms or molecules that zoom around freely in whatever space they occupy. If you could see gas in a box, you'd observe tiny atoms bouncing erratically from wall to wall, occasionally colliding with each other and changing direction accordingly. If you record the temperature and pressure of this system, you have also effectively measured its macroscopic entropy: if the gas's temperature is very high, its molecules are zooming around so chaotically that its entropy, which quantifies this chaos, is extremely high as well.

Our single box of gas might contain more than five hundred million tiny particles, though. So while it's useful that we can say something about the average of the gas's observable properties, there is much for scientists to gain from asking how the microscopic states of the molecules are connected to these macroscopic observations. This bridge calls for a somewhat finer but more absolute definition of entropy, from which all other mathematical expressions involving the term can be derived.

Depending on the type of gas you have whizzing around in your box, the energy it contains can be distributed in different ways. For example, lots of the molecules could be spinning rapidly but moving at a very slow speed. On the other hand, the molecules could be vibrating intensely and moving faster than an airplane, with no rotational momentum at all. Statistically, this variance in the distribution of energy in our gas is captured in the concept of a microstate. In one microstate most of the energy will be rotational, while in another it might be all in the velocity of the molecules. Thermodynamics usually assumes that the gas is equally likely to be in any one of these microstates, the so-called equal a priori probability postulate. This leads to the following equation:

$$S = k_B \ln \Omega$$

where $S$, the statistical entropy of the system, is equal to the Boltzmann constant (another story for another time) times the natural logarithm of the number of microstates in the system, $\Omega$. As the number of microstates increases, the information we know about the energy distribution decreases, meaning that the system's entropy, its chaos, skyrockets. This is the most fundamental and absolute definition of entropy.
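As a quick numerical sketch (my illustration, not part of the original article), the formula can be evaluated directly; the value of $k_B$ below is the exact figure fixed in the 2019 SI redefinition:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def boltzmann_entropy(omega):
    """Statistical entropy S = k_B * ln(Omega) for Omega equally likely microstates."""
    return k_B * math.log(omega)

# One microstate means perfect knowledge of the system, so zero entropy:
print(boltzmann_entropy(1))        # 0.0
# Entropy grows only logarithmically with the number of microstates:
print(boltzmann_entropy(10**6))    # ~1.9e-22 J/K
```

The logarithm is what makes entropy additive: combining two independent systems multiplies their microstate counts but only adds their entropies.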

When you shuffle a deck of cards you are basically maximizing the entropy of that system, since you know absolutely nothing about the order of the numbers or the suits. Each possibility is a microstate in this case, and each ordering of the 52 cards has an equal probability of occurring. When you arrange the deck by number and suit, you lower the entropy of the system by increasing the amount of information you know about it.
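To put a number on the shuffled deck (an illustrative calculation, not from the original article): with every ordering equally likely, the entropy is just the logarithm of the number of orderings, here expressed in bits:

```python
import math

# A 52-card deck has 52! possible orderings; a fair shuffle makes
# each of them equally likely, so each ordering is one "microstate".
omega = math.factorial(52)

# Information entropy in bits (log base 2 of the number of microstates):
bits = math.log2(omega)
print(f"A shuffled deck carries about {bits:.1f} bits of uncertainty")  # ~225.6
```

Sorting the deck by number and suit collapses those 52! possibilities down to one, driving the entropy to zero.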

The second law of thermodynamics qualitatively expresses nature's tendency to move towards greater disorder. Ice cubes don't just form from a glass of hot water; broken eggs don't spontaneously regenerate into whole eggs; your office desk isn't going to clean itself. The mathematical idea of entropy can be extracted from this principle. If you put an ice cube into a piping hot bowl of water, what happens? The ice melts, of course. But in a deeper sense, ice is a very ordered solid, which means that as a whole it has very low entropy. By absorbing heat from the hot water, the molecules inside the ice cube break loose and are able to move more freely as a liquid - their randomness increases, and so does their entropy. Heat flows from hot objects to colder ones because the combined system moves towards equilibrium, its state of maximum entropy. All things in nature tend towards higher entropy, which suggests that the entropy of the universe must also be continuously increasing.
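The ice-cube picture can be made quantitative with the classical relation $\Delta S = Q/T$ for heat absorbed at constant temperature. The following is a hypothetical worked example, using standard textbook values for water:

```python
# Entropy gained when 10 g of ice melts at 0 degrees C, using dS = Q/T.
# The latent heat of fusion of water (~334 J/g) is a standard textbook value.
latent_heat_fusion = 334.0   # J per gram
mass = 10.0                  # grams of ice
T_melt = 273.15              # melting point in kelvin

Q = mass * latent_heat_fusion    # heat absorbed from the hot water, in joules
delta_S = Q / T_melt             # entropy gained by the ice, in J/K
print(f"Entropy gained: {delta_S:.2f} J/K")  # ~12.23 J/K
```

The hot water loses the same heat $Q$, but at a higher temperature, so it loses less entropy than the ice gains: the total entropy of the bowl goes up, just as the second law demands.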

Statistical thermodynamics attempts to link micro-scale states to macroscopic observations and is ultimately able to deduce a statistical description of entropy from this marriage. The guiding idea here is that molecular motion is responsible for major thermodynamic phenomena. When a substance heats up, its molecules move faster and faster, and that movement is all the more randomized at high speeds. When a substance cools down, its molecules move very slowly and are tightly constrained.

Entropy is intimately related to temperature for this reason - and temperature is one of the most basic thermodynamic properties. As it turns out, pressure - a very important thermodynamic concept as well - is also a result of molecular motion. A statistical approach to thermodynamics allows us to extract very delicate information about systems, especially ones that involve chemicals. By using statistical thermodynamics, for instance, one can more accurately model chemical equilibrium in systems that may have multiple competing reactions. Entropy has guided biologists to formulate a thermodynamic model of protein folding, which shows that when a protein collapses into its characteristic globular form, its energy goes down. This explains why a seemingly unordered organic molecule can actually become more ordered by collapsing, which at first glance appears to violate the idea of entropy; the balance is restored by the entropy gained by the surrounding water molecules. Just like a pin cannot stay balanced on its needlepoint nose, proteins cannot stay unfolded in the aqueous solutions of our bodies without eventually forming folded structures.

The consequences of this statistical approach to thermodynamics are profound in that they defy the deterministic ideals of physics. Classically, if you throw a baseball into a field you have a pretty good idea of what's going to happen to it: the ball will follow a trajectory that's well understood and precisely modeled. The shape of that trajectory is determined by various inputs, like how hard you throw the object and at what angle it leaves your hand. A deterministic view of the world is one that relies on the principal observation that physical systems behave the same way each time you give them identical inputs. Throw a ball at the same angle and with the same force, and it's very likely that your result will be the same each time you execute the action. Put a brick wall in front of a fast-moving car and... well, you get the picture.

When Newton developed his groundbreaking laws of motion, he also introduced a new way of thinking about the world. This deterministic ideology permeated science for a long time. Apples fall from trees because of gravity, and for that matter, all objects fall to the ground because resistance to gravity is unnatural. Nobody jumps off a cliff and flies up to the clouds (unless you're watching the X-Files, maybe). This idea of a determined set of rules governing our actions eventually coalesced into an entire philosophy whose tenets follow similar guidelines.

The development and proliferation of quantum mechanics challenged the deterministic model of classical physics. It suggested that instead of being determined, the outcome produced by a physical system was actually biased and changed by outside measurement. It suggested that at the microscopic level, physical systems have variable outcomes given the same set of inputs, and it also argued that this variability could describe macroscopic phenomena. This is a probabilistic way of thinking about the world, and it pushes the idea that a system exists in many states simultaneously until it is measured.

This argument was initially dismissed as absurd because it introduced an inherent uncertainty to micro-systems. It told physicists: 'Hey, you can never know a particle's position and its momentum at the same time!' Over time people began to realize that at the microscopic level, stochastic models explained things like superconductivity and even nuclear reactions in ways that classical mechanics simply could not. Eventually quantum mechanics came to be considered an extremely important branch of physics, and scientists and philosophers alike have learned to accept the uncertainty in a probabilistic view of macroscopic actions.

So what does this say about the person who thinks the world's events are determined? Unfortunately, the deterministic and probabilistic perspectives of the universe complement each other in about as many ways as they diverge, so it's hard to say who is right. At the very least, quantum mechanics offers an incredible insight into probabilistic outcomes that has changed the way we think about constraints and uncertainty in stochastic systems.

Of all the mysterious road bumps in the long history of entropy's development, the only thing that happens to be certain is its uncertainty. Knowing how entropy affects you can help you appreciate and see the world from a different perspective - one that isn't afraid to dream big and small, that isn't afraid to challenge the norms.

One that isn't afraid to innovate.

Author: Jeremy Fordham is an engineer who enjoys and encourages discussion at the boundaries of many different disciplines. He is a proponent of renewable energy and distance learning, and contributes as a writer to resources promoting Ph.D programs.


Leaves don't fall into neat piles or stack nicely into towers, they just fall. Similarly, when you drop a deck of cards onto the floor they don't arrange themselves by suit or by number.
    Yes they do :-) it's just very, very unlikely!

A difficulty I have with entropy is that it seems to refer to the amount of disorder in a system. But isn't disorder a human construct? My desk may appear disordered to humans because things aren't in categorized piles. But is it possible that some other non-human life form might find it very ordered? Or does the amount of disorder depend on the existence and judgement of a life form at all?

MaynardM: There is an absolute concept of what "Order" is. The Order (with a capital O) of the system is related to how compressed the information describing the system is. So if all your papers are in one neat pile, then the description is "All papers are in one neat pile". But if they are all over the place, then the description becomes "One paper is to the right, another to the left, then another one on top of the first one..." and so forth. The longer the description, the higher the entropy. That should give you a first insight, although to be precise, the answer should be much longer. I can give it to you. I am doing PhD research on the topic; I would be glad to have a chat if you want.
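The description-length idea in this reply can be illustrated with a small sketch (my own illustration, not the commenter's): a general-purpose compressor gives a rough proxy for how long the shortest "description" of a configuration is.

```python
import random
import zlib

random.seed(0)

# "All papers are in one neat pile": a highly ordered configuration.
ordered = b"paper " * 200

# "One paper to the right, another to the left...": a scattered one,
# modeled here as 1200 random bytes (same length as the ordered case).
scattered = bytes(random.randrange(256) for _ in range(1200))

# The compressed size approximates the length of the shortest description:
print(len(zlib.compress(ordered)))    # small: the pattern compresses well
print(len(zlib.compress(scattered)))  # large: random data barely compresses
```

The ordered string compresses to a handful of bytes while the random one stays essentially full size, mirroring the short versus long descriptions in the comment above.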

Looking at the probability and judgement of entropy: if I have organized my home and removed the appearance of clutter in a confined space, has the chaos been removed? My home looks nice and neat to the observer; however, when I attempt to locate something (say, a piece of paper) I won't know where to look. If, however, I left everything where it was, I could find it quickly. Does this judgement relate to true entropy, and is it subject to time direction, moving from clutter to order? Do we have the power to control order and thereby influence probability?

I wrote an essay applying entropy to the game of 20 questions. For interested readers, it is here:

Two questions/problems arise out of this: why is the system of activity which is the foundation of matter visible to us, and secondly, why does this activity take ordered forms? To answer the first question, we have to postulate a principle of opposition, and since this is not immediately discoverable by means of science, it becomes probable that it operates in a medium which, while linked with matter, is not yet within the scope of scientific discovery. We know this medium as the atomic activity upon which matter is based. In answer to the second question, we assert that this order is imposed upon chaotic strife by the Will and Purpose of God, who is invoked as the Principle of Order and Harmony and described in human terms as Creative Love. For there is no justification for an a priori assumption of the necessity for the rhythm and order found in nature. Why order, unless the guiding principle of the universe is of this nature itself? It is time that we realise that Creation and Evolution go hand in hand. One is the guiding principle of form (matter); the other is the guiding principle of Life. As the Higgs boson looms ever nearer, it is becoming increasingly difficult not to include the missing X factor in all that remains a mystery in this amazing world of ours.

Assuming Plato's world of forms is real, and Godel's theorem of incompleteness proves it... Perhaps time = complexity = entropy = a much harder-to-see order, but more ordered. Also, the conventional act of measuring = the act of taking from that world and bringing it into this. A last thought: Mathematics is a subset of algorithms and networks - not vice versa.

How do you define, or measure, or even concept a harder-to-see-order without changing the nature of the order you're trying to define in the first place? I think that to assume mathematics is a subset of algorithms is to also assume that the Form of life is somehow an algorithmic blueprint, which is a big leap. Still, a very interesting thought experiment. :)

That ever more complex life forms have evolved over hundreds of millions of years on Earth would appear to be contrary to the Second Law of Thermodynamics, which states that entropy must increase over time. The same appears true of the increasing order of human society (assuming our 'leaders' don't start an all-out war against Syria, Iran and Russia soon). Of course, the increase in order in the one small piece of the Solar System that comprises Earth and its satellites is minuscule in comparison to the accompanying increase in entropy resulting from the conversion of mass from protons and neutrons in the core of the Sun through nuclear fusion into radiated solar energy. How can these respective amounts of order and entropy be quantified?

The entropy of gas in a box may be very high, but with respect to the solar system it is very low. Sheep-dogs often decrease the entropy of sheep, by taking them off hills and putting them in to pens. So entropy is relative to constraints, and so is the second law. To understand entropy fully, we need to understand those constraints. For example, entropy is sometimes defined in terms of probability functions, which we often treat as absolute. But actually they are (normally) conditional, and we may be able to change those conditions. So the second law is not the last word, and nor is Bayes' rule. (IMHO)