Another Look at Entropy
Entropy is a term that draws both fear and reverence from the greatest physicists and mathematicians. How do you describe it? What does it even mean? Who in their right mind would want to quantify a phantom concept that's impossible to see or touch?
As kids we're usually stunned when we learn that Santa Claus doesn't really exist or that you can't actually bounce around on fluffy white clouds. In college, scientists and engineers deal with a similar epiphany when they learn that entropy is the reason that reactions happen, the reason ice melts, the reason that people have to rake up leaves every autumn. Entropy is even said to be responsible for human life.
Every fall when the leaves change colour and spill from the trees, they do so randomly. Leaves don't fall into neat piles or stack nicely into towers, they just fall. Similarly, when you drop a deck of cards onto the floor they don't arrange themselves by suit or by number (though that would be a nifty trick). You can't throw a broken egg at the wall and cause it to come back together into its original form, just like your office desk is bound to get messier and messier if you never clean it up. So what gives? Why does your desk always end up messier, you ask?
Entropy is a tendency for systems to move towards disorder, and a quantification of that disorder. The reason a deck of cards doesn't reorganize itself when you drop it is that it's naturally easier for it to remain unordered: there are astronomically more ways for 52 cards to land jumbled than to land sorted, so a random drop almost certainly produces a jumble. Think about the energy it takes to arrange cards by value and suit: you've got to look at a card, compare it to others, classify it and then arrange it. You've got to repeat this process over and over until all 52 of the cards have been compared and arranged, and that demands a lot of energy. The cards fall to the floor in the first place because they would naturally rather go with gravity than oppose it. Imagine if you dropped your cards and they went flying at the ceiling: that just doesn't make any sense.
A long time ago scientists (or one in particular, Rudolf Clausius) began to recognize that spontaneous processes run in a preferred direction and sought to quantify that tendency, sparking the idea of entropy. Entropy explained why heat flows from warm objects to cold ones. It explained why balloons pop when filled with too much air (at least qualitatively), and it paved the way towards a more sophisticated understanding of everything from balancing a needle upright to describing why proteins fold in the very specific ways they do. Entropy gave all of science's processes a veritable direction.
The modern understanding of entropy is twofold. On one hand, it's a macroscopic idea that describes things like falling leaves. On the microscopic level, however, entropy is highly statistical and is rooted in the principles of uncertainty. Gaseous substances, for instance, have atoms or molecules that zoom around freely in whatever space they occupy. If you could see gas in a box, you'd observe tiny atoms bouncing erratically from wall to wall, occasionally colliding with each other and changing direction accordingly. If you record the temperature and pressure of this system, you have effectively pinned down its macroscopic state, and with it its entropy: if the gas's temperature is very high, its molecules are zooming around so chaotically that its entropy, which quantifies this chaos, is extremely high as well.
Our single box of gas might contain something like $10^{23}$ tiny particles, though. So while it's useful that we can say something about the average of the gas's observable properties, there is a lot for scientists to gain from working out how the microscopic states of the molecules are connected to these macroscopic observations. This bridge calls for a somewhat finer and more fundamental definition of entropy, from which all other mathematical expressions involving the term can be derived.
Depending on the type of gas you have whizzing around in your box, the energy it contains can be distributed in different ways. For example, lots of the molecules could be spinning rapidly but moving at a very slow speed. On the other hand, the molecules could be vibrating intensely and moving faster than an airplane while hardly rotating at all. Statistically, this variety in how the gas's energy can be distributed is captured in the concept of a microstate. In one microstate most of the energy will be rotational, while in another it might be all in the translational motion of the molecules. Thermodynamics usually assumes that the gas is equally likely to be in any one of these microstates, the so-called postulate of equal a priori probabilities. This leads to the following equation:
$$S = k_B \ln \Omega$$
where $S$, the statistical entropy of the system, is equal to the Boltzmann constant $k_B$ (another story for another time) times the natural logarithm of the number of microstates of the system, $\Omega$. As the number of microstates increases, the information we have about how the energy is distributed decreases, meaning that the system's entropy, its chaos, skyrockets. This is the most fundamental and absolute definition of entropy.
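To make the counting concrete, here is a minimal Python sketch of a toy model that is not spelled out in the article: the gas's energy is imagined as identical quanta shared among a handful of molecules, and $\Omega$ is simply the number of ways those quanta can be distributed. The same total energy spread among more molecules admits more microstates, and therefore more entropy.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def count_microstates(quanta, molecules):
    """Ways to distribute `quanta` identical energy units among `molecules`
    (a toy, Einstein-solid-style count): C(quanta + molecules - 1, quanta)."""
    return math.comb(quanta + molecules - 1, quanta)

def boltzmann_entropy(omega):
    """Statistical entropy S = k_B * ln(Omega)."""
    return K_B * math.log(omega)

# The same 20 quanta of energy shared among more molecules -> more microstates -> more entropy.
for n_molecules in (2, 5, 10):
    omega = count_microstates(quanta=20, molecules=n_molecules)
    print(n_molecules, omega, boltzmann_entropy(omega))
```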
When you shuffle a deck of cards you are basically maximizing the entropy of that system, since you know absolutely nothing about the order of the numbers or the suits. Each possibility is a microstate in this case, and each ordering of the 52 cards has an equal probability of occurring. When you arrange the deck by number and suit, you lower the entropy of the system by increasing the amount of information you know about it.
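As a rough illustration of that card-deck picture, here is a short Python sketch that treats every one of the 52! orderings of a shuffled deck as an equally likely microstate and plugs the count into $S = k_B \ln \Omega$. The numbers are only meant to show the scale involved, not to claim that a deck of cards is a thermodynamic system.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

# A freshly shuffled deck: all 52! orderings are equally likely microstates.
# math.lgamma(53) returns ln(52!) without overflowing.
ln_omega_shuffled = math.lgamma(53)   # ~156.4
ln_omega_sorted = 0.0                 # exactly one fully sorted arrangement, so ln(1) = 0

print("shuffled:", K_B * ln_omega_shuffled)   # ~2.2e-21 J/K
print("sorted:  ", K_B * ln_omega_sorted)     # 0.0 J/K
print("missing information:", ln_omega_shuffled / math.log(2), "bits")  # ~225.6 bits
```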
The second law of thermodynamics qualitatively captures nature's tendency to move towards greater disorder. Ice cubes don't spontaneously form in a glass of hot water; broken eggs don't regenerate into whole eggs; your office desk isn't going to clean itself. The mathematical idea of entropy can be extracted from this principle. If you put an ice cube into a piping hot bowl of water, what happens? The ice melts, of course. But in a deeper sense, ice is a very ordered solid, which means that as a whole it has very low entropy. By absorbing heat from the hot water, the molecules inside the ice cube break loose and are able to move more freely as a liquid: their randomness increases, and so does their entropy. Heat flows from hot objects to colder ones as the combined system moves towards equilibrium. All things in nature tend towards higher entropy, which suggests that the entropy of the universe must also be continuously increasing.
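To see the arithmetic behind that melting ice cube, here is a hedged back-of-the-envelope sketch in Python using the classical relation $\Delta S = Q/T$ for a small parcel of heat; the numbers are invented purely for illustration. Because the ice is colder, it gains more entropy per joule than the hot water loses, so the total entropy goes up.

```python
# Illustrative (made-up) numbers for a small parcel of heat Q flowing
# from hot water into melting ice, using Delta S = Q / T.
Q = 1000.0      # joules transferred (assumed)
T_HOT = 350.0   # hot water, in kelvin (assumed, about 77 C)
T_COLD = 273.0  # melting ice, in kelvin

delta_S_hot = -Q / T_HOT     # the hot water loses some entropy
delta_S_cold = Q / T_COLD    # the colder ice gains more entropy per joule

print("total entropy change:", delta_S_hot + delta_S_cold, "J/K")  # positive, ~0.81 J/K
```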
Statistical thermodynamics attempts to link micro-scale states to macroscopic observations, and from this marriage it is ultimately able to deduce a statistical description of entropy. The guiding idea is that molecular motion is responsible for the major thermodynamic phenomena. When a substance heats up, its molecules move faster and faster, and that movement is all the more randomized at high speeds. When a substance cools down, its molecules move very slowly and are much more tightly constrained.
Entropy is intimately related to temperature for this reason, and temperature is one of the most basic thermodynamic properties. As it turns out, pressure, another very important thermodynamic quantity, is also a result of molecular motion. A statistical approach to thermodynamics allows us to work out very delicate information about systems, especially ones that involve chemicals. By using statistical thermodynamics, for instance, one can more accurately model chemical equilibrium in systems that may have multiple competing reactions. Entropy has guided biologists to formulate a thermodynamic model of protein folding, which shows that when a protein collapses into its characteristic globular form, its energy goes down. This explains why a seemingly unordered organic molecule can become more ordered by collapsing, something that at first glance seems to run against the drive towards disorder. Just as a pin cannot stay balanced on its needlepoint nose, proteins cannot stay unfolded in the aqueous solutions of our bodies without eventually forming folded structures.
The consequences of this statistical approach to thermodynamics are profound in that they defy the deterministic ideals of physics. Classically, if you throw a baseball into a field you have a pretty good idea of what's going to happen to it: the ball will follow a trajectory that's well understood and precisely modeled. The shape of that trajectory is determined by various inputs, like how hard you throw the ball and at what angle it leaves your hand. A deterministic view of the world is one that relies on the principal observation that physical systems behave the same way each time you give them identical inputs. Throw a ball at the same angle and with the same force and it's very likely that your result will be the same each time you execute the action. Put a brick wall in front of a fast-moving car and... well, you get the picture.
When Newton developed his groundbreaking laws of motion, he also introduced a new way of thinking about the world. This deterministic ideology permeated science for a long time. Apples fall from trees because of gravity, and for that matter, all objects fall to the ground because resisting gravity is unnatural. Nobody jumps off a cliff and flies up to the clouds (unless you're watching the X-Files, maybe). This idea of a fixed set of rules that govern our actions eventually coalesced into an entire philosophy built on the same principles.
The development and proliferation of quantum mechanics challenged the deterministic model of classical physics. It suggested that the outcome of a physical process is not fully determined, and can even be altered by the act of measurement. It suggested that at the microscopic level, physical systems can have variable outcomes given the same set of inputs, and it also argued that this variability could describe macroscopic phenomena. This is a probabilistic way of thinking about the world, one in which a system exists in many states simultaneously until it is measured.
This argument was initially dismissed as absurd because it introduced an inherent uncertainty into micro-systems. It told physicists: 'Hey, you can never know a particle's position and its momentum exactly at the same time!' Over time people began to realize that at the microscopic level, stochastic models explained things like superconductivity and even nuclear reactions in ways that classical mechanics simply could not. Eventually quantum mechanics came to be considered an extremely important branch of physics, and scientists and philosophers alike have learned to accept the uncertainty in a probabilistic view of macroscopic actions.
So what does this say about the person who thinks the world's events are determined? Unfortunately, the deterministic and probabilistic perspectives of the universe complement each other in about as many ways as they diverge, so it's hard to say who is right. At the very least, quantum mechanics offers an incredible insight into probabilistic outcomes that has changed the way we think about constraints and uncertainty in stochastic systems.
Of all the mysterious road bumps in the long history of entropy's development, the only thing that happens to be certain is its uncertainty. Knowing how entropy affects you can help you appreciate and see the world from a different perspective - one that isn't afraid to dream big and small, that isn't afraid to challenge the norms.
One that isn't afraid to innovate.
Author: Jeremy Fordham is an engineer who enjoys and encourages discussion at the boundaries of many different disciplines. He is a proponent of renewable energy and distance learning, and contributes as a writer to resources promoting Ph.D. programs.