Tuesday, December 6, 2016

Entropy—the sins of society passed down to science

Arthur Eddington*, physicist (mainly astrophysics), mathematician, and philosopher of science, said:
If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
The second law of thermodynamics, a forbidding phrase to say the least, is better known to this world as entropy, making “collapse in deepest humiliation” even more appropriate. For that is how the world views entropy: the relentless, inexorable flow from order to chaos, the slow but steady dissipation of energy, and the inevitability of final collapse. We believe the simple truth that entropy forces us to accept: that, in the long run, everything winds down to dissolution, decay, and doom.
“This is the way the world ends
Not with a bang but a whimper.”

But, hey, that’s the law. Just look around. Hot pans cool down; raging rivers flow to calm oceans; our winter fuel bills soar as we practically feel the heat dissipate through entropic cracks; stars burn out; windows break, sending useful order into hopelessly scattered fragments of chaos. But never the reverse.

Of course, we also see pounding rain and swirling wind arise out of calm oceans and skies, cold matter collapse into itself and ignite violent explosions, exotically detailed flowers spring from inert, sluggish mud, and trees spread delicate leaves in greater number than those that fell chaotically the year before; and we frantically burn fossil fuels whose energy has been slowly and steadily concentrating over the millennia.

What’s happening? Isn’t there also the law of the conservation of energy? Aren’t there a lot of contradictions here? Well, I’m a big believer in the law of contradictions (which I hope, some day, to formulate), but, as you might suspect, when people begin to weigh dissolution, decay, and doom against entropy, a lot of crazy facts and beliefs spew forth. And it turns out that just about everything we think we know about entropy is wrong. It’s almost as if thinking about entropy has dissipated our brains to listless mush. There is so much wrong with our notions of entropy that it may be the poster child for the creeping mistrust of science.

Entropy is, indeed, fairly simple and useful, but it may be more helpful to first report its misconceptions. I will quote extensively from Frank L. Lambert, Professor Emeritus of Chemistry at Occidental College, from his article "Entropy Is Simple — If We Avoid the Briar Patches!".

•Well-meaning muddle
Unfortunately, the ideas of entropy and the second law have been almost hopelessly muddled by well-meaning but scientifically naïve philosophers and writers of both fiction and non-fiction.
•Information Entropy
An additional source of confusion to anyone outside of chemistry or physics is due to brilliant but thoughtless mathematicians. They decided that it would be amusing to name a very important new mathematical function in communications "entropy" because "no one knows what [thermodynamic] entropy really is, so in a debate you will always have the advantage". (That quote is from John von Neumann, speaking to Claude Shannon; Sci. Am. 1971, 225, 180.) For the past half-century this has turned out to be a cruel practical joke imposed on generations of scientists and non-scientists, because many authors have completely mixed up information "entropy" and thermodynamic entropy. They are not the same!
•Entropy is NOT deterministic
When we open the oven door and heat dissipates, it does so not because it is required to, but because it is more probable. Air molecules are constantly moving. When the oven door opens, it’s just more probable that some move out of the oven into the room. Consider 3 distinct air molecules on a 2x2 chess board, one molecule to a square. There are only 4 × 3 × 2, or 24, different possible arrangements of the molecules in that small space.


If we increase the size of the chess board to 4x4, the number of possible arrangements rises dramatically. By my count, 16 × 15 × 14, or 3,360.


Even after only slightly increasing the 2-dimensional space, it is extremely unlikely (less than 1%) that you would find the 3 molecules in one of the 24 original arrangements (see the short sketch after this list). That is why you will never find the hot air molecules remaining confined to the oven when the door is opened. In the far bigger space of a real kitchen, it is so unlikely as to amount to “will never happen”.

•Isolated Systems
Most misleading to understanding entropy is discussing it in isolated systems. These theoretical systems are not only useless to a beginner, but what happens in them can profoundly confuse anyone trying to understand entropy and the second law in the real world. We humans live in an open system of earth, sun, and outer space. We encounter the second law and entropy within that open system. Therefore, the energy-entropy relationships that are useful for us to examine are the ones in that real system.
•Entropy is NOT disorder!
This confusion about disorder and entropy comes from an 1898 statement by a brilliant theoretical physicist, Ludwig Boltzmann, whose mathematical contributions to thermodynamics and entropy are still totally valid. However, his attempt to interpret entropy in simple language was incorrect, because an adequate understanding of molecular behavior came only after his death in 1906. Order/disorder became increasingly obsolete as a description of entropy and the second law once the existence of quantized energy levels in physics and chemistry was generally accepted after the mid-1920s.
Although order/disorder is still present in some elementary chemistry texts as a gimmick for guessing about entropy changes (and useful to experts in some areas of thermodynamics), it is both misleading and an anachronism for beginners in chemistry. It has been deleted from most first-year university chemistry textbooks in the US.
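To make the counting above concrete, here is a minimal sketch in Python. The board sizes and the model of distinguishable molecules, one to a square, come straight from the example; the function name and printout are just illustrative.

```python
from math import perm  # perm(n, k): ordered arrangements of k items chosen from n

def count_arrangements(n_squares, n_molecules=3):
    # Distinguishable molecules, one per square: n * (n-1) * (n-2) ... choices.
    return perm(n_squares, n_molecules)

small = count_arrangements(4)    # 2x2 board:  4 * 3 * 2  = 24
large = count_arrangements(16)   # 4x4 board: 16 * 15 * 14 = 3360

# Chance that all 3 molecules happen to sit in the original four squares
# of the bigger board -- already well under 1%.
print(small, large, f"{small / large:.2%}")   # 24 3360 0.71%
```

Scale the same calculation up to the trillions upon trillions of molecules and the enormous “board” of a real kitchen, and the odds of the hot molecules staying put in the oven become, for all practical purposes, zero.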
Now that we have cleared the air, we can say what entropy is.
The second law of thermodynamics says that energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so. Entropy is the quantitative measure of that kind of spontaneous process: how much energy has flowed from being localized to becoming more widely spread out (at a specific temperature).
It is the measure of hot pans cooling—how much energy is being transferred. It is a measure of cream circulating in coffee—how widely molecules disperse. Notice that included in the definition is the phrase “if it is not hindered from doing so”. That is why the molecules in our bodies don’t disperse and why our carbon atoms don’t explode when exposed to the air: chemical bonds and activation energies stand in the way. Without our second law of thermodynamics and its measure, entropy, life could not count on a steady flow of energy and, indeed, all would be chaos.
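As a rough illustration of that bookkeeping, here is a minimal sketch with made-up numbers, treating the pan and the room as constant-temperature reservoirs so that each entropy change is simply the heat transferred divided by the temperature at which it flows:

```python
# Hypothetical numbers: a hot pan at 450 K gives up 10,000 J of heat
# to a kitchen at 295 K.  Entropy change is q/T at each temperature.
Q = 10_000.0        # joules of heat transferred (assumed)
T_pan = 450.0       # kelvin (assumed)
T_room = 295.0      # kelvin (assumed)

dS_pan = -Q / T_pan       # pan loses energy:  about -22.2 J/K
dS_room = +Q / T_room     # room gains energy: about +33.9 J/K

print(f"pan {dS_pan:+.1f} J/K, room {dS_room:+.1f} J/K, "
      f"net {dS_pan + dS_room:+.1f} J/K")   # net is positive: energy spread out
```

The net figure is positive precisely because the same quantity of energy counts for more entropy at the lower temperature, and that positive balance is all the second law asks.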


*Note that Eddington is often, apparently incorrectly, cited as having said, “Not only is the universe stranger than we imagine, it is stranger than we can imagine.” There is no record of him ever writing that. Myk correctly cites J. B. S. Haldane as having said, “The universe is not only queerer than we suppose, but queerer than we can suppose.”

1 comment:

Big Myk said...

I must admit: I was one of the uninformed who thought that entropy was the tendency of the universe to become disordered. So, now I am enlightened.

I think that the misunderstanding of entropy can be explained by a poem written by the great physics teacher Mark Zemansky:

Teaching thermal physics
is as easy as a song:
You think you make it simpler
when you make it slightly wrong.


Coincidentally, I am now reading a book called The Most Powerful Idea in the World: A Story of Steam, Industry & Invention. Right now, I am learning about James Watt and his improvements to the steam engine. One thing that Watt spent a lot of time on was devising ways of obstructing the second law of thermodynamics to make the steam engine more efficient. In the existing steam engine, the cylinder that housed the piston had no covering to reduce heat loss. Watt not only cased the cylinder in non-conducting material, such as wood, but also added a layer of steam between the cylinder proper and an outer shell.

You may be interested that his greatest innovation was to condense the steam in a chamber separate from the cylinder that housed the piston. The steam engine, as it existed, created a vacuum by cooling steam. That vacuum pulled the piston down (by air pressure) and drove the engine. The engine, however, was inefficient because it cooled the steam in the same cylinder that was heated to create the steam. This ended up using a lot of extra energy. Watt solved the problem by arranging for the condensation to occur in a chamber separate from the cylinder but connected to it. Steam entered the cylinder and pushed the piston upward. That also opened a valve connecting the cylinder to the condenser chamber, drawing the steam down into the chamber and condensing it to water, thereby creating the vacuum. Now the cylinder no longer needed to be reheated from a sub-212°F temperature with every stroke.