All About Entropy… – Part I – Beginnings


If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations – then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation… – Sir Arthur Eddington

Entropy is a subject that fascinates me the most, as it is rooted in thermodynamics, biology, information theory, the formation of galaxies and the big bang, “the arrow of time”, and much, much more…
In the following set of posts I shall try to touch upon those many roots.

I am not in any way a world-renowned expert on the subject (far from it…😉 ), so these posts will by no means be all-encompassing or highly accurate (you can get that from very in-depth academic resources…), nor are they intended to be…

I therefore don’t presume this blog will be of benefit to everyone; I wrote it mainly to solidify my own understanding.

I hope for the essence to be captured, and even more so communicated…✨✨✨

 

The principle underlying irreversible processes is summed up in a few words, known as the Second Law of Thermodynamics:

The entropy of an isolated system either remains constant or increases with time

The development of the second law of thermodynamics is kind of an odd one. Although our modern definition of entropy was proposed by the Austrian physicist Ludwig Boltzmann in 1877, the concept’s actual use dates even further back, to the mighty German physicist Rudolf Clausius.

Even more interesting is that the second law itself goes back even earlier, to the French military engineer Sadi Carnot in 1824. Carnot, his pride a little wounded by the reality of the English being ahead of the French in steam-engine technology, took on the task of finding out how efficient such a steam engine could possibly be; in other words, how much useful work one could do by burning a certain amount of fuel.

The basic operation of the steam engine is quite simple: steam expands when heated and so pushes outward. A steam engine harnesses this action by heating a vessel filled with steam that is capped by a fitting piston, free to slide up and down along the vessel’s inner surface. As the heated steam expands, it pushes against the piston, and the outward thrust can drive a mechanical object such as a wheel to turn. Then, having expended energy through this outward movement, the steam cools and the piston slides back to its initial position, where it stands ready to be pushed when the steam is heated again. The result is a cycle that can repeat itself so long as there is burning fuel to heat the steam.
It took an intellectual leap from real machines to “idealized heat engines” for Carnot to demonstrate that, by minimizing the amount of wasted heat, there is a best possible engine that gets the most work out of a given amount of fuel while operating between given temperatures. Carnot actually figured out that even the most efficient engine possible is not perfect: some energy turns to waste along the way.

Ideal Carnot Engine (p-V diagram)
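To make Carnot’s bound concrete, here is a minimal Python sketch of the ideal Carnot efficiency, η = 1 − T_cold/T_hot, the hard cap on the fraction of heat any engine can convert into useful work (the 450 K / 300 K temperatures are illustrative numbers of my choosing, not anything specific from Carnot):

```python
# Carnot efficiency: the best possible fraction of heat converted into work
# by any engine running between a hot and a cold reservoir.
# Temperatures must be absolute (kelvin).

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Ideal (Carnot) efficiency: eta = 1 - T_cold / T_hot."""
    if t_cold <= 0 or t_hot <= t_cold:
        raise ValueError("Require 0 < T_cold < T_hot (in kelvin).")
    return 1.0 - t_cold / t_hot

# Illustrative numbers: steam at ~450 K exhausting into a ~300 K environment.
eta = carnot_efficiency(t_hot=450.0, t_cold=300.0)
print(f"Best possible efficiency: {eta:.1%}")  # ~33.3%
```

Note that even this idealized engine turns only a third of the heat into work in this example; the rest is unavoidably dumped into the cold surroundings.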

It was the German physicist Rudolf Clausius who understood, about 25 years later, what was about to be reflected as a law of nature:

Heat doesn’t spontaneously flow from cold bodies to warm ones

In 1850 Clausius reformulated this statement (so obvious it took me just one sentence to state it above) in terms of a new quantity, which he called “Entropy”.

Simply put, it goes something like this: take an object and gradually cool it down, so that it emits heat into its surroundings. As this process happens, consider at every moment the amount of heat being lost, and divide it by the temperature of the object. The entropy is then the accumulated amount of this quantity; in other words, the heat lost divided by the temperature, summed over the course of the entire process.
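In symbols (a standard textbook form of Clausius’s definition, added here for clarity rather than taken from the original post):

$$\Delta S \;=\; \int \frac{\delta Q}{T}$$

where $\delta Q$ is the small amount of heat exchanged at absolute temperature $T$, and the integral accumulates this ratio over the course of the entire process.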

By doing that, Clausius proved that the tendency of heat to flow from hot objects to cold ones is exactly equivalent to the claim that the entropy of a closed system can only ever go up. We may also define an equilibrium configuration as one in which the entropy has reached its maximum value; this is exactly the case where all the participating objects in contact are at the same temperature.
Summing up, it could simply be said that the entropy measures the uselessness of a certain amount of energy.

Clausius equality

A little word of caution is in order, though. The Second Law of Thermodynamics doesn’t mean that the entropy of a system can never decrease. We may, if we want, invent a machine that separates the vodka from the Kahlúa coffee liqueur and the heavy cream in a “White Russian” cocktail after Big Lebowski stirs it… The trick is that we can only decrease the entropy of Big Lebowski’s White Russian by creating more entropy elsewhere.

“Careful man, there’s a beverage here!” — The Dude

The Statistical Era – Ludwig Boltzmann

Carnot’s and Clausius’s insight into this astounding law of nature was more of a phenomenological one.

The great leap in the understanding of entropy came only after the realization that matter is made of atoms, and with the birth and rise of “statistical mechanics”.
The concept of emergence (brought up in Some Fundamental Thoughts About Turbulence Modeling…) rises again, so you are advised to go back and read that former post, although for the sake of what follows I shall repeat some aspects of the emergence concept.

A description of emergence

The most fundamental description of nature is referred to as “microscopic”, while there are separate rules that apply only to large systems, referred to as “macroscopic”. The latter are emergent rules. The behavior of pressure, for example, can be understood in terms of atoms: it is simply the averaged motion of trillions of molecules that slam into a surface each second. But it can equally well be understood without knowing anything whatsoever about those trillions of banging molecules: that’s the phenomenological approach known as “thermodynamics”.
It is very common in physics to treat complex, macroscopic systems where regular patterns emerge from underlying microscopic rules. There is no competition between fundamental physics and the study of emergent phenomena; both are fascinating and crucially important to our understanding of nature. Isn’t turbulence, as an emergent phenomenon, interesting?…
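To see this in miniature, here is a toy Python sketch (the gas parameters are illustrative assumptions of mine, nothing from the post): the macroscopic pressure emerges from averaging the “banging” of a million simulated molecules, and matches the thermodynamic ideal-gas law without any molecule-level bookkeeping on the macroscopic side.

```python
import numpy as np

# Emergence in miniature: pressure as the averaged "banging" of molecules.
# Kinetic theory gives p = (N/V) * m * <vx^2>; thermodynamics gives
# p = N * k_B * T / V. The two should agree.

k_B = 1.380649e-23   # Boltzmann constant [J/K]
m   = 6.63e-26       # molecular mass [kg], an argon-like illustrative choice
T   = 300.0          # temperature [K]
N   = 1_000_000      # number of simulated molecules
V   = 1.0e-3         # container volume [m^3]

rng = np.random.default_rng(0)
# Maxwell-Boltzmann: each velocity component is Gaussian with variance k_B*T/m.
vx = rng.normal(0.0, np.sqrt(k_B * T / m), size=N)

p_micro = (N / V) * m * np.mean(vx**2)   # microscopic (statistical) estimate
p_macro = N * k_B * T / V                # macroscopic (phenomenological) law

print(f"kinetic-theory pressure: {p_micro:.4e} Pa")
print(f"ideal-gas-law pressure : {p_macro:.4e} Pa")
```

Two routes, one number: that is emergence in a nutshell.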

Turbulent flow around a car (ANSYS CFD simulation)

Ludwig Boltzmann’s triumph was in formulating a microscopic understanding of entropy through the kinetic theory of gases.

Boltzmann realized that when we look at some macroscopic system, we can’t keep track of the exact properties of every single atom. If we had a glass of Big Lebowski’s White Russian in front of us, then somehow snuck in while he’s stoned and switched some of the White Russian’s molecules around without changing the overall temperature and density and so on, he would never notice once he wakes up (and not only because he was stoned… 😉). There are simply many different arrangements of particular atoms that are indistinguishable from Big Lebowski’s macroscopic perspective.
Boltzmann then noticed that low-entropy configurations are rarer with respect to such rearrangements: far fewer microscopic arrangements correspond to them. In other words, Big Lebowski would (most likely) notice if he suddenly found his fully stirred White Russian separated back into its ingredients, one on top of the other…

So Boltzmann took the concept of entropy, which had been defined by Clausius and others as a measure of the uselessness of energy, and redefined it in terms of microscopic entities:

Entropy is a measure of the number of particular microscopic arrangements that appear indistinguishable from a macroscopic perspective.

Let’s try to explain that with a simple analogy:
Imagine you vigorously shake a bag containing a hundred coins and then dump them out on your floor. If you found that all hundred coins were heads, you’d surely be surprised. Let’s think it through… not drawing even a single tail means each of the hundred coins, randomly flipping, must hit the floor and land heads up. All of them. Whoooo… Getting that unique outcome is quite far-fetched, to say the least.
Now let us assume a slightly different outcome: a single tail (with the other 99 coins still landing heads). There are a hundred different ways this can happen: the lone tail could be the first coin, it could be the second coin, the third, and so on up to the last coin. This means that getting 99 heads is a hundred times more likely than 100 heads.
A small calculation shows that there are 4,950 different ways we can get two tails, and with a little more calculating (e.g., a calculator…) we find that there are 161,700 different ways to have three of the coins come up tails, then about 4 million ways to have four tails, and about 75 million ways to have five tails. The likelihood sure goes up fast…
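These counts are just binomial coefficients, C(100, k). A quick sanity check with Python’s standard library (a small sketch I added for verification) reproduces the numbers above:

```python
from math import comb

# Number of distinct ways to get exactly k tails among 100 flipped coins.
for k in range(6):
    print(f"{k} tails: {comb(100, k):,} arrangements")

# Output:
# 0 tails: 1
# 1 tails: 100
# 2 tails: 4,950
# 3 tails: 161,700
# 4 tails: 3,921,225    (the "about 4 million")
# 5 tails: 75,287,520   (the "about 75 million")
```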


 

On Ludwig Boltzmann’s grave in the Zentralfriedhof, Vienna, the equation S = k log W is inscribed (the log seems quite natural given the huge rise in likelihood we saw in the example). It is his formula for entropy in terms of the number of ways you can rearrange the microscopic components of a system without changing its macroscopic appearance.

Boltzmann’s grave in the Zentralfriedhof, Vienna
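Applying the formula to the coin example (a sketch under my own convention of setting k = 1, with W = C(100, k) playing the role of the number of indistinguishable microscopic arrangements) shows why the logarithm is such a natural choice:

```python
from math import comb, log

# Boltzmann's S = k * log(W) with k set to 1 (natural, dimensionless units).
# W = number of ways to place a given number of tails among 100 coins,
# i.e. the number of microscopic arrangements per macrostate.
for tails in (0, 1, 5, 50):
    W = comb(100, tails)
    print(f"{tails:>2} tails: W = {W:.3e},  S = log W = {log(W):6.2f}")

# W spans about 29 orders of magnitude between 0 and 50 tails,
# yet S only grows from 0 to roughly 66: the log tames the explosion.
```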

 


This sums up Part I of “All About Entropy…”. Hope you enjoyed it… 🙌

In further parts we shall delve more deeply into the thermodynamic applications of the law, and take a deep look at Claude Shannon’s information-theory interpretation of entropy as a basic quantity associated with any random variable, one which can be interpreted as the average level of “information”, “surprise”, or “uncertainty” inherent in the variable’s possible outcomes. Following that, we shall touch on exceptional topics relating to entropy such as the formation of life, galaxies, and even “The Arrow of Time”.

Back to “All About CFD…” Index

 






