Demystifying the second law of thermodynamics (erischel.com)
55 points by akakievich on Nov 24, 2020 | 38 comments


It’s probably been 15 years since I looked in detail at a formal treatment of statistical mechanics, so maybe this is well-trodden territory, but... has anyone formalized the idea that the second law can be interpreted as a general inability of computer programs to predict one another? I’m also channeling Wolfram here, but I’m not sure if he ever expressed it exactly this way. But the thing I’m imagining would be: set up some (preferably discrete) dynamical system, and then another, different kind of system that you fit variationally to the first. The claim would be that the mutual information between the state variables of the second and the first will always gradually decrease — unless they are exactly the same family, in which case the fit rediscovers the exact dynamics and the two systems are identical. So intuitively, the second law would correspond to the claim that different families of programs cannot accurately simulate each other for very long. In the special case of a coarse graining, this should give you the more familiar story about entropy and the ‘gap’ between macrostates and microstates, but the real story is a more general one than that: an empirical fact about the “computational universe”, with physics as a special case.
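
Here's a minimal numerical sketch of the kind of experiment I mean (Python/NumPy; the choice of the logistic map as "system A" and a finite-precision coarse-graining of it as "system B" is arbitrary, just for illustration):

    import numpy as np

    rng = np.random.default_rng(0)

    def step_a(x):
        # "system A": logistic map at r=4, fully chaotic
        return 4.0 * x * (1.0 - x)

    def step_b(x, bits=12):
        # "system B": the same map, but its state is rounded to `bits`
        # of precision each step -- a stand-in for a model from a
        # different (finite-precision / coarse-grained) family
        q = 2.0 ** bits
        return np.round(step_a(x) * q) / q

    def mutual_information(a, b, bins=32):
        # histogram estimate of I(A;B) in bits
        joint, _, _ = np.histogram2d(a, b, bins=bins, range=[[0, 1], [0, 1]])
        p = joint / joint.sum()
        pa, pb = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        return (p[nz] * np.log2(p[nz] / np.outer(pa, pb)[nz])).sum()

    # evolve an ensemble from shared initial conditions; I(A_t;B_t) decays
    a = rng.random(100_000)
    b = a.copy()
    for t in range(12):
        print(f"t={t:2d}  I(A;B) = {mutual_information(a, b):.2f} bits")
        a, b = step_a(a), step_b(b)

Chaos amplifies the rounding error by roughly a factor of two per step, so the mutual information between the two systems' states collapses after about `bits` steps.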


Yeah! You can derive the second law from "conservation of information".

This is a great resource about thermodynamics which goes into some of these ideas: http://www.av8n.com/physics/thermo


Hey Adele! Thanks for the resource. Here's something a bit wilder I'm still trying to wrap my head around:

https://www.sciencedirect.com/science/article/abs/pii/S09252...


Sounds reminiscent of the data processing inequality https://en.wikipedia.org/wiki/Data_processing_inequality
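
For the curious, a toy numerical check (Python/NumPy; the 0.1 flip probabilities are arbitrary) that I(X;Z) <= I(X;Y) for the Markov chain X -> Y -> Z of binary symmetric channels:

    import numpy as np

    def mi_bits(joint):
        # mutual information (bits) from a 2x2 joint count table
        p = joint / joint.sum()
        px, py = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        return (p[nz] * np.log2(p[nz] / np.outer(px, py)[nz])).sum()

    rng = np.random.default_rng(1)
    n = 200_000
    x = rng.integers(0, 2, n)
    y = x ^ (rng.random(n) < 0.1)   # X -> Y: flip each bit w.p. 0.1
    z = y ^ (rng.random(n) < 0.1)   # Y -> Z: further processing adds noise

    i_xy = mi_bits(np.histogram2d(x, y, bins=2)[0])
    i_xz = mi_bits(np.histogram2d(x, z, bins=2)[0])
    print(f"I(X;Y) = {i_xy:.3f} bits >= I(X;Z) = {i_xz:.3f} bits")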


>has anyone formalized the idea that the second law can be interpreted as a general inability of computer programs to predict one another?

Arieh Ben Naim thinks entropy is better described by Shannon's Measure of Information. I haven't actually read any of his books, but I've been meaning to "one of these days".

http://ariehbennaim.com/books/index.html


Frustratingly, that list contains almost no usable links. It's as if he doesn't want you to actually read or understand what he is saying.


  - Entropy Demystified: The Second Law Reduced To Plain Common Sense 
(https://www.amazon.com/Entropy-Demystified-Second-Reduced-Co...)

  - A Farewell to Entropy 
(https://www.amazon.com/Farewell-Entropy-Statistical-Thermody...)

  - Discover Entropy and the Second Law 
(https://www.amazon.com/Discover-Entropy-Second-Law-Thermodyn...)

  - Entropy And The Second Law: Interpretation And Misss-Interpretationsss 
(https://www.amazon.com/Entropy-Second-Law-Interpretation-Mis...)

  - Information, Entropy, Life and the Universe 
(https://www.amazon.com/Information-Entropy-Life-Universe-Wha...)

  - Entropy: The Truth, The Whole Truth, And Nothing But The Truth 
(https://www.amazon.com/Entropy-Truth-Whole-Nothing-But/dp/98...)

  - The Four Laws That Do Not Drive The Universe: Elements Of Thermodynamics For The Curious And Intelligent 
(https://www.amazon.com/Four-Laws-That-Drive-Universe/dp/9813...)

  - TIME'S ARROW (?): The Timeless Nature of Entropy and the Second Law of Thermodynamics 
(https://www.amazon.com/TIMES-ARROW-Timeless-Entropy-Thermody...)

  - Entropy for Smart Kids and their Curious Parents 
(https://www.amazon.com/Entropy-Smart-their-Curious-Parents/d...)


Thanks, I do appreciate you extracting this. I did follow one or two of these originally. But I'm not going to buy an ebook on Amazon to figure out if he has anything useful to say.


LibGen?

"Shannon's Measure of Information and the Thermodynamic Entropy" https://doi.org/10.1063/1.3703629


> has anyone formalized the idea that the second law can be interpreted as a general inability of computer programs to predict one another?

This sounds like the halting problem.


Wolfram's "computational irreducibility" is basically the halting problem for physics.


There's something about attempts to explain entropy that feels entirely inadequate.

In science, a "law" isn't something that's true, just something that we haven't observed any exceptions to (yet). However, entropy is, in larger systems, entirely unmeasurable as far as I understand (is there even a unit for it?).

Rather than tackle these scientific shortcomings head-on, most tend to gloss over them. Perhaps a good place to start is to talk about the limited places where we can observe it, and the logical reasons that our observations in these limited places must apply more generally. And then to explain whether a unit exists or not, and why it matters so much.


Entropy is a measure of the uncertainty that an observer has over the microstates (i.e. the exact state of every atom) given their knowledge of the macrostate (i.e. what they are capable of describing: like "the water is just above freezing"). So it's inherently a subjective concept. The most common unit for entropy is the bit, same unit as information.

Anyway, entropy is measurable just like any other physical quantity. You need to determine what your model of the microstates is (often an "ideal gas", with independent molecular point particles with their own positions and velocities, and all interactions are perfectly elastic collisions), what you currently know (stuff like type of gas, pressure/volume/temperature), and then there is a clear answer to what the entropy is.
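
As a concrete (if idealized) Python sketch of that procedure: the Sackur-Tetrode entropy of a monatomic ideal gas, computed from pressure, volume, temperature, and molecular mass. The helium numbers are just for illustration.

    import math

    k_B = 1.380649e-23    # Boltzmann constant, J/K
    h   = 6.62607015e-34  # Planck constant, J*s

    def ideal_gas_entropy_bits(p, v, t, m):
        # Sackur-Tetrode: S/k_B = N * (ln(V / (N * lambda^3)) + 5/2)
        n = p * v / (k_B * t)                            # number of molecules
        lam = h / math.sqrt(2 * math.pi * m * k_B * t)   # thermal wavelength
        s_nats = n * (math.log(v / (n * lam ** 3)) + 2.5)
        return s_nats / math.log(2)                      # nats -> bits

    # ~one mole of helium (m = 6.6e-27 kg) at 1 atm, 22.4 L, 298 K:
    print(f"{ideal_gas_entropy_bits(101325.0, 0.0224, 298.0, 6.6e-27):.2e} bits")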

The true law underlying the second law of thermodynamics is conservation of information. In (classical) physics, this is typically cast as Liouville's theorem, which shows that the volume of a region of phase space must remain constant as the system evolves. (In quantum mechanics, it's only true for unitary transformations, which may or may not be everything depending on your interpretation of QM.)
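
For reference, the theorem states that the phase-space density \rho(q, p, t) is constant along trajectories of a Hamiltonian H:

    \frac{d\rho}{dt}
      = \frac{\partial \rho}{\partial t}
      + \sum_i \left( \frac{\partial \rho}{\partial q_i}\,\dot{q}_i
                    + \frac{\partial \rho}{\partial p_i}\,\dot{p}_i \right)
      = \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0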

Anyway, if you're curious about learning more, I highly recommend http://www.av8n.com/physics/thermo which is an amazing online (and free) book that clarifies the concepts of thermodynamics brilliantly.


>> Anyway, entropy is measurable just like any other physical quantity

Uh, I measure the mass of a baseball by putting it on a scale, and I get an objective number in kilograms. Similar for velocity, temperature, and volume.

So far as I know, entropy is pretty unique in that there isn't, and never will be, an instrument that gives the number of bits in a baseball.

>> So it's inherently a subjective concept.

Yes, this is my point. Almost nothing in physics is subjective.


Yeah, it is different, but once you know what you're after you can measure. It's a "type error" to ask for the number of bits of entropy in a baseball, but if you ask for the bits of entropy of a baseball given everything you know about the baseball (i.e. mass, composition, temperature), then you can measure it. You could even design a special instrument which makes relevant measurements of observable quantities and then calculates the entropy of a given object from those.

Temperature is actually defined from entropy: it's the change in energy per change in entropy. So it too is inherently subjective. One way to think about this is that to a simulator or god outside our universe, who can precisely see everything happening in it, the temperature and entropy of everything is exactly zero (of course, they would be able to predict what we would measure it as). To them, the level of a thermometer would be simply a mechanical consequence of all the particles nudging it to that exact place (as you would conclude if you saw someone pump the mercury up the tube -- you wouldn't think it must have gotten much hotter suddenly).
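
In symbols, with E the energy and S the entropy:

    \frac{1}{T} = \left( \frac{\partial S}{\partial E} \right)_{V,\,N}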


> So far as I know, entropy is pretty unique in that there isn't, and will never be an instrument that gives the number of bits in a baseball.

Nonsense. Black hole entropy is measurable in exactly the same way: put it on a scale, get mass in kilograms, from mass compute radius and area, and voila, you've got an entropy.
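
A Python sketch of that recipe, using the Bekenstein-Hawking formula S = k_B * A * c^3 / (4 * G * hbar), with rounded constants; the solar-mass input is just an example:

    import math

    G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.998e8     # speed of light, m/s
    hbar = 1.055e-34   # reduced Planck constant, J*s
    k_B  = 1.381e-23   # Boltzmann constant, J/K

    def black_hole_entropy_bits(mass_kg):
        r = 2 * G * mass_kg / c ** 2               # Schwarzschild radius
        area = 4 * math.pi * r ** 2                # horizon area
        s = k_B * area * c ** 3 / (4 * G * hbar)   # entropy in J/K
        return s / (k_B * math.log(2))             # J/K -> bits

    print(f"{black_hole_entropy_bits(1.989e30):.2e} bits")  # one solar mass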


Could you recommend more like that?


Everything on that website is good like that, but unfortunately I don't have more like it to recommend.


In classical thermo, entropy has units of J/K. It's the negative of the derivative of the Helmholtz free energy with respect to temperature (at fixed volume). I think experimentally one generally measures the heat capacity as a function of temperature, from which you can obtain the entropy.
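
A minimal Python sketch of that measurement, assuming the third-law relation S(T) = integral from 0 to T of C(T')/T' dT'; the heat-capacity curve here is invented for illustration:

    import numpy as np

    def entropy_from_heat_capacity(temps, heat_caps):
        # trapezoid-rule integral of C(T)/T dT, in the units of C (J/K)
        integrand = heat_caps / temps
        return float(np.sum(0.5 * (integrand[1:] + integrand[:-1])
                            * np.diff(temps)))

    t = np.linspace(1.0, 300.0, 3000)               # K; start above 0 K
    c = 1.0e-3 * t ** 3 / (1.0 + (t / 100.0) ** 3)  # toy Debye-ish C(T), J/K
    print(f"S(300 K) = {entropy_from_heat_capacity(t, c):.1f} J/K")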


I've tackled this problem. Basically most people, even smart people, don't understand entropy correctly.

Entropy is not a law. It is a derivation. You can derive entropy mathematically from other axioms we assume are true in reality.

Let me state it plainly in a sentence that is not entirely correct but will help elucidate the meaning of entropy:

Systems tend toward disorder because there are more possible disordered configurations than there are ordered configurations.

This is entropy (sort of).

If your system is a bunch of 6-sided dice, one ordered configuration is rolling a 1 on all the dice. Another ordered configuration is rolling a 5 on all the dice. A more likely, but still ordered, configuration is rolling either a 1 or a 2 on every die.

There are numerous ordered configurations of a system, but the number of unordered outcomes is by far greater than the number of ordered outcomes. If you have a system of dice all facing up with 1s, the system is in a low-entropy, ordered state. If you shake all the dice, it is more likely for them to come out in a random state, all with random numbers. Shaking the dice increases entropy. And because you relate the dice to probability, it's easy to see how, by probability, the dice should trend toward more disordered states.

The same phenomenon holds for particles in a box. Instead of the numbers on the sides of the dice, take the position of each particle. Shaking the dice is exactly isomorphic to particles traveling in random directions. By probability, dice will roll random, disordered numbers, just as, by probability, randomly moving particles will move into randomly disordered positions.

The same probabilistic intuition you have for why throwing a bunch of bricks at the ground won't by accident construct a house is exactly the intuition behind probability AND, as a result, entropy.

The real phenomenon here is probability. Why does it apply to reality? We don't know. But entropy is just a consequence of it. Entropy simply says (again, sort of) that systems tend toward more disordered states because disordered states are more numerous and therefore more probable.

The axiom, or "law" as you call it, is probability; entropy is a theorem, a derivation that is a consequence of probability.

This is just a really simple explanation of entropy. Entropy is actually more general than this: it applies even to systems where "ordered" states outnumber "unordered" states. You can see my other explanations in this thread, where I get into it more deeply.
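
A tiny Python illustration of the counting, taking "ordered" to mean all dice show the same face:

    n_dice = 10
    total = 6 ** n_dice   # all possible microstates
    ordered = 6           # all-1s, all-2s, ..., all-6s
    print(f"{ordered} ordered vs {total - ordered:,} disordered microstates")
    print(f"P(ordered after a shake) = {ordered / total:.1e}")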


For some actual demystification, for those more computationally inclined: https://www.falstad.com/gas/


Entropy is a phenomenon of probability. This is a very incorrect way to put it, but it gets closer to the intuition behind the definition:

   Disorder is more probable than order; that's why things trend toward disorder. 
However, it is only a probabilistic phenomenon, so technically low-probability things can still occur. This is what the article is saying, and he proves it with a bunch of math.

But the above is still technically incorrect, because entropy has nothing to do with disorder per se. Entropy is defined relative to a "system", and we can actually pick systems where entropy goes up while things become more organized and more ordered!

Keep in mind that a system is as much something that exists as it is a set of properties we can arbitrarily define.

Say a system of loaded dice:

If my system consists of 100 weighted dice that land 6-up almost every time, then entropy increases as I roll all the dice, but the system trends toward order: more sixes facing up.

Another system that trends toward order is the solar system. Gravity pulls particles to form ordered circular orbits and ordered spheres/planets because that is the way the system is configured. You have a bunch of random atoms, but like the weighted dice, gravity pushes the system toward order and organization. But entropy is increasing!

Ever wonder why people say entropy always increases, yet you see weird ordered stuff in nature all the time, like snowflakes and crystals? In those cases entropy is still increasing but producing order.

Basically, in the systems I described above, ordered configurations are more probable than disordered configurations; thus, while "order" is increasing, entropy is also increasing. Once you understand this, you'll have a better intuition for entropy and why it doesn't have to do with "disorder."
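
A quick Python sketch of the weighted-dice point (the 0.95 loading and 100 dice are arbitrary): shaking raises the Shannon entropy of the state even though the typical outcome looks more "ordered":

    import math

    def shannon_bits(p):
        return -sum(q * math.log2(q) for q in p if q > 0)

    die = [0.01] * 5 + [0.95]   # loaded die: shows a 6 with prob 0.95
    n_dice = 100

    before = 0.0                          # dice placed by hand: state known
    after = n_dice * shannon_bits(die)    # independent dice after a shake
    print(f"before shake: {before} bits, after: {after:.1f} bits")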

People who say things like "whenever you see something self-organize, something else in the universe must become more disordered" don't know what they're talking about. Yes, entropy trends toward higher values, but this is not equivalent to disorder.

As for life on earth: life on earth is a low-probability phenomenon. Something else is going on in that case, but that's another discussion.


The following papers make the argument that, at least for systems with a low initial entropy, "within a quantum mechanical framework, all phenomena which leave a trail of information behind (and hence can be studied by physics) are those where entropy necessarily increases or remains constant. All phenomena where the entropy decreases must not leave any information of their having happened. This situation is completely indistinguishable from their not having happened at all."

----

Quantum Solution to the Arrow-of-Time Dilemma

Lorenzo Maccone

Phys. Rev. Lett. 103, 080401 – Published 17 August 2009

The arrow-of-time dilemma states that the laws of physics are invariant for time inversion, whereas the familiar phenomena we see everyday are not (i.e., entropy increases). I show that, within a quantum mechanical framework, all phenomena which leave a trail of information behind (and hence can be studied by physics) are those where entropy necessarily increases or remains constant. All phenomena where the entropy decreases must not leave any information of their having happened. This situation is completely indistinguishable from their not having happened at all. In the light of this observation, the second law of thermodynamics is reduced to a mere tautology: physics cannot study those processes where entropy has decreased, even if they were commonplace.

https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.10...

----

[Submitted on 30 Dec 2009]

A quantum solution to the arrow-of-time dilemma: reply

Lorenzo Maccone

I acknowledge a flaw in the paper "A quantum solution to the arrow of time dilemma": as pointed out by Jennings and Rudolph, (classical) mutual information is not an appropriate measure of information. This can be traced back to the quantum description underlying my analysis, where quantum mutual information is the appropriate measure of information. The core argument of my paper (summarized in its abstract) is not affected by this flaw. Nonetheless, I point out that such argument may not be adequate to account for all phenomena: it seems necessary to separately postulate a low entropy initial state.

https://arxiv.org/abs/0912.5394


Perhaps the best way I began to understand entropy was by reading about tail inequalities and divergences. In particular, we can view entropy as a sort of “distance to the uniform distribution.” (Specifically, an additive factor away from the KL divergence between the uniform distribution and the distribution under measurement.) Studying distributions from this perspective yields a bunch of important consequences, like: how many samples do you need to tell one distribution from another? Or: how many bits are needed to approximately describe the distribution, from a more general family? Surprisingly, being able to answer these two questions in a precise sense gives you a whole load of tools for fun things like: can an algorithm decide a question with k bits of information? Or can two physical systems diverge in behavior in a meaningful way after T time?
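
In symbols, for a distribution p over n outcomes, with u the uniform distribution on those outcomes:

    H(p) = \log n - D_{\mathrm{KL}}(p \,\|\, u)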

For the (fairly) mathematically inclined reader, I very highly recommend Massart’s Concentration Inequalities and Model Selection.


> you’re probably never going to observe entropy going down in your life

Except when you tidy up your room.


But you're turning food into poop, and exhaling carbon dioxide, as you do so...


This is speaking of entropy and states of gases.

Isn't there some generalization of entropy such that, even if sentient life-forms tried their damnedest, or even in black holes, entropy must always trend up? Or was that a mischaracterization?


Ask three thermodynamics engineers, get five answers. The generalization you're referring to is pretty accurate, but according to some interpretations it's just assumed to exist a priori and then we look for evidence of it.

But while it's unclear whether entropy is the result of some universal law or a hack we use to make the math work out, it definitely does make the math work out in closed systems of gases.


Entropy is a phenomenon of probability. It works because probability works.

People think entropy is some fundamental natural phenomenon.

No. Entropy is a consequence of probability. The math works because probability happens to apply to nature. Below is the intuition behind why entropy occurs... once you realize this it will make sense.

   Disordered states tend to be more numerous than ordered states; that's why any random configuration of gas particles in a system is more likely to be disordered.

   Also, when you try to perturb the state of random particles from state1 to state2, by probability state2 will be disordered, because there are far more possible disordered states than ordered states... even when the initial state is ordered. Hence entropy increases. 
The philosophical thing people should be examining is the nature of probability and reality, which we already know is intrinsically tied to the nature of fundamental particles.

Probability is an axiom; entropy is a theorem derived from that axiom. Calling it a "law" makes it seem fundamental, but it's not: entropy can be derived assuming probability is true.


This seems like a good angle from which to approach the question. To my way of conceptualising it, entropy is something like the inverse of data. What makes one state less ordered than another? The more ordered state requires less data to be fully described.

In that sense there is an underlying reality to entropy, but it's impossible to be sure: an ordered state could seem disordered if we lack the data that would describe it. I think we understand entropy to the same degree we understand data, and as data science develops I hope we'll develop a better understanding of entropy as a consequence.

Heat makes for a good proxy though.


Your intuition gets you to the right place but I think it is a bit flawed.

Entropy must be described relative to a system and an arbitrary definition of macrostates and microstates. It actually has nothing to do with disorder or order; I used the term previously because it helps with intuition, but it is actually categorically wrong.

For example, take a system of 5 loaded dice that roll a six 99% of the time, with a microstate defined as the value of each die after a roll and a macrostate defined as the number of 6s.

With this system, entropy goes up as you roll more sixes. Order also goes up with entropy. Entropy is an arbitrary concept that is defined relative to your choice of a "system", "macrostates", and "microstates." You can choose systems that have higher probabilities of being ordered, like, say, magnetic cubes in a box versus regular cubes. The magnetic cubes are more likely to be stacked perfectly, and that is defined as a higher-entropy state even though there is "less" information needed to "describe" it.

I never read about information theory, but I'm assuming that the choice of "system" in information theory is usually pretty simple, as in you can have numbers like the dice, but you usually don't work with "loaded" dice. So consider the case where each microstate of each entity has equal probability of occurring. In this case there is a direct relationship between how far the data describing something can be compressed and the entropy of that something: the higher the entropy, the less it can be compressed.

Information theory places idealized restrictions on microstates... However, this is not the case in nature. If the microstate of our universe of atoms is defined as the Cartesian coordinates of each atom, then atoms have a tendency to coalesce into spheres (planets, stars, black holes) due to gravity, and as a result each microstate does not have equal probability of occurring.

Planets are in fact a higher-entropy state than the cloud of dust that the planet initially started out as. I can now describe all those atoms in compressed form (a macrostate): "planet." This leads to the opposite relationship between compressibility and entropy: in this case, the higher the entropy of a system, the less information is needed to describe it.


Interesting. So when you say that entropy is an arbitrary concept, is it simply determined by the likeliest macrostates? So in the case of planets vs dust clouds, would we say that planets have higher entropy than dust clouds because entropy is defined as increasing with the likeliest states and we know that planets are likelier?

To put it another way, could someone use their understanding of entropy to make predictions about a system's probability distribution beyond what they already know of it? Or is the entropy of a state purely defined by its proximity to a basin of attraction?

I'm reminded of a debate in physics when the time light takes to travel between fixed points changes - a minority prefers to change the definition of C in m/s, while most prefer to change the value of m in the portion of the universe we're in at the time. To ask the same question in yet another way, Is Entropy like C to the majority? If it appears to decrease, it's our understanding of probabilities for the system that need to be reworked?


>To ask the same question in yet another way, Is Entropy like C to the majority? If it appears to decrease, it's our understanding of probabilities for the system that need to be reworked?

The popular intuition of entropy is often wrong and incomplete. The most general definition is here: https://en.wikipedia.org/wiki/Entropy_(information_theory)

You will see it is in line with what I'm talking about. Entropy is an arbitrary concept defined relative to your choice of a system (entities, axioms, and theorems) and the arbitrary macrostates and microstates defined for that system.

Nobody has this down; they all talk about entropy without defining what kind of entropy they're talking about. Usually it's a half-baked definition involving temperature and conservation of energy.

In general, when you add conservation of energy into your system, microstates, and macrostates, it becomes more in line with the popular notion of entropy. Because for atoms to self-organize into planets they must stop moving, but because of conservation of energy that motion must be transferred into something else. So in general, self-organization in one part of the universe must mean another part of the universe gets hotter.

If I define a macrostate and microstate to account for conservation of energy, meaning that my microstate must ensure that if a particle stops moving another one must start moving, then things usually fit the notion of getting more disordered over time.

If I define the microstate to be just the position of atoms then things appear to become ordered over time as the position of atoms coalesce into spheres.

So there are different perspectives on it, and all perspectives are true. It's just that one perspective is examining the phenomenon of conservation of energy and the other is not.


Not sure why you got flagged - I'll have a look at the link you suggested, maybe it'll help.


The laws of physics, as far as we understand and have checked, are exactly the same for sentient life, as they are for "dead" matter.

In any closed system (which might include sentient life) it is impossible for entropy to decrease, no matter how much the sentient life tries to stop this from happening.


The article above just proved it's possible using math. From an intuitive standpoint it's also easy to see why entropy can decrease. However the problem is, most people don't have a good intuition of what entropy is....

Entropy is a probabilistic phenomenon, meaning that entropy can go lower; there is just a low probability of it happening.

So, for example, random particles in a closed system do have the potential to self-organize into a cube; it's just a very, very low probability.
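
To put a number on "very, very low": the probability that all N molecules end up in, say, the left half of the box at once is (1/2)^N. A quick Python check:

    import math

    for n in (10, 100, 6.022e23):   # up to roughly a mole of molecules
        print(f"N = {n:g}: log10(P) = {-n * math.log10(2):.3g}")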

Even that cube example distracts from what entropy truly is, because our solar system is an example of particles self-organizing into spheres rather than cubes.

Here's another example: flat cylindrical magnets will self-organize into stacks. When the magnets are scattered, entropy is lower; when the magnets self-organize into a stack, entropy has reached its highest equilibrium point.

Once you understand why the above occurs in the context of entropy, then you truly understand the concept. Until then, most people don't understand it at all and explain away the self-organization with some temperature increase happening somewhere else in the system.

That logic has a flaw: the system is still self-organizing by shifting temperature to other regions of the system. Moving temperature or disorder to specific regions of a system is still self-organization.


Or you could listen to the Flanders and Swann version: https://www.youtube.com/watch?v=VnbiVw_1FNs


I've got a Dover reprint somewhere wherein this is derived.

Many physical laws are discovered through experiment and tested, again and again. This one just ... arises from math.



