What is Entropy?
September 23, 2024 5:52 AM

John Baez: "Once there was a thing called Twitter, where people exchanged short messages called ‘tweets’. While it had its flaws, I came to like it and eventually decided to teach a short course on entropy in the form of tweets. This little book is a slightly expanded version of that course." [PDF, 129 pages]
posted by Westringia F. (17 comments total) 31 users marked this as a favorite
 
*Confused. Clicks link.*

John, not Joan. OK. Makes more sense. Need more coffee.
posted by The Bellman at 6:00 AM on September 23 [28 favorites]


Joan is John's cousin!
posted by Westringia F. at 6:19 AM on September 23 [14 favorites]


Von Neumann's remark to Claude Shannon, founder of Information Theory.

https://mathoverflow.net/questions/403036/john-von-neumanns-remark-on-entropy

[snip]
...Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'
posted by aleph at 6:41 AM on September 23 [14 favorites]




What is entropy, said Watson as the dandelions pushed through the floor of the bombed-out server room.
posted by aws17576 at 7:01 AM on September 23 [3 favorites]


^ that sounds like it's from the UNIX fortune file . . .
posted by torokunai at 7:54 AM on September 23 [1 favorite]


Fun fact: reading aloud while skimming through "What is Entropy?" can be considered a form of granular synthesis.
posted by Smart Dalek at 8:09 AM on September 23 [1 favorite]


Joan is John's cousin!

Good to know. I initially glossed over the first name and was wondering how the songstress got into teaching about Twitter. (It was an amusing illusion while it lasted.)
posted by BWA at 8:19 AM on September 23


Joan is John's cousin

Mimi is Joan's sister, and she was married to Richard, who was Tom's best friend at university.

Tom is also a well-known expert on entropy
posted by chavenet at 10:12 AM on September 23 [6 favorites]


I just want the LaTeX code for the title page, really.

(Although the title isn't text, so I bet it's just massaged in an external graphic-manipulation program rather than some clever LaTeX wizardry)
posted by jackbishop at 12:28 PM on September 23 [1 favorite]



Joan is John's cousin

Mimi is Joan's sister, and she was married to Richard, who was Tom's best friend at university.

Tom is also a well-known expert on entropy


They were all in love with dying, they were doing it in Texas.
posted by aspersioncast at 2:12 PM on September 23 [16 favorites]


Or maybe there’s a fuzz parameter somewhere in Metafont &c, jackbishop… I’m curious too!
posted by clew at 3:57 PM on September 23


(Have not found a fuzz parameter. Startlingly, sqrt(2) is a parameter.)
posted by clew at 4:03 PM on September 23 [1 favorite]


Many physicists feel that it was a rather unfortunate turn of events that we developed the classical 2nd law of thermodynamics first, and only later got down to the bits and developed the probability-based formulation of entropy. A legacy of this history is that we measure entropy in joules per kelvin (J/K), which hardly makes any sense when you think of information (and this is something universally disliked by physicists). It turned out to be an artificial barrier, making it harder for physicists and computer scientists to talk to each other.

Some argue that we really shouldn't have made entropy a dimensioned quantity, and that it would save us a lot of trouble to just measure temperature in joules per bit. The article in the preceding link didn't reference it, but the same point was raised by Arieh Ben-Naim in his book "Entropy Demystified". This is not the same as setting the Boltzmann constant to 1, like every physicist does: it's about making temperature carry the unit of energy, and freeing entropy from the burden of carrying units at all. Which makes intuitive sense. Temperature is energy (per bit). Entropy is a pure number, a measure of the lack of information.

The only problem is that we've defined our unit systems such that...

While the authors are now adept of "temperature as joules/bit," they unsurprisingly do not use these units to discuss the water temperature when they go swimming. In fact, most of them still use the Fahrenheit. Nevertheless, as 1 K is about 9.57 × 10⁻²⁴ J/bit, their preferred water temperature would be around 2.87 × 10⁻²¹ J/bit.
posted by runcifex at 7:09 PM on September 23 [6 favorites]
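The arithmetic in that quote is easy to check. Here's a minimal Python sketch (mine, not from the linked article) that converts a temperature in kelvin to joules per bit, assuming nothing beyond the exact SI value of the Boltzmann constant:

```python
import math

# Boltzmann constant, J/K (an exact value since the 2019 SI redefinition)
k = 1.380649e-23

# One bit of entropy is k*ln(2) J/K, so a temperature of 1 K corresponds
# to k*ln(2) joules per bit of missing information.
JOULES_PER_BIT_PER_KELVIN = k * math.log(2)  # ≈ 9.57e-24

def kelvin_to_joules_per_bit(t_kelvin):
    """Express a temperature as energy per bit of entropy."""
    return t_kelvin * JOULES_PER_BIT_PER_KELVIN

print(kelvin_to_joules_per_bit(1))    # ≈ 9.57e-24 J/bit, as the quote says
print(kelvin_to_joules_per_bit(300))  # ≈ 2.87e-21 J/bit
```

300 K is about 27 °C, which indeed lands on the quoted 2.87 × 10⁻²¹ J/bit for pleasant swimming water.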


I initially glossed over the first name and was wondering how the songstress got into teaching about twitter.

Presumably via further insights into diamonds and rust.
posted by pattern juggler at 3:51 AM on September 24 [3 favorites]


The difference between entropy in joules per kelvin versus entropy in bits is arguably just a different choice of units for the Boltzmann constant. The entropy of a system with W different configurations is

S = k log W

as you may have read on Boltzmann's tombstone. You can think of it as an equation defining the bit in terms of SI units: (one bit)/(ln 2) = k ≈ 25 meV / 300 K. (1 meV is about 1.602 × 10⁻²² J.)

Including the bit in the SI system would, as I understand it, introduce an inconsistency. The SI already defines "quantity of substance" via the mole, which is precisely 6.02214076 × 10²³ things, fixed exactly (to nine significant figures) at the 2019 redefinition. Likewise the Boltzmann constant is defined exactly, which defines temperature in terms of energy. But that exact number of things per mole is just the best measured value we had at the time of the 2019 redefinition. We know that 2^(10^12) combinations corresponds to exactly a terabit of entropy. I think if you want to count those combinations in moles, and you restrict yourself to rational-number fractions of integer combinations per integer moles, you'll have to give up the exact definition of k. No such experiment is currently practical, however.

Eventually I suspect that the SI will include the bit as a "natural" unit, the way that we now use the natural and reproducible quantized units ℏ, e, etc. rather than macroscopic artifacts.
posted by fantabulous timewaster at 7:38 AM on September 24 [3 favorites]
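S = k log W is a one-line computation in either unit system. A small Python sketch (mine, just to make the constant-factor point above concrete):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K (exact by definition since 2019)

def entropy_bits(W):
    """Entropy of W equally likely configurations, counted in bits."""
    return math.log2(W)

def entropy_joules_per_kelvin(W):
    """The same entropy in SI units: S = k ln W."""
    return k * math.log(W)

W = 2**40  # a modest number of configurations
print(entropy_bits(W))                # 40.0
print(entropy_joules_per_kelvin(W))   # ≈ 3.83e-22 J/K

# The ratio is always k*ln(2) J/K per bit, independent of W:
print(entropy_joules_per_kelvin(W) / entropy_bits(W))  # ≈ 9.57e-24
```

Whether you read the answer off in bits or in J/K, it's the same count of configurations; only the multiplicative constant changes.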


Arnold Sommerfeld supposedly said:

"Thermodynamics is a funny subject. The first time you go through it, you don't understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don't understand it, but by that time you are so used to it that it doesn't bother you any more."

As quoted in: J. Muller, Physical Chemistry in Depth (Springer Science & Business Media, 1992), p. 1. No primary source is given in that book.

Sounds like an early version of: "Just shut up and compute"

https://en.wikiquote.org/wiki/Arnold_Sommerfeld
posted by aleph at 10:59 AM on September 24 [1 favorite]



