Thoughts of Chair-man Moire

“My apples and orange peels question, Sy, isn’t that the same as Jeremy’s?  What’s the connection between heat capacity and counting?”

“You’re right, Anne.  Hmm.  Say, Al, all your coffee shop tables came with four chairs apiece, right?”

“Yup, four-tops every one, even in the back room.”

“You neaten them all up, four to a table, in the morning?”

“The night before.  There’s never time in the morning, customers demand coffee first thing.”

“But look, we’ve got six people seated at this table.  Where’d the extra chairs come from?”

“Other tables, of course.  Is this going somewhere?”

“Almost there.  So in fact the state of the room at any time will have some random distribution of chairs to tables.  You know on the average there’ll be four at a table, but you don’t know the actual distribution until you look, right?”

“Hey, we’re counting again.  You’re gonna say that’s about entropy ’cause the difference between four at a table and some other number is all random and there’s some formula to calculate entropy from that.”

“True, Vinnie, but we’re about to take the next step.  How did these chairs wind up around this table?”

“We pulled them over, Mr. Moire.”

“My point is, Jeremy, we spent energy to get them here.  The more chairs that are out of position — ”

“The higher the entropy, but also the more energy went into the chairs.  It’s like that heat capacity thing we started with, the energy that got absorbed rather than driving the steam engine.”

“Awright, Anne!” from Jeremy <Jennie bristles a bit>, “and if all the chairs are in Al’s overnight position it’s like absolute zero.  Hey, temperature is average kinetic energy per particle so can we say that the more often a chair gets moved it’s like hotter?”

Jennie breaks in.  “Not a bit of it, Jeremy!  The whole metaphor’s daft.  We know temperature change times heat capacity equals the energy absorbed, right, and we’ve got a link between energy absorption and entropy, right, but what about if at the end of the day all the chairs accidentally wind up four at a table?  Entropy change is zero, right, but customers expended energy moving chairs about all day and Al’s got naught to set straight.”
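[If you fancy seeing Jennie’s bookkeeping as actual numbers, here’s a minimal Python sketch.  It assumes a constant heat capacity, takes Q = C·ΔT for the energy absorbed and Clausius’s ΔS = ∫dQ/T = C·ln(T2/T1) for the matching entropy change; the kilogram-of-water figures are purely illustrative, not anything from Al’s shop.]

from math import log

def heat_absorbed(C, T1, T2):
    # Energy absorbed warming from T1 to T2 (kelvin) at constant heat capacity C (J/K)
    return C * (T2 - T1)

def entropy_change(C, T1, T2):
    # Clausius entropy change: integral of dQ/T with dQ = C dT, i.e. C * ln(T2/T1)
    return C * log(T2 / T1)

# Illustrative example: about 1 kg of water (C ~ 4184 J/K) warmed from 20 C to 80 C
C, T1, T2 = 4184.0, 293.15, 353.15
print(f"Q  = {heat_absorbed(C, T1, T2):.0f} J")      # ~251,000 J absorbed
print(f"dS = {entropy_change(C, T1, T2):.0f} J/K")   # ~780 J/K entropy gain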

“Science in action, I love it!  Anne and Jeremy, you two just bridged a gap it took Science a century to get across.  Carnot started us on entropy’s trail in 1824 but scientists in those days weren’t aware of matter’s atomic structure.  They knew that stuff can absorb heat but they had no inkling what did the absorbing or how that worked.  Thirty years later they understood simple gases better and figured out that average kinetic energy per particle bit.  But not until the 1920s did we have the quantum mechanics to show how parts of vibrating molecules can absorb heat energy stepwise like a table ‘absorbing’ chairs.  Only then could we do Vinnie’s state-counting to calculate entropies.”

“Yeah, more energy, spread across more steps, hiding more details we don’t know behind an average, more entropy.  But what about Jennie’s point?”

“Science is a stack of interconnected metaphors, Vinnie.  Some are better than others.  The trick is attending to the boundaries where they stop being valid.  Jennie’s absolutely correct that my four-chair argument is only a cartoon for illustrating stepwise energy accumulation.  If Al had a billion tables instead of a dozen or so, the odds on getting everything back to the zero state would disappear into rounding error.”
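[For the curious, here’s a quick Python sketch of just how fast those odds vanish.  It assumes each chair lands on a table independently and at random, which is only a cartoon of Al’s customers, and asks how likely the perfectly neat four-per-table state is.]

from math import lgamma, log

def log10_prob_all_fours(tables):
    # Probability that randomly scattering 4*tables chairs over `tables` tables
    # leaves exactly four chairs on every table, returned as log10(P)
    chairs = 4 * tables
    # ln P = ln(chairs!) - tables*ln(4!) - chairs*ln(tables)
    ln_p = lgamma(chairs + 1) - tables * lgamma(5) - chairs * log(tables)
    return ln_p / log(10)

for n in (12, 1_000, 1_000_000_000):
    print(f"{n:>13,} tables: P(all fours) ~ 10^{log10_prob_all_fours(n):.0f}")
# Even a dozen tables makes the neat state a long shot;
# a billion tables pushes it to roughly 10^(-700,000,000).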

“How does black hole entropy play into this, Sy?”

“Not very well, actually.  Oh, sure, the two systems have similar structures.  They’ve each got three inter-related central quantities constrained by three laws.  Here, I’ve charted them out on Old Reliable.”

“OK, their Second and Third Laws look pretty much the same, but their First Laws don’t match up.”

“Right, Al.  And even Bekenstein pointed out inconsistencies between classic thermodynamic temperature and what’s come to be called Hawking temperature.  Hawking didn’t agree.  The theoreticians are still arguing.  Here’s a funny one — if you dig deep enough, both versions of the First Law are the same, but the Universe doesn’t obey it.”

“That’s it, closing time.  Everybody out.”

~~ Rich Olcott

Taming The Elephant

Suddenly they were all on the attack.  Anne got in the first lick.  “C’mon, Sy, you’re comparing apples and orange peel.  Your hydrogen sphere would be on the inside of the black hole’s event horizon, and Jeremy’s virtual particles are on the outside.”

[If you’ve not read my prior post, do that now and this’ll make more sense.  Go ahead, I’ll wait here.]

Jennie’s turn — “Didn’t the chemists define away a whole lot of entropy when they said that pure elements have zero entropy at absolute zero temperature?”

Then Vinnie took a shot.  “If you’re counting maybe-particles per square whatever for the surface, shouldn’t you oughta count maybe-atoms or something per cubic whatever for the sphere?”

Jeremy posed the deepest questions. “But Mr Moire, aren’t those two different definitions for entropy?  What does heat capacity have to do with counting, anyhow?”

Al brought over mugs of coffee and a plate of scones.  “This I gotta hear.”

“Whew, but this is good ’cause we’re getting down to the nub.  First to Jennie’s point — under the covers, Hawking’s evaluation is just as arbitrary as the chemists’.  Vinnie’s ‘whatever’ is the Planck length, lP = 1.616×10⁻³⁵ meter.  It’s the square root of such a simple combination of fundamental constants that many physicists think that lP² = 2.611×10⁻⁷⁰ m² is the ‘quantum of area.’  But that’s just a convenient assumption with no supporting evidence behind it.”
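[In case you’d like to check Vinnie’s ‘whatever’ yourself: the Planck length is lP = √(ħG/c³).  A few lines of Python with the standard CODATA constants reproduce the numbers above.]

from math import sqrt

# Fundamental constants, SI units (CODATA values)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newtonian gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

l_P = sqrt(hbar * G / c**3)          # Planck length
print(f"l_P   = {l_P:.3e} m")        # ~1.616e-35 m
print(f"l_P^2 = {l_P**2:.3e} m^2")   # ~2.61e-70 m^2, the conjectured quantum of area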

“Ah, so Hawking’s ABH = 4πrs² and SBH = ABH/4 formulation, with rs measured in Planck-lengths, just counts the number of area-quanta on the event horizon’s surface.”

“Exactly, Jennie.  If there really is a least possible area, which a lot of physicists doubt, and if its size doesn’t happen to equal lP², then the black hole entropy gets recalculated to match.”
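[To put a number on that counting, here’s a short Python sketch for a one-solar-mass black hole, chosen purely as an example.  It computes the Schwarzschild radius rs = 2GM/c², the horizon area ABH = 4πrs², and the Bekenstein-Hawking entropy SBH = ABH/(4lP²) in units of Boltzmann’s constant.]

from math import pi

hbar = 1.054571817e-34   # J*s
G    = 6.67430e-11       # m^3 kg^-1 s^-2
c    = 2.99792458e8      # m/s
M    = 1.989e30          # kg, roughly one solar mass (example choice)

l_P2 = hbar * G / c**3           # Planck area, ~2.61e-70 m^2
r_s  = 2 * G * M / c**2          # Schwarzschild radius
A_BH = 4 * pi * r_s**2           # event horizon area
S_BH = A_BH / (4 * l_P2)         # entropy in units of k_B: a quarter of the area in Planck units

print(f"r_s  = {r_s:.3e} m")     # ~2.95e+03 m, about 3 km
print(f"A_BH = {A_BH:.3e} m^2")  # ~1.10e+08 m^2
print(f"S_BH = {S_BH:.2e} k_B")  # ~1.05e+77 k_B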

“So what’s wrong with cubic those-things?”

“Nothing, Vinnie, except that volumes measured in lP³ don’t apply to a black hole because the interior’s really four-dimensional with time scrambled into the distance formulas.  Besides, Hawking proved that the entropy varies with half-diameter squared, not half-diameter cubed.”

“But you could still measure your hydrogen sphere with them and that’d get rid of that 10³³ discrepancy between the two entropies.”

“Not really, Vinnie.  Old Reliable calculated solid hydrogen’s entropy for a certain mass, not a volume.”

“Hawking can make his arbitrary choice, Sy, he’s Hawking, but that doesn’t let the chemists off the scaffold.  How did they get away with arbitrarily defining a zero for entropy?”

“Because it worked, Jennie.  They were only concerned with changes — the difference between a system’s state at the end of a process, versus its state at the beginning.  It was only the entropy difference that counted, not its absolute value.”

“Hey, like altitude differences in potential energy.”

“Absolutely, Vinnie, and that’ll be important when we get to Jeremy’s question.  So, Jennie, if you’re only interested in chemical reactions and if it’s still in the 19th Century and the world doesn’t know about isotopes yet, is there a problem with defining zero entropy to be at a convenient set of conditions?”

“Well, but Vinnie’s Second Law says you can never get down to absolute zero so that’s not convenient.”

“Good point, but the Ideal Gas Law and other tools let scientists extrapolate experimentally measured properties down to extremely low temperatures.  In fact, the very notion of absolute zero temperature came from experiments where the volume of a hydrogen or helium gas sample appears to decrease linearly towards zero at that temperature, at least until the sample condenses to a liquid.  With properly calibrated thermometers, physical chemists knocked themselves out measuring heat capacities and entropies at different temperatures for every substance they could lay hands on.”
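[Here’s that extrapolation in a few lines of Python.  The volume readings are idealized, made-up values that track the ideal-gas trend, not anyone’s lab data; the point is only that a straight-line fit through ordinary-temperature measurements points at roughly -273 °C.]

# Idealized volume readings (liters) for a fixed gas sample at several Celsius temperatures
data = [(0.0, 22.41), (25.0, 24.46), (50.0, 26.52), (100.0, 30.62)]

# Ordinary least-squares fit of V = a + b*T
n      = len(data)
sum_t  = sum(t for t, _ in data)
sum_v  = sum(v for _, v in data)
sum_tt = sum(t * t for t, _ in data)
sum_tv = sum(t * v for t, v in data)
b = (n * sum_tv - sum_t * sum_v) / (n * sum_tt - sum_t ** 2)
a = (sum_v - b * sum_t) / n

print(f"V extrapolates to zero at T = {-a / b:.0f} deg C")   # ~ -273 deg C, absolute zero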

“What about isotopes, Mr Moire?  Isn’t chlorine’s atomic weight something-and-a-half so there’s gotta be several kinds of chlorine atoms so any sample you’ve got is a mixture and that’s random and that has to have a non-zero entropy even at absolute zero.”

“It’s 35.45, two stable isotopes, Jeremy, but we know how to account for entropy of mixing and anyway, the isotope mix rarely changes in chemical processes.”
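[The accounting Sy mentions is the ideal entropy of mixing, Smix = -R·Σ xi·ln xi.  A couple of lines of Python with chlorine’s two stable isotopes at their standard abundances gives the per-mole figure.]

from math import log

R = 8.314  # gas constant, J/(mol*K)

# Natural chlorine's two stable isotopes and their standard abundances
x = {"Cl-35": 0.7576, "Cl-37": 0.2424}

# Ideal entropy of mixing per mole of chlorine atoms
S_mix = -R * sum(xi * log(xi) for xi in x.values())
print(f"S_mix = {S_mix:.2f} J/(mol*K)")   # ~4.6 J/(mol*K), independent of temperature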

“But my apples and orange peels, Sy — what does the entropy elephant do about them?”

~~ Rich Olcott