A Beetled Brow

Vinnie’s brow was wrinkling so hard I could hear it over the phone. “Boltzmann, Boltzmann, where’d I hear that name before? … Got it! That’s one of those constants, ain’t it, Sy? Molecules or temperature or something?”

“The second one, Vinnie. Avogadro was the molecule counter. Good memory. Come to think of it, both Boltzmann and Avogadro bridged gaps that Loschmidt worked on.”

“Loschmidt’s thing was the paradox, right, between Newton saying events can back up and thermodynamics saying no, they can’t. You said Boltzmann’s Statistical Mechanics solved that, but I’m still not clear how.”

“Let me think of an example. … Ah, you’ve got those rose bushes in front of your place. I’ll bet you’ve also put up a Japanese beetle trap to protect them.”

“Absolutely. Those bugs would demolish my flowers. The trap’s lure draws them away to my back yard. Most of them stay there ’cause they fall into the trap’s bag and can’t get out.”

“Glad it works so well for you. OK, Newton would look at individual beetles. He’d see right off that they fly mostly in straight lines. He’d measure the force of the wind and write down an equation for how the wind affects a beetle’s flight path. If the wind suddenly blew in the opposite direction, that’d be like the clock running backwards. His same equation would predict the beetle’s new flight path under the changed conditions. You with me?”

“Yeah, no problem.”

“Boltzmann would look at the whole swarm. He’d start by evaluating the average point‑to‑point beetle flight, which he’d call ‘mean free path.’ He’d probably focus on the flight speed and in‑the‑air time fraction. With those, if you tell him how many beetles you’ve got he could generate predictions like inter‑beetle separation and how long it’d take an incoming batch of beetles to cross your yard. However, predicting where a specific beetle will land next? Can’t do that.”
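Boltzmann-style swarm numbers are simple arithmetic once you grant the inputs. Here's a toy sketch in which every value is invented; only the relationships (number density gives spacing, width over effective speed gives crossing time) follow Sy's description:

```python
# Toy Boltzmann-style swarm estimates.  Every number is hypothetical;
# the point is how density and speed alone yield swarm-level predictions.
beetles = 200
yard_volume = 500.0            # m^3
n = beetles / yard_volume      # beetle number density, per m^3
separation = n ** (-1 / 3)     # typical inter-beetle spacing, m

speed = 3.0                    # m/s flight speed
airtime_fraction = 0.4         # fraction of time spent in the air
yard_width = 20.0              # m
crossing_time = yard_width / (speed * airtime_fraction)

print(f"mean separation: {separation:.2f} m")
print(f"yard-crossing time: {crossing_time:.1f} s")
```

Note what's missing, just as Sy says: nothing in these averages predicts where any single beetle lands next.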

“Who cares about one beetle?”

“Well, another beetle might. …
Just thought of a way that Statistical Mechanics could actually be useful in this application. Once Boltzmann has his numbers for an untreated area, you could put in a series of checkpoints with different lures. Then he could develop efficiency parameters just by watching the beetle flying patterns. No need to empty traps. Anyhow, you get the idea.”

Japanese Beetle, photo by David Cappaert, Bugwood.org
under Creative Commons BY 3.0

“Hey, I feel good emptying that trap, I’m like standing up for my roses. Anyway, so how does Avogadro play into this?”

“Indirectly, and he was half a century earlier. In 1805 Gay‑Lussac showed that if you keep the pressure and temperature constant, it takes two volumes of hydrogen to react with one volume of oxygen to produce two volumes of water vapor. Better, the whole‑number‑ratio rule seemed to hold generally. Avogadro concluded that the only way Gay‑Lussac’s rule could be general is if at any temperature and pressure, equal volumes of every kind of gas held the same number of molecules. He didn’t know what that number was, though.”

“HAW! Avogadro’s number wasn’t a number yet.”

“Yeah, it took a while to figure out. Then in 1865, Loschmidt and a couple of others started asking, ‘How big is a gas molecule?’ Some gases can be compressed to the liquid state. The liquids have a definite volume, so the scientists knew molecules couldn’t be infinitely small. Loschmidt put numbers to it. Visualize a huge box of beetles flying around, bumping into each other. Each beetle, or molecule, ‘occupies’ a cylinder one beetle wide and the length of its mean free path between collisions. So you’ve got three volumes — the beetles, the total of all the cylinders, and the much larger box. Loschmidt used ratios between the volumes, plus density data, to conclude that air molecules are about a nanometer wide. Good within a factor of three. As a side result he calculated the number of gas molecules per unit volume at any temperature and pressure. That’s now called Loschmidt’s Number. If you know the molecular weight of the gas, then arithmetic gives you Avogadro’s number.”
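Loschmidt's end result is easy to reproduce with modern constants. A minimal sketch, assuming the modern ideal‑gas shortcut n = p/kT rather than his original cylinder‑and‑volume ratios:

```python
# Molecules per unit volume at standard temperature and pressure
# (Loschmidt's Number), then Avogadro's number by "arithmetic".
k = 1.380649e-23        # Boltzmann constant, J/K
p = 101325.0            # standard pressure, Pa
T = 273.15              # standard temperature, K

n0 = p / (k * T)        # Loschmidt's Number, molecules per m^3
V_molar = 22.414e-3     # molar volume of an ideal gas at STP, m^3/mol
N_A = n0 * V_molar      # Avogadro's number, per mol

print(f"Loschmidt's Number: {n0:.3e} per m^3")
print(f"Avogadro's number:  {N_A:.3e} per mol")
```

The two constants Sy mentions really are one multiplication apart: about 2.687×10²⁵ molecules per cubic meter, and about 6.022×10²³ per mole.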

“Thinking about a big box of flying, rose‑eating beetles creeps me out.”

  • Thanks to Oriole Hart for the story‑line suggestion.

~~ Rich Olcott

Bridging A Paradox

<chirp, chirp> “Moire here.”

“Hi, Sy. Vinnie. Hey, I’ve been reading through some of your old stuff—”

“That bored, eh?”

“You know it. Anyhow, something just don’t jibe, ya know?”

“I’m not surprised but I don’t know. Tell me about it.”

“OK, let’s start with your Einstein’s Bubble piece. You got this electron goes up‑and‑down in some other galaxy and sends out a photon and it hits my eye and an atom in there absorbs it and I see the speck of light, right?”

“That’s about the size of it. What’s the problem?”

“I ain’t done yet. OK, the photon can’t give away any energy on the way here ’cause it’s quantum and quantum energy comes in packages. And when it hits my eye I get the whole package, right?”

“Yes, and?”

“And so there’s no energy loss and that means 100% efficient and I thought thermodynamics says you can’t do that.”

“Ah, good point. You’ve just described one version of Loschmidt’s Paradox. A lot of ink has gone into the conflict between quantum mechanics and relativity theory, but Herr Johann Loschmidt found a fundamental conflict between Newtonian mechanics, which is fundamental, and thermodynamics, which is also fundamental. He wasn’t talking photons, of course — it’d be another quarter-century before Planck and Einstein came up with that notion — but his challenge stood on your central issue.”

“Goody for me, so what’s the central issue?”

“Whether or not things can run in reverse. A pendulum that swings from A to B also swings from B to A. Planets go around our Sun counterclockwise, but Newton’s math would be just as accurate if they went clockwise. In all his equations and everything derived from them, you can replace +t with ‑t to make time run backwards and everything looks dandy. That even carries over to quantum mechanics — an excited atom relaxes by emitting a photon that eventually excites another atom, but then the second atom can play the same game by tossing a photon back the other way. That works because photons don’t dissipate their energy.”

“I get your point, Newton-style physics likes things that can back up. So what’s Loschmidt’s beef?”

“Ever see a fire unburn? Down at the microscopic level where atoms and photons live, processes run backwards all the time. Melting and freezing and chemical equilibria depend upon that. Things are different up at the macroscopic level, though — once heat energy gets out or randomness creeps in, processes can’t undo by themselves as Newton would like. That’s why Loschmidt stood the Laws of Thermodynamics up against Newton’s Laws. The paradox isn’t Newton’s fault — the very idea of energy was just being invented in his time and of course atoms and molecules and randomness were still centuries away.”

“Micro, macro, who cares about the difference?”

“The difference is that the micro level is usually a lot simpler than the macro level. We can often use measured or calculated micro‑level properties to predict macro‑level properties. Boltzmann started a whole branch of Physics, Statistical Mechanics, devoted to carrying out that strategy. For instance, if we know enough about what happens when two gas molecules collide we can predict the speed of sound through the gas. Our solid‑state devices depend on macro‑level electric and optical phenomena that depend on micro‑level electron‑atom interactions.”

“Statistical?”

“As in, ‘we don’t know exactly how it’ll go but we can figure the odds…’ Suppose we’re looking at air molecules and the micro process is a molecule moving. It could go left, right, up, down, towards or away from you like the six sides of a die. Once it’s gone left, what are the odds it’ll reverse course?”

“About 16%, like rolling a die to get a one.”

“You know your odds. Now roll that die again. What’s the odds of snake‑eyes?”

“16% of 16%, that’s like 3 outa 100.”

“There’s a kajillion molecules in the room. Roll the die a kajillion times. What are the odds all the air goes to one wall?”

“So close to zero it ain’t gonna happen.”

“And Boltzmann’s Statistical Mechanics explained why not.”

“Knowing about one molecule predicts a kajillion. Pretty good.”
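Sy's die‑rolling argument scales cleanly if you work in logarithms, since (1/6)ᴺ underflows ordinary floating point long before N reaches a kajillion. A quick sketch (the molecule counts are, of course, schematic):

```python
import math

# Each molecule independently "rolls a one" (heads for the same wall)
# with probability 1/6.  Work in log10 so a huge N doesn't underflow.
def log10_odds_all_same(n_molecules):
    """log10 of the chance that all n molecules move the same way."""
    return n_molecules * math.log10(1 / 6)

for n in (1, 2, 10 ** 23):
    print(f"n = {n:.0e}: probability = 10^({log10_odds_all_same(n):.3g})")
```

For n = 2 this gives about 0.028, Vinnie's “3 outa 100”; for n around 10²³ the exponent is roughly −7.8×10²², which is his “so close to zero it ain’t gonna happen.”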

San Francisco’s Golden Gate Bridge, looking South
Photo by Rich Niewiroski Jr. / CC BY 2.5

~~ Rich Olcott

Free Energy, or Not

From: Richard Feder <rmfeder@fortleenj.com>
To: Sy Moire <sy@moirestudies.com>
Subj: Questions

What’s this about “free energy”? Is that energy that’s free to move around anywhere? Or maybe the vacuum energy that this guy said is in the vacuum of space that will transform the earth into a wonderful world of everything for free for everybody forever once we figure out how to handle the force fields and pull energy out of them?


From: Sy Moire <sy@moirestudies.com>
To: Richard Feder <rmfeder@fortleenj.com>

Subj: Re: Questions

Well, Mr Feder, as usual you have a lot of questions all rolled up together. I’ll try to take one at a time.

It’s clear you already know that to make something happen you need energy. Not a very substantial definition, but then energy is an abstract thing it took humanity a couple of hundred years to get our minds around and we’re still learning.

Physics has several more formal definitions for “energy,” all clustered around the ability to exert force to move something and/or heat something up. The “and/or” is the kicker, because it turns out you can’t do just the moving. As one statement of the Second Law of Thermodynamics puts it, “There are no perfectly efficient processes.”

For example, when your car’s engine burns a few drops of gasoline in the cylinder, the liquid becomes a 22000‑times larger volume of hot gas that pushes the piston down in its power stroke to move the car forward. In the process, though, the engine heats up (wasted energy), gases exiting the cylinder are much hotter than air temperature (more wasted energy) and there’s friction‑generated heat all through the drive train (even more waste). Improving the drive train’s lubrication can reduce friction, but there’s no way to stop energy loss into heated-up combustion product molecules.

Two hundred years of effort haven’t uncovered a usable loophole in the Second Law. However, we have been able to quantify it. Especially for practically important chemical reactions, like burning gasoline, scientists can calculate how much energy the reaction product molecules will retain as heat. The energy available to do work is what’s left.

For historical reasons, the “available to do work” part is called “free energy.” Not free like running about like ball lightning, but free in the sense of not being bound up in jiggling heated‑up molecules.
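The “available to do work” bookkeeping is the familiar ΔG = ΔH − TΔS. A sketch with made‑up round numbers, not real gasoline data:

```python
# Free energy = total reaction energy minus the part bound up in
# jiggling heated-up molecules (T * dS).  All values are hypothetical.
dH = -100.0        # kJ/mol released by an imaginary combustion
T = 298.15         # K, room temperature
dS = -0.050        # kJ/(mol K), reaction entropy change

dG = dH - T * dS   # kJ/mol; the "free" part, available to do work
print(f"of {-dH:.0f} kJ/mol released, {-dG:.1f} kJ/mol is free energy")
```

With these invented values, about 15 of every 100 kilojoules stays bound up as molecular heat no matter how good the lubrication, which is the Second Law's toll.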

Vacuum energy is just the opposite of free — it’s bound up in the structure of space itself. We’ve known for a century that atoms waggle back and forth within their molecules. Those vibrations give rise to the infrared spectra we use for remote temperature sensing and for studying planetary atmospheres. One of the basic results of quantum mechanics is that there’s a minimum amount of motion, called zero‑point vibration, that would persist even if the molecule were frozen to absolute zero temperature.

There are other kinds of zero‑point motion. We know of two phenomena, the Casimir effect and the Lamb shift, that can be explained by assuming that the electric field and other force fields “vibrate” at the ultramicroscopic scale even in the absence of matter. Not vibrations like going up and down, but like getting more and less intense. It’s possible that the same “vibrations” spark radioactive decay and some kinds of light emission.

Visualize space being marked off with a mesh of cubes. In each cube one or more fields more‑or‑less periodically intensify and then relax. The variation strength and timing are unpredictable. Neighboring cubes may or may not sync up and that’s unpredictable, too.

The activity is all governed by yet another Heisenberg Uncertainty Principle trade‑off. The stronger the intensification, the less certain we can be about when or where the next one will happen.

What we can say is that whether you look at a large volume of space (even an atom is ultramicroscopically huge) or a long period of time (a second might as well be a millennium), on the average the intensity is zero. All our energy‑using techniques involve channeling energy from a high‑potential source to a low‑potential sink. Vacuum energy sources are everywhere but so are the sinks and they all flit around. Catching lightning in a jar was easy by comparison.

Regards,
Sy Moire.

~~ Rich Olcott

Sisyphus on A Sand Dune

I’m walking the park’s paths on a lovely early Spring day when, “There you are, Moire. I got a question!”

“As you always do, Mr Feder. What’s your question this time?”

“OK, this guy’s saying that life is all about fighting entropy but entropy always increases anyway. I seen nothing in the news about us fighting entropy so where’s he get that? Why even bother if we’re gonna lose anyway? Where’s it coming from? Can we plug the holes?”

“That’s 4½ questions with a lot of other stuff hiding behind them. You’re going to owe me pizza at Eddie’s AND a double-dip gelato.”

“You drive a hard bargain, Moire, but you’re on.”

“Deal. Let’s start by clearing away some underbrush. You seem to have the idea that entropy’s a thing, like water, that it flows around and somehow seeps into our Universe. None of that’s true.”

“That makes no sense. How can what we’ve got here increase if it doesn’t come from somewhere?”

“Ah, I see the problem — conservation. Physicists say there are two kinds of quantities in the Universe — conserved and non‑conserved. The number of cards in a deck is a conserved quantity because it’s always 52, right?”

“Unless you’re in a game with Eddie.”

“You’ve learned that lesson, too, eh? With Eddie the system’s not closed because he occasionally adds or removes a card. Unless we catch him at it and that’s when the shouting starts. So — cards are non-conserved if Eddie’s in the game. Anyway, energy’s a conserved quantity. We can change energy from one form to another but we can’t create or extinguish energy, OK?”

“I heard about that. Sure would be nice if we could, though — electricity outta nothing would save the planet.”

“It would certainly help, and so would making discarded plastic just disappear. Unfortunately, mass is another conserved quantity unless you’re doing subatomic stuff. Physicists have searched for other conserved quantities because they make calculations simpler. Momentum’s one, if you’re careful how you define it. There’s about a dozen more. The mass of water coming out of a pipe exactly matches the mass that went in.”

“What if the pipe leaks?”

“Doesn’t matter where the water comes out. If you measure the leaked mass and the mass at the pipe’s designed exit point the total outflow equals the inflow. But that gets me to the next bit of underbrush. Energy’s conserved, that’s one of our bedrock rules, but energy always leaks and that’s another bedrock rule. The same rule also says that matter always breaks into smaller pieces if you give it a chance though that’s harder to calculate. We measure both leakages as entropy. Wherever you look, any process that converts energy or matter from one form to another diverts some fraction into bits of matter in random motion and that’s an increase of entropy. One kind of entropy, anyway.”

“Fine, but what’s all this got to do with life?”

“It’s all to get us to where we can talk about entropy in context. You’re alive, right?”

“Last I looked.”

“Ever break a bone?”

<taps his arm> “Sure, hasn’t everybody one time or another?”

“Healed up pretty well, I see. Congratulations. Right after the break that arm could have gone in lots of directions it’s not supposed to — a high entropy situation. So you wore a cast while your bone cells worked hard to knit you together again and lower that entropy. Meanwhile, the rest of your body kept those cells supplied with energy and swept away waste products. You see my point?”

“So what you’re saying is that mending a broken part uses up energy and creates entropy somewhere even though the broken part is less random. I got that.”

“Oh, it goes deeper than that. If you could tag one molecule inside a living cell you’d see it bouncing all over the place until it happens to move where something grabs it to do something useful. Entropy pushes towards chaos, but the cell’s pattern of organized activity keeps chaos in check. Like picnicking on a windy day — only constant vigilance maintains order. That’s the battle.”

“Hey, lookit, Eddie’s ain’t open. I’ll owe you.”

“Pizza AND double-dip gelato.”

~~ Rich Olcott

At The Old Curiosity Shop

An imposing knock at the door, both impetuous and imperious.  I figured it for an Internet denizen.  “C’mon in, the door’s open.”

“You’re Moire?”

“I am.  And you are..?”

“The name’s Feder, Richard Feder, from Fort Lee, NJ.  I’m a stand-in for some of your commenters.”

“Ah, the post of business past.  You have a question?”

“Yeah.  How come hot water can freeze faster than cold water?”

“That’s really two questions. The first is, ‘Can hot water freeze faster than cold water?’ and the second is, ‘How come?’  To the surprise of a lot of physicists, the experimental answer to the first question is, ‘Yes, sometimes.’  But it’s only sometimes and even that depends on how you define freeze.”

“What’s to define?  Frozen is frozen.”

“Not so fast.  Are we talking surface ice formation, or complete solidification, or maybe just descent to freezing temperature?  Three very different processes.  There’s multiple reports of anomalous behavior for each one, but many of the reports have been contested by other researchers.  Lots of explanations, too.  The situation reminds me of Anne’s Elephant.”

“Why an elephant?  And who’s Anne?”

“Remember the old story about the blind men trying to figure out an elephant?  The guy touching its trunk said it’s a snake, the one at its side said it’s a wall, the dude at its leg said it’s a tree, and so on?  The descriptions differed because each observer had limited knowledge of something complicated.  This chilled-water issue is like that — irreproducible experiments because of uncontrolled unknown variables, mostly maybes on the theory side because we’re still far from a fundamental understanding.”

“Who’s Anne?”

“Anne is … an experience.  I showed her how the notion of Entropy depends on how you look at it.  Scientists have looked at this paradoxical cooling effect pretty much every way you can think of, trying to rule out various hypotheses.  Different teams have both found and not found the anomaly working with distilled water and with tap water, large amounts and small, in the open air and in sealed containers, in glass or metal containers, with and without stirring, with various pre-washing regimens or none, using a variety of initial and final temperatures.  They’ve clocked the first appearance of surface ice and complete opacity of the bulk.  They’ve tracked temperature’s trajectory in the middle of the container or near its wall… you name it.  My favorite observation was the 20th Century’s first-published one — in 1963 Erasto Mpemba noticed the effect while preparing ice cream.”

“What flavor?  Never mind.  Is there a verdict?”

“Vaguely.  Once you get approximately the right conditions, whether or not you see the effect seems to be a matter of chance.  The more sophisticated researchers have done trials in the hundreds and then reported percentages, rather than just ‘we see it’ or not.  Which in itself is interesting.”

“How’s that?”

“Well, to begin with, the percents aren’t zero.  That answers your first question — warm water sometimes does freeze faster than cold.  Better yet, the variability tells us that the answer to your second question is at the nanoscopic level.  Macroscopic processes, even chemical ones, have statistics that go the same way all the time.  Put a lit match to gasoline in air, you’ll always get a fire.  But if you set out 100 teaspoons of water under certain conditions and 37 of them freeze and the others don’t, something very unusual must be going on that starts with just a few molecules out of the 10²³ in those teaspoons.”

“Weird odds.”

“This experiment’s even more interesting.  You’ve got two bottles of water.  You heat up bottle A and let it cool to room temperature.  B’s been at room temperature all along.  You put ’em both in the fridge and track their temperatures.  A cools quicker.”

“That’s where I came in.”

“Both start at the same temperature, finish at the same temperature, and their Joules-per-second energy-shedding rates should be the same.  A cools in less time so A releases less heat.  Entropy change is released heat energy divided by temperature.  Somehow, bottle A went into the fridge with less entropy than B had.  Why?  We don’t really know.”
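Sy's bookkeeping for the two bottles comes down to ΔS = Q/T with Q = rate × time. A toy version with invented numbers, assuming (as the dialogue does) equal energy‑shedding rates:

```python
# Both bottles shed heat at the same assumed rate; A reaches fridge
# temperature sooner, so it releases less heat and thus less entropy.
rate = 2.0                  # J/s, assumed equal energy-shedding rate
t_A, t_B = 900.0, 1200.0    # s, hypothetical times to reach fridge temp
T_avg = 288.0               # K, rough average temperature while cooling

Q_A, Q_B = rate * t_A, rate * t_B       # heat released, J
dS_A, dS_B = Q_A / T_avg, Q_B / T_avg   # entropy shed, J/K

print(f"bottle A shed {dS_A:.2f} J/K, bottle B shed {dS_B:.2f} J/K")
```

The arithmetic only restates the puzzle: if A sheds less entropy on the way down, it must have carried less entropy into the fridge, and the why of that is the open question.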

~~ Rich Olcott

  • Thanks to Ilias Tirovolas, whose paper inspired this post.

Meanwhile, back at the office

Closing time.  Anne and I stroll from Al’s coffee shop back to the Acme Building.  It’s a clear night with at least 4,500 stars, but Anne’s looking at the velvet black between them.

“What you said, Sy, about the Universe not obeying Conservation of Energy — tell me more about that.”

“Aaa-hmmm … OK.  You’ve heard about the Universe expanding, right?”

“Ye-es, but I don’t know why that happens.”

“Neither do the scientists, but there’s pretty firm evidence that it’s happening, if only at the longest scales.  Stars within galaxies get closer together as they radiate away their gravitational energy.  But the galaxies themselves are getting further apart, as far out as we can measure.”

“What’s that got to do with Conservation of Energy?”

“Well, galaxies have mass so they should be drawn together by gravity the way that gravity pulls stars together inside galaxies.  But that’s not what’s happening.  Something’s actively pushing galaxies or galaxy clusters away from each other.  Giving the something a name like ‘dark energy’ is just an accounting gimmick to pretend the First Law is still in effect at very large distances — we don’t know the energy source for the pushing, or even if there is one.  There’s a separate set of observations we attribute to a ‘dark energy’ that may or may not have the same underlying cause.  That’s what I was talking about.”

We’re at the Acme Building.  I flash my badge to get us past Security and into the elevator.  As I reach out to press the ’12’ button she puts her hand on my arm.  “Sy, I want to see if I understand this entropy-elephant thing.  You said entropy started as an accounting gimmick, to help engineers keep track of fuel energy escaping into the surroundings.  Energy absorbed per degree of temperature rise they called the environment’s heat capacity.  Heat energy absorbed at a given temperature, divided by that temperature, they called change in entropy.”

The elevator lets us out on my floor and we walk to door 1217.  “You’ve got it right so far, Anne.  Then what?”

“Then the chemists realized that you can predict how lots of systems will work from only knowing a certain set of properties for the beginning and end states.  Pressure, volume, chemical composition, whatever, but also entropy.  But except for simple gases they couldn’t predict heat capacity or entropy, only measure it.”

My key lets us in.  She leans back against the door frame.  “That’s where your physicists come in, Sy.  They learned that heat in a substance is actually the kinetic energy of its molecules.  Gas molecules can move around, but that motion’s constrained in liquids and even more constrained in solids.  Going from solid to liquid and from liquid to gas absorbs heat energy in breaking those constraints.  That absorbed heat appears as increased entropy.”

She’s lounging against my filing cabinet.  “The other way that substances absorb heat is for parts of molecules to rotate and vibrate relative to other parts.  But there are levels.  Some vibrations excite easier than others, and many rotations are even easier.  In a cold material only some motions are active.  Rising temperature puts more kinds of motion into play.  Heat energy spreads across more and more sub-molecular absorbers.”

She’s perched on the edge of my desk.  “Here’s where entropy as possibility-counting shows up.  More heat, more possibilities, more entropy.  Now we can do arithmetic and prediction instead of measuring.  Anything you can count possibilities for you can think about defining an entropy for, like information bits or black holes or socks.  But it’ll be a different entropy, with its own rules and its own range of validity.  … And…”

She’s looming directly over me.  Her dark eyes are huge.

“And…?”

“When we first met, Sy, you asked what you could do for me.  You’ve helped me see that when I travel across time and probability I’m riding the Entropy Elephant.  I’d like to show my appreciation.  Can you think of a possibility?”

A dark night, in a city that knows how to keep its secrets.  On the 12th floor of the Acme Building, one man still tries to answer the Universe’s persistent questions — Sy Moire, Physics Eye.

~~ Rich Olcott

Thoughts of Chair-man Moire

“My apples and orange peels question, Sy, isn’t that the same as Jeremy’s?  What’s the connection between heat capacity and counting?”

“You’re right, Anne.  Hmm.  Say, Al, all your coffee shop tables came with four chairs apiece, right?”

“Yup, four-tops every one, even in the back room.”

“You neaten them all up, four to a table, in the morning?”

“The night before.  There’s never time in the morning, customers demand coffee first thing.”

“But look, we’ve got six people seated at this table.  Where’d the extra chairs come from?”

“Other tables, of course.  Is this going somewhere?”

“Almost there.  So in fact the state of the room at any time will have some random distribution of chairs to tables.  You know on the average there’ll be four at a table, but you don’t know the actual distribution until you look, right?”

“Hey, we’re counting again.  You’re gonna say that’s about entropy ’cause the difference between four at a table and some other number is all random and there’s some formula to calculate entropy from that.”

“True, Vinnie, but we’re about to take the next step.  How did these chairs wind up around this table?”

“We pulled them over, Mr. Moire.”

“My point is, Jeremy, we spent energy to get them here.  The more chairs that are out of position — ”

“The higher the entropy, but also the more energy went into the chairs.  It’s like that heat capacity thing we started with, the energy that got absorbed rather than driving the steam engine.”

“Awright, Anne!” from Jeremy <Jennie bristles a bit>, “and if all the chairs are in Al’s overnight position it’s like absolute zero.  Hey, temperature is average kinetic energy per particle so can we say that the more often a chair gets moved it’s like hotter?”

Jennie breaks in.  “Not a bit of it, Jeremy!  The whole metaphor’s daft.  We know temperature change times heat capacity equals the energy absorbed, right, and we’ve got a link between energy absorption and entropy, right, but what about if at the end of the day all the chairs accidentally wind up four at a table?  Entropy change is zero, right, but customers expended energy moving chairs about all day and Al’s got naught to set straight.”

“Science in action, I love it!  Anne and Jeremy, you two just bridged a gap it took Science a century to get across.  Carnot started us on entropy’s trail in 1824 but scientists in those days weren’t aware of matter’s atomic structure.  They knew that stuff can absorb heat but they had no inkling what did the absorbing or how that worked.  Thirty years later they understood simple gases better and figured out that average kinetic energy per particle bit.  But not until the 1920s did we have the quantum mechanics to show how parts of vibrating molecules can absorb heat energy stepwise like a table ‘absorbing’ chairs.  Only then could we do Vinnie’s state-counting to calculate entropies.”

“Yeah, more energy, spread across more steps, hiding more details we don’t know behind an average, more entropy.  But what about Jennie’s point?”

“Science is a stack of interconnected metaphors, Vinnie.  Some are better than others.  The trick is attending to the boundaries where they stop being valid.  Jennie’s absolutely correct that my four-chair argument is only a cartoon for illustrating stepwise energy accumulation.  If Al had a billion tables instead of a dozen or so, the odds on getting everything back to the zero state would disappear into rounding error.”
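Vinnie's state‑counting can be tried at toy scale. Below, a hypothetical 8 chairs split between 2 tables; W counts the chair assignments producing each split, and S = ln W is Boltzmann's formula with k set to 1:

```python
import math

# Count microstates for each chairs-to-tables split, then S = ln W.
N_CHAIRS = 8

def ways(chairs_at_first_table):
    """Number of distinct chair assignments giving this split."""
    return math.comb(N_CHAIRS, chairs_at_first_table)

for split in (4, 6, 8):
    W = ways(split)
    print(f"{split}-{N_CHAIRS - split} split: W = {W}, S = {math.log(W):.2f}")
```

The even 4‑4 split has the most microstates (W = 70), while “all chairs at one table”, the analogue of Al's overnight zero state, has exactly one (S = ln 1 = 0). Scale the table count up toward Al's imagined billion and the even splits dominate so overwhelmingly that Jennie's accidental end‑of‑day reset effectively never happens.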

“How does black hole entropy play into this, Sy?”

“Not very well, actually.  Oh, sure, the two systems have similar structures.  They’ve each got three inter-related central quantities constrained by three laws.  Here, I’ve charted them out on Old Reliable.”

“OK, their Second and Third Laws look pretty much the same, but their First Laws don’t match up.”

“Right, Al.  And even Bekenstein pointed out inconsistencies between classic thermodynamic temperature and what’s come to be called Hawking temperature.  Hawking didn’t agree.  The theoreticians are still arguing.  Here’s a funny one — if you dig deep enough, both versions of the First Law are the same, but the Universe doesn’t obey it.”

“That’s it, closing time.  Everybody out.”

~~ Rich Olcott

Taming The Elephant

Suddenly they were all on the attack.  Anne got in the first lick.  “C’mon, Sy, you’re comparing apples and orange peel.  Your hydrogen sphere would be on the inside of the black hole’s event horizon, and Jeremy’s virtual particles are on the outside.”

[If you’ve not read my prior post, do that now and this’ll make more sense.  Go ahead, I’ll wait here.]

Jennie’s turn — “Didn’t the chemists define away a whole lot of entropy when they said that pure elements have zero entropy at absolute zero temperature?”

Then Vinnie took a shot.  “If you’re counting maybe-particles per square whatever for the surface, shouldn’t you oughta count maybe-atoms or something per cubic whatever for the sphere?”

Jeremy posed the deepest questions. “But Mr Moire, aren’t those two different definitions for entropy?  What does heat capacity have to do with counting, anyhow?”

Al brought over mugs of coffee and a plate of scones.  “This I gotta hear.”

“Whew, but this is good ’cause we’re getting down to the nub.  First to Jennie’s point — Under the covers, Hawking’s evaluation is just as arbitrary as the chemists’.  Vinnie’s ‘whatever’ is the Planck length, lP = 1.616×10⁻³⁵ meter.  It’s the square root of such a simple combination of fundamental constants that many physicists think that lP² = 2.611×10⁻⁷⁰ m² is the ‘quantum of area.’  But that’s just a convenient assumption with no supporting evidence behind it.”

“Ah, so Hawking’s ABH = 4πrs² and SBH = ABH/4 formulation, with rs measured in Planck lengths, just counts the number of area‑quanta on the event horizon’s surface.”

“Exactly, Jennie.  If there really is a least possible area, which a lot of physicists doubt, and if its size doesn’t happen to equal lP², then the black hole entropy gets recalculated to match.”
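For scale, Jennie's area‑quanta counting can be run for a solar‑mass black hole. A sketch assuming, as the dialogue does, that lP² is the area quantum (constants rounded):

```python
import math

# Schwarzschild radius, horizon area, then S = A / (4 * l_P^2),
# i.e. a quarter of the horizon area measured in Planck areas.
G = 6.674e-11       # gravitational constant, m^3/(kg s^2)
c = 2.998e8         # speed of light, m/s
l_P = 1.616e-35     # Planck length, m
M_sun = 1.989e30    # solar mass, kg

r_s = 2 * G * M_sun / c ** 2      # Schwarzschild radius, m (~3 km)
A = 4 * math.pi * r_s ** 2        # horizon area, m^2
S = A / (4 * l_P ** 2)            # dimensionless entropy (k_B = 1)

print(f"r_s = {r_s:.0f} m, S ≈ {S:.2e}")
```

A horizon barely six kilometers across carries an entropy around 10⁷⁷, which is why the comparison with ordinary matter comes out so lopsided.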

“So what’s wrong with cubic those-things?”

“Nothing, Vinnie, except that volumes measured in lP³ don’t apply to a black hole because the interior’s really four-dimensional with time scrambled into the distance formulas.  Besides, Hawking proved that the entropy varies with half-diameter squared, not half-diameter cubed.”

“But you could still measure your hydrogen sphere with them and that’d get rid of that 10³³ discrepancy between the two entropies.”

“Not really, Vinnie.  Old Reliable calculated solid hydrogen’s entropy for a certain mass, not a volume.”

“Hawking can make his arbitrary choice, Sy, he’s Hawking, but that doesn’t let the chemists off the scaffold.  How did they get away with arbitrarily defining a zero for entropy?”

“Because it worked, Jennie.  They were only concerned with changes — the difference between a system’s state at the end of a process, versus its state at the beginning.  It was only the entropy difference that counted, not its absolute value.”

“Hey, like altitude differences in potential energy.”

“Absolutely, Vinnie, and that’ll be important when we get to Jeremy’s question.  So, Jennie, if you’re only interested in chemical reactions and if it’s still in the 19th Century and the world doesn’t know about isotopes yet, is there a problem with defining zero entropy to be at a convenient set of conditions?”

“Well, but Vinnie’s Second Law says you can never get down to absolute zero so that’s not convenient.”

“Good point, but the Ideal Gas Law and other tools let scientists extrapolate experimentally measured properties down to extremely low temperatures.  In fact, the very notion of absolute zero temperature came from experiments where the volume of a hydrogen or helium gas sample appears to decrease linearly towards zero at that temperature, at least until the sample condenses to a liquid.  With properly calibrated thermometers, physical chemists knocked themselves out measuring heat capacities and entropies at different temperatures for every substance they could lay hands on.”
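That extrapolation is easy to mimic.  A sketch in Python, using idealized Charles’s-law volumes rather than historical measurements: fit volume versus Celsius temperature to a straight line and see where the volume would reach zero.

```python
# Idealized data: volume of 1 mol of ideal gas at 1 atm, in liters
temps_C = [0.0, 25.0, 50.0, 100.0]
V0 = 22.414                                  # molar volume at 0 deg C
volumes = [V0 * (273.15 + t) / 273.15 for t in temps_C]

# Ordinary least-squares line V = a*t + b, computed by hand
n = len(temps_C)
mean_t = sum(temps_C) / n
mean_V = sum(volumes) / n
a = sum((t - mean_t) * (V - mean_V) for t, V in zip(temps_C, volumes)) \
    / sum((t - mean_t)**2 for t in temps_C)
b = mean_V - a * mean_t

# Extrapolate: the temperature where the fitted volume hits zero
t_absolute_zero = -b / a
print(f"extrapolated absolute zero: {t_absolute_zero:.2f} deg C")
```

With perfectly linear data the intercept lands at −273.15 °C, the experimenters’ absolute zero.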

“What about isotopes, Mr Moire?  Isn’t chlorine’s atomic weight something-and-a-half so there’s gotta be several kinds of chlorine atoms so any sample you’ve got is a mixture and that’s random and that has to have a non-zero entropy even at absolute zero.”

“It’s 35.4, two stable isotopes, Jeremy, but we know how to account for entropy of mixing and anyway, the isotope mix rarely changes in chemical processes.”

“But my apples and orange peels, Sy — what does the entropy elephant do about them?”

~~ Rich Olcott

The Battle of The Entropies

(the coffee-shop saga continues)  “Wait on, Sy, a black hole is a hollow sphere?”

I hadn’t noticed her arrival but there was Jennie, standing by Vinnie’s table and eyeing Jeremy who was still eyeing Anne in her white satin.

“That’s not quite what I said, Jennie.  Old Reliable’s software and I worked up a hollow-shell model and to my surprise it’s consistent with one of Stephen Hawking’s results.  That’s a long way from saying that’s what a black hole is.”

“But you said some physicists say that.  Have they aught to stand on?”

“Sort of.  It’s a perfect case of ‘depends on where you’re standing.'”

Vinnie looked up.  “It’s frames again, ain’t it?”

“With black holes it’s always frames, Vinnie.  Hey, Jeremy, is a black hole something you could stand on?”

“Nosir, we said the hole’s event horizon is like Earth’s orbit, just a mathematical marker.  Except for the gravity and the three Perils Jennie and you and me talked about, I’d slide right through without feeling anything weird, right?”

“Good memory and just so.  In your frame of reference there’s nothing special about that surface — you wouldn’t experience scale changes in space or time when you encounter it.  In other frames, though, it’s special.  Suppose we’re standing a thousand miles away from a solar-size black hole and Jeremy throws a clock and a yardstick into it.  What would we see?”

“This is where those space compression and time dilation effects happen, innit?”

“You bet, Jennie.  Do you remember the formula?”

“I wrote it in my daybook … Ah, here it is — the Schwarzschild factor, f = 1/√(1 − D/2d).  My notes say D is the black hole’s diameter and d is another object’s distance from its center.  One second in the falling object’s frame would look like f seconds to us.  But one mile would look like 1/f miles.  The event horizon is where d equals the half-diameter and f goes infinite.  The formula only works where the object stays outside the horizon.”
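Jennie’s formula is simple enough to sketch in Python (the function name is mine, not from her daybook):

```python
import math

def schwarzschild_factor(D, d):
    """Jennie's f: one second in the falling frame looks like f seconds
    to a distant observer; one mile looks like 1/f miles.
    D = black hole diameter, d = object's distance from the center.
    Only valid outside the event horizon, i.e. d > D/2."""
    if d <= D / 2:
        raise ValueError("formula only works outside the event horizon")
    return 1.0 / math.sqrt(1.0 - D / (2.0 * d))

# Far from a 6-km-diameter hole, f is essentially 1 -- no dilation
print(schwarzschild_factor(6.0, 1.0e6))
# Approaching the horizon (d -> D/2 = 3 km), f blows up
print(schwarzschild_factor(6.0, 3.003))
```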

“And as your clock approaches the horizon, Jeremy…?”

“You’ll see my clock go slower and slower until it sto —.  Oh.  Oh!  That’s why those physicists think all the infalling mass is at the horizon, the stuff falls towards it forever and never makes it through.”

“Exactly.”

“Hey, waitaminute!  If all that mass never gets inside, how’d the black hole get started in the first place?”

“That’s why it’s only some physicists, Vinnie.  The rest don’t think we understand the formation process well enough to make guesses in public.”

“Wait, that formula’s crazy, Sy.  If something ever does get to where d is less than D/2, then what’s inside the square root becomes negative.  A clock would show imaginary time and a yardstick would go imaginary, too.  What’s that about?”

“Good eye, Anne, but no worries, the derivation of that formula explicitly assumes a weak gravitational field.  That’s not what we’ve got inside or even close to the event horizon.”

“Mmm, OK, but I want to get back to the entropy elephant.  Does black hole entropy have any connection to the other kinds?”

“Structural, mostly.  The numbers certainly don’t play well together.  Here’s an example I ran up recently on Old Reliable.  Say we’ve got a black hole twice the mass of the Sun, and it’s at the Hawking temperature for its mass, 12 billionths of a Kelvin.  Just for grins, let’s say it’s made of solid hydrogen.  Old Reliable calculated two entropies for that thing, one based on classical thermodynamics and the other based on the Bekenstein-Hawking formulation.”

“Wow, Old Reliable looks up stuff and takes care of unit conversions automatically?”

“Slick, eh, Jeremy?  That calculation up top for Schem is classical chemical thermodynamics.  A pure sample of any element at absolute zero temperature is defined to have zero entropy.  Chemical entropy is cumulative heat capacity as the sample warms up.  The Hawking temperature is so close to zero I could treat heat capacity as a constant.

“In the middle section I calculated the object’s surface area in square Planck-lengths lP², and in the bottom section I used Hawking’s formula to convert area to B-H entropy, SBH.  They disagree by a factor of 10³³.”
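The Bekenstein-Hawking half of Sy’s comparison can be redone from scratch.  A minimal Python sketch for the two-solar-mass hole (the chemical-entropy side needs hydrogen heat-capacity data that Old Reliable looked up, so it isn’t reproduced here):

```python
import math

G = 6.67430e-11          # gravitational constant, m^3/(kg s^2)
c = 299792458.0          # speed of light, m/s
hbar = 1.054571817e-34   # reduced Planck constant, J*s
M_sun = 1.989e30         # solar mass, kg

M = 2 * M_sun                 # the dialog's two-solar-mass black hole
r_s = 2 * G * M / c**2        # Schwarzschild radius, ~5.9 km
A = 4 * math.pi * r_s**2      # event-horizon area, m^2
l_P2 = hbar * G / c**3        # Planck area l_P^2, m^2

S_BH = A / (4 * l_P2)         # Bekenstein-Hawking entropy, in units of k_B
print(f"r_s  = {r_s:.0f} m")
print(f"S_BH = {S_BH:.3e} k_B")
```

S_BH comes out around 4.2×10⁷⁷ in units of Boltzmann’s constant, the sort of huge number that dwarfs any chemical entropy.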

A moment of shocked silence, and then…

~~ Rich Olcott

Rockfall

<continued>  The coffee shop crowd had gotten rowdy in response to my sloppy physics, but everyone hushed when I reached for my holster and drew out Old Reliable.  All had heard of it, some had seen it in action — a maxed-out tablet with customized math apps on speed-dial.

“Let’s take this nice and slow.  Suppose we’ve got an uncharged, non-spinning solar-mass black hole.  Inside its event horizon the radius gets weird but let’s pretend we can treat the object like a simple sphere.  The horizon’s half-diameter, we’ll call it the radius, is rs = 2G·M/c², where G is Newton’s gravitational constant, M is the object’s mass and c is the speed of light.  Old Reliable says … about 3 kilometers.  Question is, what happens when we throw a rock in there?  To keep things simple, I’m going to model dropping the rock gentle-like, dead-center and with negligible velocity relative to the hole, OK?”
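Old Reliable’s first number is a one-liner to verify.  A sketch in Python, with standard constant values:

```python
G = 6.67430e-11     # Newton's gravitational constant, m^3/(kg s^2)
c = 299792458.0     # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg

# Horizon "radius" of an uncharged, non-spinning solar-mass hole
r_s = 2 * G * M_sun / c**2
print(f"r_s = {r_s / 1000:.2f} km")   # about 3 km, as Old Reliable says
```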

<crickets>

“Say the rock has the mass of the Earth, almost exactly 3×10⁻⁶ the Sun’s mass.  The gravitational potential energy released when the rock hits the event horizon from far, far away would be E=G·M·m/rs, which works out to be … 2.6874×10⁴¹ joules.  What happens to that energy?”
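Sy’s infall energy checks out, within rounding of the constants used.  A Python sketch:

```python
G = 6.67430e-11      # gravitational constant, m^3/(kg s^2)
c = 299792458.0      # speed of light, m/s
M_sun = 1.989e30     # the black hole: one solar mass, kg
m_earth = 5.972e24   # the rock: one Earth mass, ~3e-6 M_sun

r_s = 2 * G * M_sun / c**2       # horizon radius, m
E = G * M_sun * m_earth / r_s    # energy released falling to the horizon
print(f"E = {E:.4e} J")          # ~2.68e41 J, matching Old Reliable
```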

“rs depends on mass, Mr Moire, so the object will expand.  Won’t that push on what’s around it?”

“You’re thinking it’d act like a spherical piston, Jeremy, pushing out in all directions?”

“Yeah, sorta.”

“After we throw in a rock with mass m, the radius expands from rs to rp=2G·(M+m)/c².  I set m to Earth’s mass and Old Reliable says the new radius is … 3.000009 kilometers.  Granted the event horizon is only an abstract math construct, but suppose it’s a solid membrane like a balloon’s skin.  When it expands by that 9 millimeters, what’s there to push against?  The accretion disk?  Those rings might look solid but they’re probably like Saturn’s rings — a collection of independent chunks of stuff with an occasional gas molecule in-between.  Their chaotic orbits don’t have a hard-edged boundary and wouldn’t notice the 9-millimeter difference.  Inward of the disk you’ve got vacuum.  A piston pushing on vacuum expends zero energy.  With no pressure-volume work getting done that can’t be where the infall energy goes.”
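The 9-millimeter figure follows directly from the radius formula: the growth is 2G·m/c², independent of the hole’s own mass.  A quick sketch:

```python
G = 6.67430e-11      # gravitational constant, m^3/(kg s^2)
c = 299792458.0      # speed of light, m/s
m_earth = 5.972e24   # mass of the thrown-in rock, kg

# Horizon growth: r_p - r_s = 2G(M+m)/c^2 - 2GM/c^2 = 2Gm/c^2,
# so the hole's own mass M cancels out of the difference
dr = 2 * G * m_earth / c**2
print(f"horizon grows by {dr * 1000:.1f} mm")   # ~9 mm
```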

“How about lift-a-weight work against the hole’s own gravity?”

“That’s a possibility, Vinnie.  Some physicists maintain that a black hole’s mass is concentrated in a shell right at the event horizon.  Old Reliable here can figure how much energy it would take to expand the shell that extra 9 millimeters.  Imagine that simple Newtonian physics applies — no relativistic weirdness.  Newton proved that a uniform spherical shell’s gravitational attraction is the same as what you’d get from having the same mass sitting at the shell’s geometric center.  The original shell’s gravitational self-energy was E=G·M²/rs.  Lifting the new mass from rs to rp will cost ΔE = G·(M+m)²/rp − G·M²/rs.  When I plug in the numbers…  That’s interesting.”

Vinnie’s known me long enough to realize “That’s interesting” meant “Whoa, I certainly didn’t expect THAT!”

“So what didja expect and whatcha got?”

“What I expected was that lift-it-up work would also be just a small fraction of the infall energy and the rest would go to heat.  What I got for ΔE here was 2.6874×10⁴¹ joules, exactly 100% of the input.  I wonder what happens if I use a bigger planet.  Gimme a second … OK, let’s plot a range …  How ’bout that, it’s linear!”
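Sy’s surprise checks out algebraically: with rp = 2G·(M+m)/c², the lifting cost collapses to ΔE = mc²/2, which equals the infall energy G·M·m/rs exactly and is linear in the rock’s mass m.  A numerical sketch:

```python
G = 6.67430e-11   # gravitational constant, m^3/(kg s^2)
c = 299792458.0   # speed of light, m/s
M = 1.989e30      # hole: one solar mass, kg
m = 5.972e24      # rock: one Earth mass, kg

r_s = 2 * G * M / c**2           # horizon radius before the rock
r_p = 2 * G * (M + m) / c**2     # horizon radius after the rock

E_infall = G * M * m / r_s                        # energy released falling in
dE_lift = G * (M + m)**2 / r_p - G * M**2 / r_s   # cost of expanding the shell

# Both reduce to m*c^2/2, so the match is exact and linear in m
print(E_infall, dE_lift)
```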

“Alright, show us!”

“All the infall energy goes to move the shell’s combined mass outward to match the expanded size of the event horizon.  I’m amazed that such a simple classical model produces a reasonable result.”

“Like Miss Plenum says, Mr Moire, sometimes the best science comes from surprises.”

“I wouldn’t show it around, Jeremy, except that it’s consistent with Hawking’s quantum-physics result.”

“How’s that?”

“Remember, he showed that a black hole’s temperature varies as 1/M.  We know that temperature is ΔE/ΔS, where the entropy change ΔS varies as M².  We’ve just found that ΔE varies as M.  The ΔE/ΔS ratio varies as M/M² = 1/M, just like Hawking said.”

Then Jennie got into the conversation.

~~ Rich Olcott