Bridging A Paradox

<chirp, chirp> “Moire here.”

“Hi, Sy. Vinnie. Hey, I’ve been reading through some of your old stuff—”

“That bored, eh?”

“You know it. Anyhow, something just don’t jibe, ya know?”

“I’m not surprised but I don’t know. Tell me about it.”

“OK, let’s start with your Einstein’s Bubble piece. You got this electron goes up‑and‑down in some other galaxy and sends out a photon and it hits my eye and an atom in there absorbs it and I see the speck of light, right?”

“That’s about the size of it. What’s the problem?”

“I ain’t done yet. OK, the photon can’t give away any energy on the way here ’cause it’s quantum and quantum energy comes in packages. And when it hits my eye I get the whole package, right?”

“Yes, and?”

“And so there’s no energy loss and that means 100% efficient and I thought thermodynamics says you can’t do that.”

“Ah, good point. You’ve just described one version of Loschmidt’s Paradox. A lot of ink has gone into the conflict between quantum mechanics and relativity theory, but Herr Johann Loschmidt found a fundamental conflict between Newtonian mechanics, which is fundamental, and thermodynamics, which is also fundamental. He wasn’t talking photons, of course — it’d be another quarter-century before Planck and Einstein came up with that notion — but his challenge turned on your central issue.”

“Goody for me, so what’s the central issue?”

“Whether or not things can run in reverse. A pendulum that swings from A to B also swings from B to A. Planets go around our Sun counterclockwise, but Newton’s math would be just as accurate if they went clockwise. In all his equations and everything derived from them, you can replace +t with ‑t to make time run backwards and everything looks dandy. That even carries over to quantum mechanics — an excited atom relaxes by emitting a photon that eventually excites another atom, but then the second atom can play the same game by tossing a photon back the other way. That works because photons don’t dissipate their energy.”

“I get your point, Newton-style physics likes things that can back up. So what’s Loschmidt’s beef?”

“Ever see a fire unburn? Down at the microscopic level where atoms and photons live, processes run backwards all the time. Melting and freezing and chemical equilibria depend upon that. Things are different up at the macroscopic level, though — once heat energy gets out or randomness creeps in, processes can’t undo by themselves as Newton would like. That’s why Loschmidt stood the Laws of Thermodynamics up against Newton’s Laws. The paradox isn’t Newton’s fault — the very idea of energy was just being invented in his time and of course atoms and molecules and randomness were still centuries away.”

“Micro, macro, who cares about the difference?”

“The difference is that the micro level is usually a lot simpler than the macro level. We can often use measured or calculated micro‑level properties to predict macro‑level properties. Boltzmann started a whole branch of Physics, Statistical Mechanics, devoted to carrying out that strategy. For instance, if we know enough about what happens when two gas molecules collide we can predict the speed of sound through the gas. Our solid‑state devices depend on macro‑level electric and optical phenomena that depend on micro‑level electron‑atom interactions.”

“Statistical?”

“As in, ‘we don’t know exactly how it’ll go but we can figure the odds…‘ Suppose we’re looking at air molecules and the micro process is a molecule moving. It could go left, right, up, down, towards or away from you like the six sides of a die. Once it’s gone left, what are the odds it’ll reverse course?”

“About 16%, like rolling a die to get a one.”

“You know your odds. Now roll that die again. What’s the odds of snake‑eyes?”

“16% of 16%, that’s like 3 outa 100.”

“There’s a kajillion molecules in the room. Roll the die a kajillion times. What are the odds all the air goes to one wall?”

“So close to zero it ain’t gonna happen.”

“And Boltzmann’s Statistical Mechanics explained why not.”

“Knowing about one molecule predicts a kajillion. Pretty good.”
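Vinnie’s die-rolling arithmetic scales up readily. A quick sketch in Python — with the “kajillion” scaled down to a mere ten thousand molecules (a real room holds around 10²⁷):

```python
import math

# One molecule reversing course: one face of a six-sided die.
p_reverse = 1 / 6
print(f"one reversal: {p_reverse:.1%}")      # ~16.7%

# Two reversals in a row -- Vinnie's "3 outa 100".
print(f"snake-eyes: {p_reverse**2:.1%}")     # ~2.8%

# All the molecules heading for one wall at once.
n_molecules = 10_000                         # a stand-in kajillion
log10_odds = n_molecules * math.log10(p_reverse)
print(f"log10 of the odds: {log10_odds:.0f}")
# About -7782: a decimal point followed by thousands of zeros.
```

The exponent, not the digits, is the story — for macroscopic molecule counts the reversal odds are “so close to zero it ain’t gonna happen.”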

San Francisco’s Golden Gate Bridge, looking South
Photo by Rich Niewiroski Jr. / CC BY 2.5

~~ Rich Olcott

Free Energy, or Not

From: Richard Feder <rmfeder@fortleenj.com>
To: Sy Moire <sy@moirestudies.com>
Subj: Questions

What’s this about “free energy”? Is that energy that’s free to move around anywhere? Or maybe the vacuum energy that this guy said is in the vacuum of space that will transform the earth into a wonderful world of everything for free for everybody forever once we figure out how to handle the force fields and pull energy out of them?


From: Sy Moire <sy@moirestudies.com>
To: Richard Feder <rmfeder@fortleenj.com>

Subj: Re: Questions

Well, Mr Feder, as usual you have a lot of questions all rolled up together. I’ll try to take one at a time.

It’s clear you already know that to make something happen you need energy. Not a very substantial definition, but then energy is an abstract thing it took humanity a couple of hundred years to get our minds around and we’re still learning.

Physics has several more formal definitions for “energy,” all clustered around the ability to exert force to move something and/or heat something up. The “and/or” is the kicker, because it turns out you can’t do just the moving. As one statement of the Second Law of Thermodynamics puts it, “There are no perfectly efficient processes.”

For example, when your car’s engine burns a few drops of gasoline in the cylinder, the liquid becomes a 22000‑times larger volume of hot gas that pushes the piston down in its power stroke to move the car forward. In the process, though, the engine heats up (wasted energy), gases exiting the cylinder are much hotter than air temperature (more wasted energy) and there’s friction‑generated heat all through the drive train (even more waste). Improving the drive train’s lubrication can reduce friction, but there’s no way to stop energy loss into heated-up combustion product molecules.

Two hundred years of effort haven’t uncovered a usable loophole in the Second Law. However, we have been able to quantify it. Especially for practically important chemical reactions, like burning gasoline, scientists can calculate how much energy the reaction product molecules will retain as heat. The energy available to do work is what’s left.

For historical reasons, the “available to do work” part is called “free energy.” Not free like running about like ball lightning, but free in the sense of not being bound up in jiggling heated‑up molecules.
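The bookkeeping behind that split is the Gibbs relation: free energy is the total reaction energy minus the part bound up as heat. A sketch with illustrative round numbers (assumed placeholders, not measured gasoline values):

```python
# Gibbs bookkeeping: what's free to do work vs. bound up as heat.
# The numbers below are illustrative placeholders, not measured data.
delta_H = -5000.0    # kJ/mol released by the reaction (assumed)
delta_S = 0.10       # kJ/(mol*K) entropy change (assumed)
T = 298.15           # K, room temperature

heat_bound = T * delta_S             # stays as molecular jiggling
delta_G = delta_H - heat_bound       # "free" energy, available for work
print(f"bound as heat: {heat_bound:.0f} kJ/mol")
print(f"free energy:   {delta_G:.0f} kJ/mol")
```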

Vacuum energy is just the opposite of free — it’s bound up in the structure of space itself. We’ve known for a century that atoms waggle back and forth within their molecules. Those vibrations give rise to the infrared spectra we use for remote temperature sensing and for studying planetary atmospheres. One of the basic results of quantum mechanics is that there’s a minimum amount of motion, called zero‑point vibration, that would persist even if the molecule were frozen to absolute zero temperature.

There are other kinds of zero‑point motion. We know of two phenomena, the Casimir effect and the Lamb shift, that can be explained by assuming that the electric field and other force fields “vibrate” at the ultramicroscopic scale even in the absence of matter. Not vibrations like going up and down, but like getting more and less intense. It’s possible that the same “vibrations” spark radioactive decay and some kinds of light emission.

Visualize space being marked off with a mesh of cubes. In each cube one or more fields more‑or‑less periodically intensify and then relax. The variations’ strength and timing are unpredictable. Neighboring cubes may or may not sync up and that’s unpredictable, too.

The activity is all governed by yet another Heisenberg Uncertainty Principle trade‑off. The stronger the intensification, the less certain we can be about when or where the next one will happen.

What we can say is that whether you look at a large volume of space (even an atom is ultramicroscopically huge) or a long period of time (a second might as well be a millennium), on the average the intensity is zero. All our energy‑using techniques involve channeling energy from a high‑potential source to a low‑potential sink. Vacuum energy sources are everywhere but so are the sinks and they all flit around. Catching lightning in a jar was easy by comparison.

Regards,
Sy Moire.

~~ Rich Olcott

The Big Chill

Jeremy gets as far as my office door, then turns back. “Wait, Mr Moire, that was only half my question. OK, I get that when you squeeze on a gas, the outermost molecules pick up kinetic energy from the wall moving in and that heats up the gas because temperature measures average kinetic energy. But what about expansion cooling? Those mist sprayers they set up at the park, they don’t have a moving outer wall but the air around them sure is nice and cool on a hot day.”

“Another classic Jeremy question, so many things packed together — Gas Law, molecular energetics, phase change. One at a time. Gas Law’s not much help, is it?”

“Mmm, guess not. Temperature measures average kinetic energy and the Gas Law equation P·V = n·R·T gives the total kinetic energy for the n amount of gas. Cooling the gas decreases T which should reduce P·V. You can lower the pressure but if the volume expands to compensate you don’t get anywhere. You’ve got to suck energy out of there somehow.”
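Jeremy’s Gas Law bookkeeping in numbers — strictly, kinetic theory puts the total translational kinetic energy at (3/2)·n·R·T, proportional to the P·V product he quotes:

```python
R = 8.314    # J/(mol*K), gas constant
n = 1.0      # mol of gas
T = 293.0    # K, room temperature

PV = n * R * T          # joules, from the Gas Law
KE_trans = 1.5 * PV     # kinetic theory: KE = (3/2) n R T
print(f"P*V = {PV:.0f} J, translational KE = {KE_trans:.0f} J")
# Lowering T shrinks both -- the energy has to go somewhere else.
```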

Illustrations adapted from drawings by Trianna

“The Laws of Thermodynamics say you can’t ‘suck’ heat energy out of anything unless you’ve got a good place to put the heat. The rule is, heat energy travels voluntarily only from warm to cold.”

“But, but, refrigerators and air conditioners do their job! Are they cheating?”

“No, they’re the products of phase change and ingenuity. We need to get down to the molecular level for that. Think back to our helium-filled Mylar balloon, but this time we lower the outside pressure and the plastic moves outward at speed w. Helium atoms hit the membrane at speed v but they’re traveling at only (v-w) when they bounce back into the bulk gas. Each collision reduces the atom’s kinetic energy from ½m·v² down to ½m·(v-w)². Temperature goes down, right?”
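Sy’s receding-membrane bounce can be put in numbers, using the post’s (v−w) rebound speed. The helium mass is a standard handbook value; the wall speed is an assumption:

```python
m = 6.64e-27   # kg, mass of a helium atom (handbook value)
v = 1300.0     # m/s, a typical helium thermal speed
w = 1.0        # m/s, outward wall speed (assumed)

ke_before = 0.5 * m * v**2
ke_after = 0.5 * m * (v - w)**2      # the post's slowed rebound
loss_fraction = 1 - ke_after / ke_before
print(f"KE lost per bounce: {loss_fraction:.3%}")   # ~0.154%
```

Tiny per bounce, but collisions happen constantly, so the gas cools quickly.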

“That’s just the backwards of compression heating. The compression energy came from outside, so I suppose the expansion energy goes to the outside?”

“Well done. So there has to be something outside that can accept that heat energy. By the rules of Thermodynamics, that something has to be colder than the balloon.”

“Seriously? Then how do they get those microdegrees-above-absolute-zero temperatures in the labs? Do they already have an absolute-zero thingy they can dump the heat to?”

“Nope, they get tricky. Suppose a gas in a researcher’s container has a certain temperature. You can work that back to average molecular speed. Would you expect all the molecules to travel at exactly that speed?”

“No, some of them will go faster and some will go slower.”

“Sure. Now suppose the researcher uses laser technology to remove all the fast-moving molecules but leave the slower ones behind. What happens to the average?”

“Goes down, of course. Oh, I see what they did there. Instead of the membrane transmitting the heat away, ejected molecules carry it away.”
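The remove-the-fast-ones trick is easy to simulate. A sketch with a toy speed distribution (not a real Maxwell-Boltzmann; the cutoff is an assumption):

```python
import random

random.seed(42)
# Toy thermal-ish speed distribution, m/s.
speeds = [abs(random.gauss(500, 150)) for _ in range(10_000)]
mean_before = sum(speeds) / len(speeds)

# "Laser" step: eject every molecule faster than the cutoff.
cutoff = 600.0
kept = [s for s in speeds if s < cutoff]
mean_after = sum(kept) / len(kept)

print(f"average before: {mean_before:.0f}, after: {mean_after:.0f}")
# The survivors' average -- hence the temperature -- is lower.
```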

“Yup, and that’s the key to many cooling techniques. Those cooling sprays, for instance, but a question first — which has more kinetic energy, a water droplet or the droplet’s molecules when they’re floating around separately as water vapor?”

“Lessee… the droplet has more mass, wait, the molecules total up to the same mass so that’s not the difference, so it’s droplet velocity squared versus lots of little velocity-squareds … I’ll bet on the droplet.”

“Sorry, trick question. I left out something important — the heat of vaporization. Water molecules hold pretty tight to each other, more tightly in fact than most other molecular substances. You have to give each molecule a kick to get it away from its buddies. That kick comes from other molecules’ kinetic energy, right? Oh, and one more thing — the smaller the droplet, the easier for a molecule to escape.”

“Ah, I see where this is going. The mist sprayer’s teeny droplets evaporate easy. The droplets are at air temperature, so when a molecule breaks free some neighbor’s kinetic energy becomes what you’d expect from air temperature, minus break-free energy. That lowers the average for the nearby air molecules. They slow their neighbors. Everything cools down. So that’s how sprays and refrigerators and such work?”

“That’s the basic principle.”
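That principle, in numbers: water’s heat of vaporization and heat capacity are standard handbook values, and the evaporated fraction is an assumption.

```python
L_vap = 2260.0   # J/g, water's heat of vaporization (handbook value)
c_w = 4.18       # J/(g*K), liquid water's specific heat

mass = 1.0           # g droplet, at air temperature
evaporated = 0.01    # g escapes as vapor (assumed 1%)

heat_removed = evaporated * L_vap     # the "break-free" kicks
dT = heat_removed / ((mass - evaporated) * c_w)
print(f"remaining droplet cools by {dT:.1f} K")   # ~5.5 K
```

Losing just 1% of the droplet chills what’s left by several degrees, which the droplet then shares with the nearby air.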

“Cool.”

~ Rich Olcott

Thanks to Mitch Slevc for the question that led to this post.

At The Old Curiosity Shop

An imposing knock at the door, both impetuous and imperious.  I figured it for an Internet denizen.  “C’mon in, the door’s open.”

“You’re Moire?”

“I am.  And you are..?”

“The name’s Feder, Richard Feder, from Fort Lee, NJ.  I’m a stand-in for some of your commenters.”

“Ah, the post of business past.  You have a question?”

“Yeah.  How come hot water can freeze faster than cold water?”

“That’s really two questions. The first is, ‘Can hot water freeze faster than cold water?’ and the second is, ‘How come?‘  To the surprise of a lot of physicists, the experimental answer to the first question is, ‘Yes, sometimes.‘  But it’s only sometimes and even that depends on how you define freeze.”

“What’s to define?  Frozen is frozen.”

“Not so fast.  Are we talking surface ice formation, or complete solidification, or maybe just descent to freezing temperature?  Three very different processes.  There’s multiple reports of anomalous behavior for each one, but many of the reports have been contested by other researchers.  Lots of explanations, too.  The situation reminds me of Anne’s Elephant.”

“Why an elephant?  And who’s Anne?”

“Remember the old story about the blind men trying to figure out an elephant?  The guy touching its trunk said it’s a snake, the one at its side said it’s a wall, the dude at its leg said it’s a tree, and so on?  The descriptions differed because each observer had limited knowledge of something complicated.  This chilled-water issue is like that — irreproducible experiments because of uncontrolled unknown variables, mostly maybes on the theory side because we’re still far from a fundamental understanding.”

“Who’s Anne?”

“Anne is … an experience.  I showed her how the notion of Entropy depends on how you look at it.  Scientists have looked at this paradoxical cooling effect pretty much every way you can think of, trying to rule out various hypotheses.  Different teams have both found and not found the anomaly working with distilled water and with tap water, large amounts and small, in the open air and in sealed containers, in glass or metal containers, with and without stirring, with various pre-washing regimens or none, using a variety of initial and final temperatures.  They’ve clocked the first appearance of surface ice and complete opacity of the bulk.  They’ve tracked temperature’s trajectory in the middle of the container or near its wall… you name it.  My favorite observation was the 20th Century’s first-published one — in 1963 Erasto Mpemba noticed the effect while preparing ice cream.”

“What flavor?  Never mind.  Is there a verdict?”

“Vaguely.  Once you get approximately the right conditions, whether or not you see the effect seems to be a matter of chance.  The more sophisticated researchers have done trials in the hundreds and then reported percentages, rather than just ‘we see it’ or not.  Which in itself is interesting.”

“How’s that?”

“Well, to begin with, the percents aren’t zero.  That answers your first question — warm water sometimes does freeze faster than cold.  Better yet, the variability tells us that the answer to your second question is at the nanoscopic level.  Macroscopic processes, even chemical ones, have statistics that go the same way all the time.  Put a lit match to gasoline in air, you’ll always get a fire.  But if you set out 100 teaspoons of water under certain conditions and 37 of them freeze and the others don’t, something very unusual must be going on that starts with just a few molecules out of the 10²³ in those teaspoons.”

“Weird odds.”

“This experiment’s even more interesting.  You’ve got two bottles of water.  You heat up bottle A and let it cool to room temperature.  B’s been at room temperature all along.  You put ’em both in the fridge and track their temperatures.  A cools quicker.”

“That’s where I came in.”

“Both start at the same temperature, finish at the same temperature, and their Joules-per-second energy-shedding rates should be the same.  A cools in less time so A releases less heat.  Entropy change is released heat energy divided by temperature.  Somehow, bottle A went into the fridge with less entropy than B had.  Why?  We don’t really know.”
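The entropy bookkeeping Sy quotes — released heat divided by temperature — integrates to m·c·ln(T_end/T_start) for a bottle cooling over a temperature range. A sketch with an assumed bottle size and fridge temperatures:

```python
import math

m = 500.0         # g of water in the bottle (assumed)
c = 4.18          # J/(g*K), water's specific heat
T_start = 293.15  # K, room temperature
T_end = 278.15    # K, fridge temperature

# dS = integral of m*c*dT/T as T falls from T_start to T_end
dS = m * c * math.log(T_end / T_start)
print(f"entropy shed: {dS:.0f} J/K")   # about -110 J/K
```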

~~ Rich Olcott

Thanks to Ilias Tirovolas, whose paper inspired this post.

Thoughts of Chair-man Moire

“My apples and orange peels question, Sy, isn’t that the same as Jeremy’s?  What’s the connection between heat capacity and counting?”

“You’re right, Anne.  Hmm.  Say, Al, all your coffee shop tables came with four chairs apiece, right?”

“Yup, four-tops every one, even in the back room.”

“You neaten them all up, four to a table, in the morning?”

“The night before.  There’s never time in the morning, customers demand coffee first thing.”

“But look, we’ve got six people seated at this table.  Where’d the extra chairs come from?”

“Other tables, of course.  Is this going somewhere?”

“Almost there.  So in fact the state of the room at any time will have some random distribution of chairs to tables.  You know on the average there’ll be four at a table, but you don’t know the actual distribution until you look, right?”

“Hey, we’re counting again.  You’re gonna say that’s about entropy ’cause the difference between four at a table and some other number is all random and there’s some formula to calculate entropy from that.”

“True, Vinnie, but we’re about to take the next step.  How did these chairs wind up around this table?”

“We pulled them over, Mr. Moire.”

“My point is, Jeremy, we spent energy to get them here.  The more chairs that are out of position — ”

“The higher the entropy, but also the more energy went into the chairs.  It’s like that heat capacity thing we started with, the energy that got absorbed rather than driving the steam engine.”

“Awright, Anne!” from Jeremy <Jennie bristles a bit>, “and if all the chairs are in Al’s overnight position it’s like absolute zero.  Hey, temperature is average kinetic energy per particle so can we say that the more often a chair gets moved it’s like hotter?”

Jennie breaks in.  “Not a bit of it, Jeremy!  The whole metaphor’s daft.  We know temperature change times heat capacity equals the energy absorbed, right, and we’ve got a link between energy absorption and entropy, right, but what about if at the end of the day all the chairs accidentally wind up four at a table?  Entropy change is zero, right, but customers expended energy moving chairs about all day and Al’s got naught to set straight.”

“Science in action, I love it!  Anne and Jeremy, you two just bridged a gap it took Science a century to get across.  Carnot started us on entropy’s trail in 1824 but scientists in those days weren’t aware of matter’s atomic structure.  They knew that stuff can absorb heat but they had no inkling what did the absorbing or how that worked.  Thirty years later they understood simple gases better and figured out that average kinetic energy per particle bit.  But not until the 1920s did we have the quantum mechanics to show how parts of vibrating molecules can absorb heat energy stepwise like a table ‘absorbing’ chairs.  Only then could we do Vinnie’s state-counting to calculate entropies.”

“Yeah, more energy, spread across more steps, hiding more details we don’t know behind an average, more entropy.  But what about Jennie’s point?”

“Science is a stack of interconnected metaphors, Vinnie.  Some are better than others.  The trick is attending to the boundaries where they stop being valid.  Jennie’s absolutely correct that my four-chair argument is only a cartoon for illustrating stepwise energy accumulation.  If Al had a billion tables instead of a dozen or so, the odds on getting everything back to the zero state would disappear into rounding error.”

“How does black hole entropy play into this, Sy?”

“Not very well, actually.  Oh, sure, the two systems have similar structures.  They’ve each got three inter-related central quantities constrained by three laws.  Here, I’ve charted them out on Old Reliable.”

“OK, their Second and Third Laws look pretty much the same, but their First Laws don’t match up.”

“Right, Al.  And even Bekenstein pointed out inconsistencies between classic thermodynamic temperature and what’s come to be called Hawking temperature.  Hawking didn’t agree.  The theoreticians are still arguing.  Here’s a funny one — if you dig deep enough, both versions of the First Law are the same, but the Universe doesn’t obey it.”

“That’s it, closing time.  Everybody out.”

~~ Rich Olcott

Taming The Elephant

Suddenly they were all on the attack.  Anne got in the first lick.  “C’mon, Sy, you’re comparing apples and orange peel.  Your hydrogen sphere would be on the inside of the black hole’s event horizon, and Jeremy’s virtual particles are on the outside.”

[If you’ve not read my prior post, do that now and this’ll make more sense.  Go ahead, I’ll wait here.]

Jennie’s turn — “Didn’t the chemists define away a whole lot of entropy when they said that pure elements have zero entropy at absolute zero temperature?”

Then Vinnie took a shot.  “If you’re counting maybe-particles per square whatever for the surface, shouldn’t you oughta count maybe-atoms or something per cubic whatever for the sphere?”

Jeremy posed the deepest questions. “But Mr Moire, aren’t those two different definitions for entropy?  What does heat capacity have to do with counting, anyhow?”

Al brought over mugs of coffee and a plate of scones.  “This I gotta hear.”

“Whew, but this is good ’cause we’re getting down to the nub.  First, to Jennie’s point — under the covers, Hawking’s evaluation is just as arbitrary as the chemists’.  Vinnie’s ‘whatever’ is the Planck length, lP = 1.616×10⁻³⁵ meter.  It’s the square root of such a simple combination of fundamental constants that many physicists think that lP² = 2.611×10⁻⁷⁰ m² is the ‘quantum of area.’  But that’s just a convenient assumption with no supporting evidence behind it.”

“Ah, so Hawking’s ABH = 4πrs² and SBH = ABH/4 formulation, with rs measured in Planck-lengths, just counts the number of area-quanta on the event horizon’s surface.”

“Exactly, Jennie.  If there really is a least possible area, which a lot of physicists doubt, and if its size doesn’t happen to equal lP², then the black hole entropy gets recalculated to match.”
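The area-quanta counting Jennie describes can be sketched with standard constants — for a one-solar-mass horizon the count lands near 10⁷⁷:

```python
import math

G = 6.674e-11     # m^3/(kg*s^2), gravitational constant
c = 2.998e8       # m/s, speed of light
hbar = 1.055e-34  # J*s, reduced Planck constant

l_P2 = hbar * G / c**3           # Planck area, ~2.61e-70 m^2

M = 1.989e30                     # kg, one solar mass
r_s = 2 * G * M / c**2           # half-diameter, ~2.95 km
A = 4 * math.pi * r_s**2         # event-horizon area

quanta = A / (4 * l_P2)          # Hawking's count of area-quanta
print(f"{quanta:.1e} area-quanta")   # ~1.0e77
```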

“So what’s wrong with cubic those-things?”

“Nothing, Vinnie, except that volumes measured in lP³ don’t apply to a black hole because the interior’s really four-dimensional with time scrambled into the distance formulas.  Besides, Hawking proved that the entropy varies with half-diameter squared, not half-diameter cubed.”

“But you could still measure your hydrogen sphere with them and that’d get rid of that 10³³ discrepancy between the two entropies.”

“Not really, Vinnie.  Old Reliable calculated solid hydrogen’s entropy for a certain mass, not a volume.”

“Hawking can make his arbitrary choice, Sy, he’s Hawking, but that doesn’t let the chemists off the scaffold.  How did they get away with arbitrarily defining a zero for entropy?”

“Because it worked, Jennie.  They were only concerned with changes — the difference between a system’s state at the end of a process, versus its state at the beginning.  It was only the entropy difference that counted, not its absolute value.”

“Hey, like altitude differences in potential energy.”

“Absolutely, Vinnie, and that’ll be important when we get to Jeremy’s question.  So, Jennie, if you’re only interested in chemical reactions and if it’s still in the 19th Century and the world doesn’t know about isotopes yet, is there a problem with defining zero entropy to be at a convenient set of conditions?”

“Well, but Vinnie’s Second Law says you can never get down to absolute zero so that’s not convenient.”

“Good point, but the Ideal Gas Law and other tools let scientists extrapolate experimentally measured properties down to extremely low temperatures.  In fact, the very notion of absolute zero temperature came from experiments where the volume of a  hydrogen or helium gas sample appears to decrease linearly towards zero at that temperature, at least until the sample condenses to a liquid.  With properly calibrated thermometers, physical chemists knocked themselves out measuring heat capacities and entropies at different temperatures for every substance they could lay hands on.”

“What about isotopes, Mr Moire?  Isn’t chlorine’s atomic weight something-and-a-half so there’s gotta be several kinds of chlorine atoms so any sample you’ve got is a mixture and that’s random and that has to have a non-zero entropy even at absolute zero?”

“It’s 35.4, two stable isotopes, Jeremy, but we know how to account for entropy of mixing and anyway, the isotope mix rarely changes in chemical processes.”

“But my apples and orange peels, Sy — what does the entropy elephant do about them?”

~~ Rich Olcott

The Battle of The Entropies

(the coffee-shop saga continues)  “Wait on, Sy, a black hole is a hollow sphere?”

I hadn’t noticed her arrival but there was Jennie, standing by Vinnie’s table and eyeing Jeremy who was still eyeing Anne in her white satin.

“That’s not quite what I said, Jennie.  Old Reliable’s software and I worked up a hollow-shell model and to my surprise it’s consistent with one of Stephen Hawking’s results.  That’s a long way from saying that’s what a black hole is.”

“But you said some physicists say that.  Have they aught to stand on?”

“Sort of.  It’s a perfect case of ‘depends on where you’re standing.'”

Vinnie looked up.  “It’s frames again, ain’t it?”

“With black holes it’s always frames, Vinnie.  Hey, Jeremy, is a black hole something you could stand on?”

“Nosir, we said the hole’s event horizon is like Earth’s orbit, just a mathematical marker.  Except for the gravity and the three Perils Jennie and you and me talked about, I’d slide right through without feeling anything weird, right?”

“Good memory and just so.  In your frame of reference there’s nothing special about that surface — you wouldn’t experience scale changes in space or time when you encounter it.  In other frames, though, it’s special.  Suppose we’re standing a thousand miles away from a solar-size black hole and Jeremy throws a clock and a yardstick into it.  What would we see?”

“This is where those space compression and time dilation effects happen, innit?”

“You bet, Jennie.  Do you remember the formula?”

“I wrote it in my daybook … Ah, here it is —

f = 1/√(1 − (D/2)/d)

My notes say D is the black hole’s diameter and d is another object’s distance from its center.  One second in the falling object’s frame would look like f seconds to us.  But one mile would look like 1/f miles.  The event horizon is where d equals the half-diameter and f goes infinite.  The formula only works where the object stays outside the horizon.”
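Jennie’s factor — the standard Schwarzschild dilation form matching her description, f = 1/√(1 − (D/2)/d) — can be tabulated. A sketch working in units of the diameter:

```python
import math

def f(D, d):
    """Dilation factor outside the horizon; valid only for d > D/2."""
    return 1 / math.sqrt(1 - (D / 2) / d)

D = 1.0   # measure distances in units of the diameter
for d in (10.0, 2.0, 1.0, 0.6, 0.50001):
    print(f"d = {d:8.5f}   f = {f(D, d):9.3f}")
# f climbs without limit as d approaches the half-diameter D/2.
```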

“And as your clock approaches the horizon, Jeremy…?”

“You’ll see my clock go slower and slower until it sto —.  Oh.  Oh!  That’s why those physicists think all the infalling mass is at the horizon, the stuff falls towards it forever and never makes it through.”

“Exactly.”

“Hey, waitaminute!  If all that mass never gets inside, how’d the black hole get started in the first place?”

“That’s why it’s only some physicists, Vinnie.  The rest don’t think we understand the formation process well enough to make guesses in public.”

“Wait, that formula’s crazy, Sy.  If something ever does get to where d is less than D/2, then what’s inside the square root becomes negative.  A clock would show imaginary time and a yardstick would go imaginary, too.  What’s that about?”

“Good eye, Anne, but no worries, the derivation of that formula explicitly assumes a weak gravitational field.  That’s not what we’ve got inside or even close to the event horizon.”

“Mmm, OK, but I want to get back to the entropy elephant.  Does black hole entropy have any connection to the other kinds?”

“Structural, mostly.  The numbers certainly don’t play well together.  Here’s an example I ran up recently on Old Reliable.  Say we’ve got a black hole twice the mass of the Sun, and it’s at the Hawking temperature for its mass, 12 billionths of a Kelvin.  Just for grins, let’s say it’s made of solid hydrogen.  Old Reliable calculated two entropies for that thing, one based on classical thermodynamics and the other based on the Bekenstein-Hawking formulation.”

Entropy calculations

“Wow, Old Reliable looks up stuff and takes care of unit conversions automatically?”

“Slick, eh, Jeremy?  That calculation up top for S_chem is classical chemical thermodynamics.  A pure sample of any element at absolute zero temperature is defined to have zero entropy.  Chemical entropy is cumulative heat capacity as the sample warms up.  The Hawking temperature is so close to zero I could treat heat capacity as a constant.

“In the middle section I calculated the object’s surface area in square Planck-lengths lP², and in the bottom section I used Hawking’s formula to convert area to B-H entropy, S_BH.  They disagree by a factor of 10³³.”
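The bottom-section conversion can be sketched in SI units with standard constants for a 2-solar-mass hole. Old Reliable’s exact inputs aren’t shown, so this only reproduces the order of magnitude:

```python
import math

G, c = 6.674e-11, 2.998e8        # gravitational constant, light speed
hbar, k_B = 1.055e-34, 1.381e-23 # Planck and Boltzmann constants

M = 2 * 1.989e30                 # kg, twice the Sun's mass
r_s = 2 * G * M / c**2           # half-diameter, ~5.9 km
A = 4 * math.pi * r_s**2         # horizon area, m^2
l_P2 = hbar * G / c**3           # Planck area, m^2

S_BH = k_B * A / (4 * l_P2)      # Bekenstein-Hawking entropy
print(f"S_BH = {S_BH:.1e} J/K")  # ~5.8e54 J/K
```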

A moment of shocked silence, and then…

~~ Rich Olcott

Red Harvest

<continued> Al’s coffee shop was filling up as word got around about Anne in her white satin.  I saw a few selfie-takers in the physics crowd surreptitiously edge over to get her into their background.  She was busy thinking so she didn’t notice.  “The entropy-elephant picture is starting to come together, Sy.  We started out with entropy measuring accumulated heat capacity in a steam engine.”

“That’s where Carnot started, yes.”

“But when Jeremy threw that hot rock into the black hole” <several in the astronomy crew threw startled looks at Jeremy>, “its heat energy added to the black hole’s mass, but it should have added to the black hole’s entropy, too.  ‘Cause of Vinnie’s Second Law.”

Vinnie looked up.  “Ain’t my Second Law, it’s thermodynamics’ Second Law.  Besides, my version was ‘energy’s always wasted.’  Sy’s the one who turned that into ‘entropy always increases.'”

“So anyway, black holes can’t have zero entropy like people used to think.  But if entropy also has to do with counting possibilities, then how does that apply to black holes?  They have only one state.”

“That’s where Hawking got subtle.  Jeremy, we’ve talked about how the black hole’s event horizon is a mathematical abstraction, infinitely thin and perfectly smooth and all that.”

“Yessir.”

“Hawking moved one step away from that abstraction.  In essence he said the  event horizon is surrounded by a thin shell of virtual particles.  Remember them, Jeremy?”

“Uh-huh, that was on my quest to the event horizon.  Pairs of equal and opposite virtual particles randomly appear and disappear everywhere in space and because they appear together they’re entangled and if one of them dips into the event horizon then it doesn’t annihilate its twin which — Oh!  Random!  So what’s inside the event horizon may have only one state, so far as we know, but right outside the horizon any point may or may not be hosting, can I call it an orphan particle?  I’ll bet that uncertainty gives rise to the entropy, right?”

<finger-snaps of approval from the physics side of the room>

“Well done, Jeremy!  ‘Orphan’ isn’t the conventional term but it gets the idea across.”

“Wait, Sy.  You mentioned that surface area and entropy go together and now I see why.  The larger the area, the more room there is for those poor orphans.  When Jeremy’s rock hit the event horizon and increased the black hole’s mass, did the surface area increase enough to allow for the additional entropy?” <more finger-snapping>

“Sure did, Anne.  According to Hawking’s calculation, it grew by exactly the right amount.  The diameter grows in step with the mass, and the area grows as the square of the diameter.”

“How come not the radius?”

“Well, Vinnie, the word ‘radius’ is tricky when you’re discussing black holes.  The event horizon is spherical and has a definite diameter — you could measure it from the outside.  But the sphere’s radius extends down to the singularity and is kind of infinite and isn’t even strictly speaking a distance.  Space-time is twisted in there, remember, and that radial vector is mostly time near its far end.  On the other hand, you could use ‘radius’ to mean ‘half the diameter’ and you’d be good for calculating effects outside the event horizon.”

“OK, that’s the entropy-area connection, but how does temperature tie in with surface gravity?”

“They’re both inversely dependent on the black hole’s mass.  Let’s take surface gravity first, and here when I say ‘r’ I’m talking ‘half-diameter,’ OK?”

“Sure.”

“Good.  Newton taught us that an object with mass M has a gravitational attraction proportional to M/r².  That still holds if you’re not inside the event horizon.  Now, the event horizon’s r is also proportional to the object’s mass so you’ve got M/M² which comes to 1/M.  With me?”
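Sy’s M/M² arithmetic is easy to check numerically.  Here’s a minimal sketch (mine, not from the dialogue) using the standard Schwarzschild half-diameter rs = 2GM/c² and Newton’s M/r² attraction at that distance:

```python
# Check that a black hole's "surface" gravity scales as 1/M.
# r_s = 2GM/c^2 grows with M, so GM/r_s^2 = c^4/(4GM) shrinks as 1/M.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def surface_gravity(mass_kg):
    r_s = 2 * G * mass_kg / c**2     # Schwarzschild half-diameter
    return G * mass_kg / r_s**2      # Newton's M/r^2 evaluated at the horizon

m_sun = 1.989e30                      # one solar mass, kg
ratio = surface_gravity(2 * m_sun) / surface_gravity(m_sun)
print(ratio)   # 0.5 -- doubling the mass halves the surface gravity
```

The M in the numerator and the M² from rs² leave exactly the 1/M dependence Sy describes.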

“Yeah.”

“Hawking used quantum physics to figure the temperature thing, but here’s a sloppy short-cut.  Anne, remember how we said that entropy is approximately heat capacity divided by temperature?”

“Mm-hmm.”

“The shell’s energy is mostly heat and proportional to M.  We’ve seen the shell’s entropy is proportional to M².  The temperature is heat divided by entropy.  That’s proportional to M/M², which is the same 1/M as surface gravity.” <boos from all sides> “Hey, I said it was sloppy.”
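Sy’s sloppy shortcut runs as bare arithmetic.  A sketch under his stated assumptions (energy proportional to M, entropy proportional to M², all physical constants dropped, so only the scaling means anything):

```python
# Sy's shortcut: energy ~ M, entropy ~ M^2, so T = energy/entropy ~ 1/M.
def temperature_scaling(mass):
    energy = mass            # shell energy, proportional to M (constants dropped)
    entropy = mass**2        # entropy, proportional to horizon area ~ M^2
    return energy / entropy  # M/M^2 = 1/M, same scaling as surface gravity

print(temperature_scaling(1.0))   # 1.0
print(temperature_scaling(2.0))   # 0.5 -- doubling M halves the temperature
```

Hawking’s real derivation uses quantum field theory near the horizon, but it lands on the same 1/M dependence.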

~~ Rich Olcott

Rockin’ Round The Elephant

<continued…>  “That’s what who said?  And why’d he say that?”

“That’s what Hawking said, Al.  He’s the guy who first applied thermodynamic analysis to black holes.  Anyone happen to know the Three Laws of Thermodynamics?”

Vinnie pipes up from his table by the coffee shop door.  “You can’t win.  You can’t even break even.  But you’ll never go broke.”

“Well, that’s one version, Vinnie, but keep in mind all three of those focus on energy.  The First Law is Conservation of Energy—no process can create or destroy energy, only transform it, so you can’t come out ahead.  The Second Law is really about entropy—”

“Ooo, the elephant!”

“Right, Anne.  You usually see the Second Law stated in terms of energy efficiency—no process can convert energy to another form without wasting some of it. No breaking even.  But an equivalent statement of that same law is that any process must increase the entropy of the Universe.”

“The elephant always gets bigger.”

“Absolutely.  When Bekenstein and Hawking thought about what would happen if a black hole absorbed more matter, worst case another black hole, they realized that the black hole’s surface area had to follow the same ‘Never decrease’ rule.”

“Oh, that Hawking!  Hawking radiation Hawking!  The part I didn’t understand, well one of the parts, in that “Black Holes” Wikipedia article!  It had to do with entangled particles, didn’t it?”

“Just caught up with us, eh, Jeremy?  Yes, Stephen Hawking.  He and Jacob Bekenstein found parallels between what we can know about black holes on the one hand and thermodynamic quantities on the other.  Surface area and entropy, like we said, and a black hole’s mass acts mathematically like energy in thermodynamics.  The correlations were provocative.”

“Mmm, provocative.”

“You like that word, eh, Anne?  Physicists knew that Bekenstein and Hawking had a good analogy going, but was there a tight linkage in there somewhere?  It seemed doubtful.”

“Nothin’ to count.”

“Wow, Vinnie.  You’ve been reading my posts?”

“Sure, and I remember the no-hair thing.  If the only things the Universe can know about a black hole are its mass, spin and charge, then there’s nothing to figure probabilities on.”

“Exactly.  The logic sequence went, ‘Entropy is proportional to the logarithm of state count, there’s only one state, log(1) equals zero, so the entropy is zero.’  But that breaks the Third Law.  Vinnie’s energy-oriented Third Law says that no object can cool to absolute zero temperature.  But an equivalent statement is that no object can have zero entropy.”
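That logic sequence is just Boltzmann’s entropy formula, S = k·log(W), evaluated at a state count of one.  A quick sketch (the function name is mine):

```python
import math

k_B = 1.381e-23  # Boltzmann's constant, joules per kelvin

def boltzmann_entropy(state_count):
    # Boltzmann: entropy is proportional to the log of the number of states
    return k_B * math.log(state_count)

print(boltzmann_entropy(1))   # 0.0 -- a one-state object would have zero entropy
print(boltzmann_entropy(2) > 0)   # True -- any second possibility gives entropy
```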

“So there’s something wrong with black hole theory, huh?”

“Which is where our guys started, Vinnie.  Being physicists, they said, ‘Suppose you were to throw an object into a black hole.  What would change?’”

“Its mass, for one.”

“For sure, Jeremy.  Anything else?”

“It might not change the spin, if you throw right.”

“Spoken like a trained baseball pitcher.  Turns out its mass governs pretty much everything about a black hole, including its temperature but not spin or charge.  Once you know the mass you can calculate its entropy, diameter, surface area, surface gravity, maximum spin, all of that.  Weird, though, you can’t easily calculate its volume or density — spatial distortion gets in the way.”

“So what happens to all those things when the mass increases?”

“As you might expect, they change.  What’s interesting is how each of them changes and how they’re linked together.  Temperature, for instance, is inversely proportional to the mass.  Suppose, Jeremy, that you threw two big rocks, both the same size, into a black hole.  The first rock is at room temperature and the other’s a really hot one, say at a million degrees.  What would each do?”

“The first one adds mass so from what you said it’d drop the temperature.  The second one has the same mass, so I don’t see, wait, temperature’s average kinetic energy so the hot rock has more energy than the other one and Einstein says that energy and mass are the same thing so the black hole gets more mass from the hot rock than from the cold one so its temperature goes down … more?  Really?”

“Yup.  Weird, huh?”

“How’s that work?”

“That’s what they asked.”

~~ Rich Olcott

Enter the Elephant, stage right

“Anne?”

“Mm?”

“Remember when you said that other reality, the one without the letter ‘C,’  felt more probable than this one?”

“Mm-mm.”

“What tipped you off?”

“Now you’re asking?”

“I’m a physicist, physicists think about stuff.  Besides, we’ve finished the pizza.”

<sigh> “This conversation has gotten pretty improbable, if you ask me.  Oh, well.  Umm, I guess it’s two things.  The more-probable realities feel denser somehow, and more jangly. What got you on this track?”

“Conservation of energy.  Einstein’s E=mc² says your mass embodies a considerable amount of energy, but when you jump out of this reality there’s no flash of light or heat, just that fizzing sound.  When you come back, no sudden chill or things falling down on us, just the same fizzing.  Your mass-energy has to go to or come from somewhere.  I can’t think where or how.”

“I certainly don’t know, I just do it.  Do you have any physicist guesses?”

“Questions first.”

“If you must.”

“It’s what I do.  What do you perceive during a jump?  Maybe something like falling, or heat or cold?”

“There’s not much ‘during.’  It’s not like I go through a tunnel, it’s more like just turning around.  What I see goes out of focus briefly.  Mostly it’s the fizzy sound and I itch.”

“Itch.  Hmm…  The same itch every jump?”

“That’s interesting.  No, it’s not.  I itch more if I jump to a more-probable reality.”

“Very interesting.  I’ll bet you don’t get that itch if you’re doing a pure time-hop.”

“You’re right!  OK, you’re onto something, give.”

“You’ve met one of my pet elephants.”

“Wha….??”

“A deep question that physics has been nibbling around for almost two centuries.  Like the seven blind men and the elephant.  Except the physicists aren’t blind and the elephant’s pretty abstract.  Ready for a story?”

“Pour me another and I will be.”

“Here you go.  OK, it goes back to steam engines.  People were interested in getting as much work as possible out of each lump of coal they burned.  It took a couple of decades to develop good quantitative concepts of energy and work so they could grade coal in terms of energy per unit weight, but they got there.  Once they could quantify energy, they discovered that each material they measured — wood, metals, water, gases — had a consistent heat capacity.  It always took the same amount of energy to raise its temperature across a given range.  For a kilogram of water at 25°C, for instance, it takes one kilocalorie to raise its temperature to 26°C.  Lead and air take less.”
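The water example is a single multiplication: energy equals mass times heat capacity times temperature rise.  A sketch (function name and unit choices are mine):

```python
# Energy to warm a material: mass * specific heat capacity * temperature rise
def heat_needed(mass_kg, capacity_kcal_per_kg_degC, delta_t_degC):
    return mass_kg * capacity_kcal_per_kg_degC * delta_t_degC

# 1 kg of water, capacity 1 kcal/(kg.degC), warmed from 25 degC to 26 degC:
print(heat_needed(1.0, 1.0, 1.0))   # 1.0 kilocalorie, as in the text
```

Lead (capacity about 0.03 kcal/(kg·°C)) and air (about 0.24) plug into the same formula with smaller capacities, which is why they “take less.”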

“So where’s the elephant come in?”

“I’m getting there.  We started out talking about steam engines, remember?  They work by letting steam under pressure push a piston through a cylinder.  While that’s happening, the steam cools down before it’s puffed out as that classic old-time Puffing Billy ‘CHUFF.’  Early engine designers thought the energy pushing the piston just came from trading off pressure for volume.  But a guy named Carnot essentially invented thermodynamics when he pointed out that the cooling-down was also important.  The temperature drop meant that heat energy stored in the steam must be contributing to the piston’s motion because there was no place else for it to go.”

“I want to hear about the elephant.”

“Almost there.  The question was, how to calculate the heat energy.”

“Why not just multiply the temperature change by the heat capacity?”

“That’d work if the heat capacity were temperature-independent, which it isn’t.  What we do is add up the capacity divided by the temperature, at each intervening temperature.  Call the sum ‘elephant’ though it’s better known as Entropy.  Pressure, Volume, Temperature and Entropy define the state of a gas.  Using those state functions, all you need to know is the working fluid’s initial and final state and you can calculate your engine.  Engineers and chemists do process design and experimental analysis using tables of reported state function values for different substances at different temperatures.”
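Sy’s “add up” step is a numerical integration of C(T)/T between the initial and final temperatures.  A sketch (the midpoint-sum scheme and constant-capacity check are mine; real engineers would pull C(T) from those tables):

```python
import math

# Entropy change between t1 and t2: sum of C(T)/T over many small steps.
def entropy_change(capacity, t1, t2, steps=100_000):
    dt = (t2 - t1) / steps
    total = 0.0
    for i in range(steps):
        t = t1 + (i + 0.5) * dt          # midpoint of each small step
        total += capacity(t) / t * dt    # heat capacity / temperature, times dT
    return total

# Sanity check: with a constant capacity C the sum should approach C*ln(T2/T1).
result = entropy_change(lambda t: 2.0, 300.0, 600.0)
print(result, 2.0 * math.log(2))   # the two agree closely
```

With a temperature-dependent `capacity` function the same loop works unchanged, which is exactly why the tabulated-state-function approach is so convenient.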

“Do they know why heat capacity changes?”

“That took a long time to work out, which is part of why entropy’s an elephant.  And you’ve just encountered the elephant’s trunk.”

“There’s more elephant?”

“And more of this.  Want a refill?”

~~ Rich Olcott