Red Harvest

<continued> Al’s coffee shop was filling up as word got around about Anne in her white satin.  I saw a few selfie-takers in the physics crowd surreptitiously edge over to get her into their background.  She was busy thinking so she didn’t notice.  “The entropy-elephant picture is starting to come together, Sy.  We started out with entropy measuring accumulated heat capacity in a steam engine.”

“That’s where Carnot started, yes.”

“But when Jeremy threw that hot rock into the black hole” <several in the astronomy crew threw startled looks at Jeremy>, “its heat energy added to the black hole’s mass, but it should have added to the black hole’s entropy, too.  ‘Cause of Vinnie’s Second Law.”

Vinnie looked up.  “Ain’t my Second Law, it’s thermodynamics’ Second Law.  Besides, my version was ‘energy’s always wasted.’  Sy’s the one who turned that into ‘entropy always increases.’”

“So anyway, black holes can’t have zero entropy like people used to think.  But if entropy also has to do with counting possibilities, then how does that apply to black holes?  They have only one state.”

“That’s where Hawking got subtle.  Jeremy, we’ve talked about how the black hole’s event horizon is a mathematical abstraction, infinitely thin and perfectly smooth and all that.”

“Yessir.”

“Hawking moved one step away from that abstraction.  In essence he said the  event horizon is surrounded by a thin shell of virtual particles.  Remember them, Jeremy?”

“Uh-huh, that was on my quest to the event horizon.  Pairs of equal and opposite virtual particles randomly appear and disappear everywhere in space and because they appear together they’re entangled and if one of them dips into the event horizon then it doesn’t annihilate its twin which — Oh!  Random!  So what’s inside the event horizon may have only one state, so far as we know, but right outside the horizon any point may or may not be hosting, can I call it an orphan particle?  I’ll bet that uncertainty gives rise to the entropy, right?”

<finger-snaps of approval from the physics side of the room>

“Well done, Jeremy!  ‘Orphan’ isn’t the conventional term but it gets the idea across.”

“Wait, Sy.  You mentioned that surface area and entropy go together and now I see why.  The larger the area, the more room there is for those poor orphans.  When Jeremy’s rock hit the event horizon and increased the black hole’s mass, did the surface area increase enough to allow for the additional entropy?” <more finger-snapping>

“Sure did, Anne.  According to Hawking’s calculation, it grew by exactly the right amount.  The diameter grows in proportion to the mass, so the area grows as the square of the mass.”

“How come not the radius?”

“Well, Vinnie, the word ‘radius’ is tricky when you’re discussing black holes.  The event horizon is spherical and has a definite diameter — you could measure it from the outside.  But the sphere’s radius extends down to the singularity and is kind of infinite and isn’t even strictly speaking a distance.  Space-time is twisted in there, remember, and that radial vector is mostly time near its far end.  On the other hand, you could use ‘radius’ to mean ‘half the diameter’ and you’d be good for calculating effects outside the event horizon.”

“OK, that’s the entropy-area connection, but how does temperature tie in with surface gravity?”

“They’re both inversely dependent on the black hole’s mass.  Let’s take surface gravity first, and here when I say ‘r’ I’m talking ‘half-diameter,’ OK?”

“Sure.”

“Good.  Newton taught us that an object with mass M has a gravitational attraction proportional to M/r².  That still holds if you’re not inside the event horizon.  Now, the event horizon’s r is also proportional to the object’s mass so you’ve got M/M² which comes to 1/M.  With me?”

“Yeah.”

“Hawking used quantum physics to figure the temperature thing, but here’s a sloppy short-cut.  Anne, remember how we said that entropy is approximately heat capacity divided by temperature?”

“Mm-hmm.”

“The shell’s energy is mostly heat and proportional to M.  We’ve seen the shell’s entropy is proportional to M².  The temperature is heat divided by entropy.  That’s proportional to M/M² which is the same 1/M as surface gravity.” <boos from all sides> “Hey, I said it was sloppy.”
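Sy’s 1/M scaling is easy to check outside the story.  Here’s a minimal Python sketch (mine, not part of the conversation) using the standard textbook formulas — Schwarzschild half-diameter r = 2GM/c², Newtonian surface gravity GM/r², and Hawking’s temperature ħc³/(8πGMkB):

```python
import math

G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s
kB   = 1.381e-23    # Boltzmann constant, J/K

def half_diameter(M):
    """Schwarzschild 'r' in Sy's half-diameter sense: 2GM/c^2, in meters."""
    return 2 * G * M / c**2

def surface_gravity(M):
    """Newton's GM/r^2 at the horizon; works out to c^4/(4GM) -- a 1/M law."""
    r = half_diameter(M)
    return G * M / r**2

def hawking_temperature(M):
    """Hawking's T = hbar c^3 / (8 pi G M kB), also proportional to 1/M."""
    return hbar * c**3 / (8 * math.pi * G * M * kB)

M_sun = 1.989e30  # kg

# Doubling the mass halves both quantities -- the same 1/M dependence:
assert abs(surface_gravity(2 * M_sun) / surface_gravity(M_sun) - 0.5) < 1e-12
assert abs(hawking_temperature(2 * M_sun) / hawking_temperature(M_sun) - 0.5) < 1e-12
```

A solar-mass black hole comes out around 3 km in half-diameter and a frigid 6×10⁻⁸ K, which is why nobody expects to observe Hawking radiation from a stellar-mass hole.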

~~ Rich Olcott

Rockin’ Round The Elephant

<continued…>  “That’s what who said?  And why’d he say that?”

“That’s what Hawking said, Al.  He’s the guy who first applied thermodynamic analysis to black holes.  Anyone happen to know the Three Laws of Thermodynamics?”

Vinnie pipes up from his table by the coffee shop door.  “You can’t win.  You can’t even break even.  But you’ll never go broke.”

“Well, that’s one version, Vinnie, but keep in mind all three of those focus on energy.  The First Law is Conservation of Energy—no process can create or destroy energy, only  transform it, so you can’t come out ahead.  The Second Law is really about entropy—”

“Ooo, the elephant!”

“Right, Anne.  You usually see the Second Law stated in terms of energy efficiency—no process can convert energy to another form without wasting some of it. No breaking even.  But an equivalent statement of that same law is that any process must increase the entropy of the Universe.”

“The elephant always gets bigger.”

“Absolutely.  When Bekenstein and Hawking thought about what would happen if a black hole absorbed more matter, worst case another black hole, they realized that the black hole’s surface area had to follow the same ‘Never decrease’ rule.”

“Oh, that Hawking!  Hawking radiation Hawking!  The part I didn’t understand, well one of the parts, in that “Black Holes” Wikipedia article!  It had to do with entangled particles, didn’t it?”

“Just caught up with us, eh, Jeremy?  Yes, Stephen Hawking.  He and Jacob Bekenstein found parallels between what we can know about black holes on the one hand and thermodynamic quantities on the other.  Surface area and entropy, like we said, and a black hole’s mass acts mathematically like energy in thermodynamics.  The correlations were provocative.”

“Mmm, provocative.”

“You like that word, eh, Anne?  Physicists knew that Bekenstein and Hawking had a good analogy going, but was there a tight linkage in there somewhere?  It seemed doubtful.”

“Nothin’ to count.”

“Wow, Vinnie.  You’ve been reading my posts?”

“Sure, and I remember the no-hair thing.  If the only things the Universe can know about a black hole are its mass, spin and charge, then there’s nothing to figure probabilities on.”

“Exactly.  The logic sequence went, ‘Entropy is proportional to the logarithm of state count, there’s only one state, log(1) equals zero,  so the entropy is zero.’  But that breaks the Third Law.  Vinnie’s energy-oriented Third Law says that no object can cool to absolute zero temperature.  But an equivalent statement is that no object can have zero entropy.”

“So there’s something wrong with black hole theory, huh?”

“Which is where our guys started, Vinnie.  Being physicists, they said, ‘Suppose you were to throw an object into a black hole.  What would change?’”

“Its mass, for one.”

“For sure, Jeremy.  Anything else?”

“It might not change the spin, if you throw right.”

“Spoken like a trained baseball pitcher.  Turns out its mass governs pretty much everything about a black hole, including its temperature but not spin or charge.  Once you know the mass you can calculate its entropy, diameter, surface area, surface gravity, maximum spin, all of that.  Weird, though, you can’t easily calculate its volume or density — spatial distortion gets in the way.”

“So what happens to all those things when the mass increases?”

“As you might expect, they change.  What’s interesting is how each of them changes and how they’re linked together.  Temperature, for instance, is inversely proportional to the mass.  Suppose, Jeremy, that you threw two big rocks, both the same size, into a black hole.  The first rock is at room temperature and the other’s a really hot one, say at a million degrees.  What would each do?”

“The first one adds mass so from what you said it’d drop the temperature.  The second one has the same mass, so I don’t see, wait, temperature’s average kinetic energy so the hot rock has more energy than the other one and Einstein says that energy and mass are the same thing so the black hole gets more mass from the hot rock than from the cold one so its temperature goes down … more?  Really?”

“Yup.  Weird, huh?”

“How’s that work?”

“That’s what they asked.”
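Jeremy’s hot-rock argument can be put to rough numbers: treat the rock’s heat content as specific heat × mass × temperature, convert it to mass with E=mc², and remember the hole’s temperature goes as 1/M.  A sketch — the rock’s mass and specific heat below are made-up illustration values, not from the story:

```python
C_LIGHT = 2.998e8      # speed of light, m/s
SPECIFIC_HEAT = 800.0  # J kg^-1 K^-1 -- assumed value for a generic rock
ROCK_MASS = 1000.0     # kg -- assumed

def delivered_mass(rest_mass, temperature):
    """Rest mass plus the mass-equivalent of the rock's thermal energy."""
    thermal_energy = SPECIFIC_HEAT * rest_mass * temperature  # joules, roughly
    return rest_mass + thermal_energy / C_LIGHT**2

cold_rock = delivered_mass(ROCK_MASS, 300.0)        # room temperature
hot_rock  = delivered_mass(ROCK_MASS, 1_000_000.0)  # Jeremy's million degrees

# The hot rock delivers more total mass-energy, so with T proportional to 1/M
# it lowers the black hole's temperature further -- Jeremy's "down ... more?"
assert hot_rock > cold_rock
```

The extra mass is tiny — milligrams for these numbers — but the sign of the effect is the whole point.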

~~ Rich Olcott

Schrödinger’s Elephant

Al’s coffee shop sits right between the Astronomy and Physics buildings, which is good because he’s a big Science fan.  He and Jeremy are in an excited discussion when Anne and I walk in.  “Two croissants, Al, and two coffees, black.”

“Comin’ up, Sy.  Hey, you see the news?  Big days for gravitational astronomy.”

Jeremy breaks in.  “There’s a Nobel Prize been announced —”

“Kip Thorne the theorist and Barry Barish the management guy —”

“and Rainer Weiss the instrumentation wizard —”

“shared the Physics prize for getting LIGO to work —”

“and it saw the first signal of a black hole collision in 2015 —”

“and two more since —”

“and confirmed more predictions from relativity theory —”

“and Italy’s got their Virgo gravitational wave detector up and running —”

“And Virgo and our two LIGOs —”

“Well, they’re both aLIGOs now, being upgraded and all —”

“all three saw the same new wave —”

“and it’s another collision between black holes with weird masses that we can’t account for.  Who’s the lady?”

“Al, this is Anne.  Jeremy, close your mouth, you’ll catch a fly.”  (Jeremy blushes, Anne twinkles.)  “Anne and I are chasing an elephant.”

“Pleased to meetcha, Anne.  But no livestock in here, Sy, the Health Department would throw a fit!”

I grin.  “That’s exactly what Eddie said.  It’s an abstract elephant, Al.  We’ve been discussing entropy. Which is an elephant because it’s got so many aspects no-one can agree on what it is.  It’s got something to do with heat capacity, something to do with possibilities you can’t rule out, something to do with signals and information.  And Hawking showed that entropy also has something to do with black holes.”

“Which I don’t know much about, fellows, so someone will have to explain.”

Jeremy leaps in.  “I can help with that, Miss Anne, I just wrote a paper on them.”

“Just give us the short version, son, she can ask questions if she wants a detail.”

“Yessir.  OK, suppose you took all the Sun’s mass and squeezed it into a ball just a few miles across.  Its density would be so high that escape velocity is faster than the speed of light so an outbound photon just falls back inward and that’s why it’s black.  Is that a good summary, Mr Moire?”

“Well, it might be good enough for an Internet blog but it wouldn’t pass inspection for a respectable science journal.  Photons don’t have mass so the whole notion of escape velocity doesn’t apply.  You do have some essential elements right, though.  Black holes are regions of extreme mass density, we think more dense than anywhere else in the Universe.  A black hole’s mass bends space so tightly around itself that nearby light waves are forced to orbit its region or even spiral inward.  The orbiting happens right at the black hole’s event horizon, its thin shell that encloses the space where things get really weird.  And Anne, the elephant stands on that shell.”

“Wait, Mr Moire, we said that the event horizon’s just a mathematical construct, not something I could stand on.”

“And that’s true, Jeremy.  But the elephant’s an abstract construct, too.  So abstract we’re still trying to figure out what’s under the abstraction.”

“I’m trying to figure out why you said the elephant’s standing there.”

“Anne, it goes back to the event horizon’s being a mathematical object, not a real one.  Its spherical surface marks the boundary of the ultimate terra incognita.  Lightwaves can’t pass outward from it, nor can anything material, not even any kind of a signal.  For at least some kinds of black hole, physicists have proven that the only things we can know about one are its mass, spin and charge.  From those we can calculate some other things like its temperature, but black holes are actually pretty simple.”

“So?”

“So there’s a collision with Quantum Theory.  One of QT’s fundamental assumptions is that in principle we can use a particle’s current wave function to predict probabilities for its future.  But the wave function information disappears if the particle encounters an event horizon.  Things are even worse if the particle’s entangled with another one.”

“Information, entropy, elephant … it’s starting to come together.”

“That’s what he said.”
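Jeremy’s “ball just a few miles across” checks out against the Schwarzschild formula.  A quick sketch, outside the story:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # mass of the Sun, kg

def horizon_diameter_km(M):
    """Event-horizon diameter, 2 * (2GM/c^2), in kilometers."""
    return 2 * (2 * G * M / c**2) / 1000.0

diameter = horizon_diameter_km(M_SUN)
# About 5.9 km, i.e. roughly 3.7 miles -- "a few miles across," as he said
assert 5.0 < diameter < 7.0
```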

~~ Rich Olcott

At The Turn of A Card

Not much going on today.  I’m dealing myself a hand of solitaire when I hear a familiar fizzing sound.  “Hello, Anne.  Good to see you again.”

She’s freshened up that white satin outfit and is looking very good.  “Hello, Sy.  Busy?”

“Not so’s you’d notice it.  What can I do for you?”

“Can’t a girl just drop in when she wants to visit?  Playing with real cards, I see.  That’s good, but your tens and treys are frozen.”

“That’s the way the odds break sometimes.  The elephant’s in the room.”

“Entropy again?  What’s it look like this time?”

“These cards and surprise.  How surprised would you be if I were to draw a queen from the stock pile?”

“No queens showing, so some surprised but not very surprised.”

“You know me, I’m a physicist, we put numbers to things.  So put numbers to the situation.”

<sigh>  “OK, there are 52 cards in the deck and you’ve got … 28 cards in that triangle, so there are 24 left in the stock.  Four of them have to be queens.  Four out of 24 is one out of 6.”

“Or 17%.  And the odds for the queen of hearts?”

“I’m here so it’s 100% until I leave.  Oh, I know, you’re talking about the cards.  One in 24 or 4%.  So I’d be four times as surprised at seeing the heart queen as I would at seeing any of them.  Pooh.”

“Now how about the odds of drawing all four queens?”

“Four in 24, times three in 23, times two in 22, times one in 21.  Whatever, it’s a very small number and I’d be very surprised.”

“Well, here’s where we get another look at the elephant.  There’s a definition of entropy that links directly to those percentages AND can handle extremely small ones.  What do you know about logarithms?”

“A little.  I read your last series of posts.”

“Wonderful, that simplifies things.  Let’s start with a strange dissociation thought up by Claude Shannon, to whom we owe the entire field of information theory.  His crucial insight was that he had to distinguish between information and meaning.”

“How can they be different?  If I say ‘green’ that means, well, green.”

“It’s all about context.  If you’re telling me what color something is, saying ‘green’ is telling me that the thing isn’t white or red or any of the other umm, nine colors I know the names of.  But if you’re telling me someone is really inexperienced then I know not to trust them with a complicated task that has to be done right the first time.  From Shannon’s point of view, the information is the signal ‘green,’ and the meaning is set by the context.”

“You’re going somewhere with this, I suppose?”

“Mm-hm.  In Shannon’s theory, the more surprising the message is, the more information it contains.  Remember when you told me that in one of your alternate realities you’d seen me wearing a green shirt?  That was a surprise and it told me you’d visited an unusual reality, because I rarely wear green.  If you’d told me the shirt was black or grey, that would have been much less surprising and much less informative.  Shannon’s trick was in putting numbers to that.”

“You’re just dragging this out, aren’t you?”

“No-no, only two more steps to the elephant.  First step is that Shannon defined a particular signal’s information content to be proportional to the negative of the logarithm of its probability.  Suppose I’m maybe 1% likely to wear green but equally likely to wear any of the other 11 colors.  Each of those colors has a 9% probability.  log10(1%) is –2.00, information content is 2.00, but –log10(9%) is only 1.04.  By Shannon’s definition when you said ‘green’ in this context, you gave me nearly double the information of any other color name.”

“Why’d you use base-10 logarithms?”

“Convenience.  It’s easy to figure log10(1%).  Information scientists tend to use base-2, physicists go for base-e.  Final step — Shannon took the information content of each possible signal, multiplied it by the probability of that signal, added those products together and called it the signal system’s information entropy.  For our colors it’d be (0.01×2.00)+(11×0.09×1.04) ≈ 1.05.  Regardez, voici l’éléphant!”

“Ooo, French!”

“Aimeriez-vous un croissant et un café?  My treat at Al’s.”
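Sy’s two steps — per-signal information as −log p, then the probability-weighted sum — fit in a few lines of Python.  This sketch (mine, not the blog’s) uses base-10 logs like the conversation, and throws in Anne’s four-queens draw for good measure:

```python
import math

def info_content(p, base=10):
    """Shannon information of one signal: -log(probability)."""
    return -math.log(p, base)

def shannon_entropy(probs, base=10):
    """Probability-weighted sum of each signal's information content."""
    return sum(p * info_content(p, base) for p in probs)

# Twelve shirt colors: green at 1%, the other eleven sharing the rest equally
probs = [0.01] + [0.99 / 11] * 11

assert round(info_content(0.01), 2) == 2.0    # 'green' carries the most
assert 1.0 < info_content(0.99 / 11) < 1.1    # any other color, about half that
assert 1.0 < shannon_entropy(probs) < 1.1     # the whole system's entropy

# Anne's four queens in a row from a 24-card stock: 4/24 * 3/23 * 2/22 * 1/21
p_queens = (4/24) * (3/23) * (2/22) * (1/21)
assert p_queens < 1e-4   # about 1 in 10,600 -- "very surprised" indeed
```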

~~ Rich Olcott

Two Sharp Dice

<further along our story arc>  “Want a refill?”

“No, I’ve had enough.  But I could go for some dessert.”

“Nothing here in the office, care for some gelato?”

We take the elevator down to Eddie’s on 2.  Things are slow.  Jeremy’s doing homework behind the gelato display.  Eddie’s at the checkout counter, rolling some dice.  He gives the eye to her white satin.  “You’ll fit right in when the theater crowd gets here, Miss.  Don’t know about you, Sy.”

“Fitting in’s not my thing, Eddie.  This is my client, Anne.  What’s with the bones?”

“Weirdest thing, Sy.  I’m getting set up for the game after closing (don’t tell nobody, OK?) but these dice gotta be bad somehow.  I roll just one, I get every number, but when I roll the two together I get nothin’ but snake-eyes and boxcars.”

I shoot Anne a look.  She shrugs.  I sing out, “Hey, Jeremy, my usual chocolate-hazelnut combo.  For the lady … I’d say vanilla and mint.”

She shoots me a look.  “How’d you know?”

I shrug.  “Lucky guess.  It’s a good evening for the elephant.”

“Hey, no livestock in here, Sy, the Health Department would throw a fit!”

“It’s an abstract elephant, Eddie.  Anne and I’ve been discussing entropy.  Which is an elephant because it’s got so many aspects no-one can agree on what it is.”

“So it’s got to do with luck?”

“With counting possibilities.  Suppose you know something happened, but there’s lots of ways it could have happened.  You don’t know which one it was.  Entropy is a way to measure what’s left to know.”

“Like what?”

“Those dice are an easy example.  You throw the pair, they land in any of 36 different ways, but you don’t know which until you look, right?”

Dice odds

“Yeah, sure.  So?”

“So your uncertainty number is 36.  Suppose they show 7.  There’s still half-a-dozen ways that can happen — first die shows 6, second shows 1, or maybe the first die has the 1 and the second has the 6, and so on.  You don’t know which way it happened.  Your uncertainty number’s gone down from 36 to 6.”

“Wait, but I do know something going in.  It’s a lot more likely they’ll show a 7 than snake-eyes.”

“Good point, but you’re talking probability, the ratio of uncertainty numbers.  Half-a-dozen ways to show a 7, divided by 36 ways total, means that 7 comes up seventeen throws out of a hundred.  Three times out of a hundred you’ll get snake-eyes.  Same odds for boxcars.”

“C’mon, Sy, in my neighborhood little babies know those odds.”

“But do the babies know how odds combine?  If you care about one event OR another you add the odds, like 6 times out of a hundred you get snake-eyes OR boxcars.  But if you’re looking at one event AND another one the odds multiply.  How often did you roll those dice just now?”

“Couple of dozen, I guess.”

“Let’s start with three.  Suppose you got snake-eyes AND you got snake-eyes AND you got snake-eyes.  Odds on that would be 3×3×3 out of 100×100×100 or 27 out of a million triple-throws.  Getting snake-eyes or boxcars 24 times in a row, that’s … ummm … less than one chance in a million trillion trillion sets of 24-throws.  Not likely.”

“Don’t know about the numbers, Sy, but there’s something goofy with these dice.”

Anne cuts in.  “Maybe not, Eddie.  Unusual things do happen.  Let me try.”  She gets half-a-dozen 7s in a row, each time a different way.  “Now you try,” and gives him back the dice.  Now he rolls an 8, 9, 10, 11 and 12 in order.  “They’re not loaded.  You’re just living in a low-probability world.”

“Aw, geez.”

“Anyway, Eddie, entropy is a measure of residual possibilities — alternate conditions (like those ways to 7) that give identical results.  Suppose a physicist is working on a system with a defined number of possible states.  If there’s some way to calculate their probabilities, they can be plugged into a well-known formula for calculating the system’s entropy.  The remarkable thing, Anne, is that what you calculate from the formula matches up with the heat capacity entropy.”

“Here’s your gelato, Mr Moire.   Sorry for the delay, but Jennie dropped by and we got to talking.”

Anne and I trade looks.  “That’s OK, Jeremy, I know how that works.”
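Eddie’s dice make a tidy test bed for both the counting rules and the “well-known formula” Sy alludes to (Shannon’s sum of −p·log p; the choice of natural logs below is mine, not the story’s).  A sketch:

```python
import math
from collections import Counter

# All 36 equally likely ways a pair of dice can land
rolls = [(a, b) for a in range(1, 7) for b in range(1, 7)]
ways = Counter(a + b for a, b in rolls)   # ways to reach each total, 2..12

assert ways[7] == 6                        # half-a-dozen ways to show a 7
assert abs(ways[7] / 36 - 1 / 6) < 1e-12   # seventeen throws per hundred

# OR adds the odds, AND multiplies them:
p_snake_eyes_or_boxcars = ways[2] / 36 + ways[12] / 36   # about 6 in 100
p_three_snake_eyes = (ways[2] / 36) ** 3                 # about 2 in 100,000

# Entropy of the totals: the residual possibilities left after a throw
entropy = -sum((n / 36) * math.log(n / 36) for n in ways.values())
assert entropy < math.log(36)   # knowing the total removes some uncertainty
```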

~~ Rich Olcott

Enter the Elephant, stage right

“Anne?”

“Mm?”

“Remember when you said that other reality, the one without the letter ‘C,’  felt more probable than this one?”

“Mm-mm.”

“What tipped you off?”

“Now you’re asking?”

“I’m a physicist, physicists think about stuff.  Besides, we’ve finished the pizza.”

<sigh> “This conversation has gotten pretty improbable, if you ask me.  Oh, well.  Umm, I guess it’s two things.  The more-probable realities feel denser somehow, and more jangly. What got you on this track?”

“Conservation of energy.  Einstein’s E=mc² says your mass embodies a considerable amount of energy, but when you jump out of this reality there’s no flash of light or heat, just that fizzing sound.  When you come back, no sudden chill or things falling down on us, just the same fizzing.  Your mass-energy has to go to or come from somewhere.  I can’t think where or how.”

“I certainly don’t know, I just do it.  Do you have any physicist guesses?”

“Questions first.”

“If you must.”

“It’s what I do.  What do you perceive during a jump?  Maybe something like falling, or heat or cold?”

“There’s not much ‘during.’  It’s not like I go through a tunnel, it’s more like just turning around.  What I see goes out of focus briefly.  Mostly it’s the fizzy sound and I itch.”

“Itch.  Hmm…  The same itch every jump?”

“That’s interesting.  No, it’s not.  I itch more if I jump to a more-probable reality.”

“Very interesting.  I’ll bet you don’t get that itch if you’re doing a pure time-hop.”

“You’re right!  OK, you’re onto something, give.”

“You’ve met one of my pet elephants.”

“Wha….??”

“A deep question that physics has been nibbling around for almost two centuries.  Like the seven blind men and the elephant.  Except the physicists aren’t blind and the elephant’s pretty abstract.  Ready for a story?”

“Pour me another and I will be.”

“Here you go.  OK, it goes back to steam engines.  People were interested in getting as much work as possible out of each lump of coal they burned.  It took a couple of decades to develop good quantitative concepts of energy and work so they could grade coal in terms of energy per unit weight, but they got there.  Once they could quantify energy, they discovered that each material they measured — wood, metals, water, gases — had a consistent heat capacity.  It always took the same amount of energy to raise its temperature across a given range.  For a kilogram of water at 25°C, for instance, it takes one kilocalorie to raise its temperature to 26°C.  Lead and air take less.”

“So where’s the elephant come in?”

“I’m getting there.  We started out talking about steam engines, remember?  They work by letting steam under pressure push a piston through a cylinder.  While that’s happening, the steam cools down before it’s puffed out as that classic old-time Puffing Billy ‘CHUFF.’  Early engine designers thought the energy pushing the piston just came from trading off pressure for volume.  But a guy named Carnot essentially invented thermodynamics when he pointed out that the cooling-down was also important.  The temperature drop meant that heat energy stored in the steam must be contributing to the piston’s motion because there was no place else for it to go.”

“I want to hear about the elephant.”

“Almost there.  The question was, how to calculate the heat energy.”

“Why not just multiply the temperature change by the heat capacity?”

“That’d work if the heat capacity were temperature-independent, which it isn’t.  What we do is sum up the capacity divided by the temperature at each step along the way.  Call the sum ‘elephant’ though it’s better known as Entropy.  Pressure, Volume, Temperature and Entropy define the state of a gas.  Using those state functions all you need to know is the working fluid’s initial and final state and you can calculate your engine.  Engineers and chemists do process design and experimental analysis using tables of reported state function values for different substances at different temperatures.”

“Do they know why heat capacity changes?”

“That took a long time to work out, which is part of why entropy’s an elephant.  And you’ve just encountered the elephant’s trunk.”

“There’s more elephant?”

“And more of this.  Want a refill?”
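Sy’s running sum over intervening temperatures is, in symbols, ΔS = Σ C(T)·ΔT/T.  Here’s a sketch of that sum with a made-up temperature-dependent heat capacity (the linear C(T) is purely illustrative, not measured data):

```python
import math

def entropy_change(C, T_start, T_end, steps=100_000):
    """Sum C(T)/T over small temperature steps -- approximates dS = C/T dT."""
    dT = (T_end - T_start) / steps
    total = 0.0
    for i in range(steps):
        T = T_start + (i + 0.5) * dT   # midpoint of each little step
        total += C(T) / T * dT
    return total

# Hypothetical heat capacity that drifts upward with temperature
def heat_capacity(T):
    return 4.18 + 0.001 * (T - 298.0)   # kJ kg^-1 K^-1, illustrative only

dS = entropy_change(heat_capacity, 298.0, 373.0)  # warming water, 25 C to 100 C

# If C were constant the sum would be C * ln(T_end/T_start); ours lands nearby
assert abs(dS - 4.18 * math.log(373.0 / 298.0)) < 0.05
```

Chemists’ state-function tables are built from exactly this kind of summation over measured heat capacities.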

~~ Rich Olcott