Schrödinger’s Elephant

Al’s coffee shop sits right between the Astronomy and Physics buildings, which is good because he’s a big Science fan.  He and Jeremy are in an excited discussion when Anne and I walk in.  “Two croissants, Al, and two coffees, black.”

“Comin’ up, Sy.  Hey, you see the news?  Big days for gravitational astronomy.”

Jeremy breaks in.  “There’s a Nobel Prize been announced —”

“Kip Thorne the theorist and Barry Barish the management guy —”

“and Rainer Weiss the instrumentation wizard —”

“shared the Physics prize for getting LIGO to work —”

“and it saw the first signal of a black hole collision in 2015 —”

“and two more since —”

“and confirmed more predictions from relativity theory —”

“and Italy’s got their Virgo gravitational wave detector up and running —”

“And Virgo and our two LIGOs, —”

“Well, they’re both aLIGOs now, being upgraded and all —”

“all three saw the same new wave —”

“and it’s another collision between black holes with weird masses that we can’t account for.  Who’s the lady?”

“Al, this is Anne.  Jeremy, close your mouth, you’ll catch a fly.”  (Jeremy blushes, Anne twinkles.)  “Anne and I are chasing an elephant.”

“Pleased to meetcha, Anne.  But no livestock in here, Sy, the Health Department would throw a fit!”

I grin.  “That’s exactly what Eddie said.  It’s an abstract elephant, Al.  We’ve been discussing entropy. Which is an elephant because it’s got so many aspects no-one can agree on what it is.  It’s got something to do with heat capacity, something to do with possibilities you can’t rule out, something to do with signals and information.  And Hawking showed that entropy also has something to do with black holes.”

“Which I don’t know much about, fellows, so someone will have to explain.”

Jeremy leaps in.  “I can help with that, Miss Anne, I just wrote a paper on them.”

“Just give us the short version, son, she can ask questions if she wants a detail.”

“Yessir.  OK, suppose you took all the Sun’s mass and squeezed it into a ball just a few miles across.  Its density would be so high that escape velocity is faster than the speed of light so an outbound photon just falls back inward and that’s why it’s black.  Is that a good summary, Mr Moire?”

“Well, it might be good enough for an Internet blog but it wouldn’t pass inspection at a respectable science journal.  Photons don’t have mass, so the whole notion of escape velocity doesn’t apply.  You do have some essential elements right, though.  Black holes are regions of extreme mass density, we think denser than anywhere else in the Universe.  A black hole’s mass bends space so tightly around itself that nearby light waves are forced to orbit its region or even spiral inward.  The orbiting happens right at the black hole’s event horizon, the thin shell that encloses the space where things get really weird.  And Anne, the elephant stands on that shell.”

“Wait, Mr Moire, we said that the event horizon’s just a mathematical construct, not something I could stand on.”
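Jeremy’s “squeezed Sun” is easy to put to numbers.  A minimal sketch in Python, with rounded constants: the classical escape-velocity argument, flawed as it is, happens to give the same size as general relativity’s Schwarzschild radius, r = 2GM/c².

```python
# Sketch: size of the "squeezed Sun".  Setting classical escape velocity
# equal to c happens to give the same answer as general relativity's
# Schwarzschild radius, r_s = 2*G*M/c**2.  Constants are rounded.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # mass of the Sun, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius in meters for a non-rotating mass."""
    return 2 * G * mass_kg / c**2

r = schwarzschild_radius(M_sun)
print(f"{r / 1000:.2f} km")   # about 2.95 km radius -- a ball a few miles across
```

A solar mass gives a radius near 3 km, a sphere a few miles in diameter, which is where Jeremy’s “just a few miles across” comes from.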

“And that’s true, Jeremy.  But the elephant’s an abstract construct, too.  So abstract we’re still trying to figure out what’s under the abstraction.”

“I’m trying to figure out why you said the elephant’s standing there.”

“Anne, it goes back to the event horizon’s being a mathematical object, not a real one.  Its spherical surface marks the boundary of the ultimate terra incognita.  Lightwaves can’t pass outward from it, nor can anything material, not even any kind of a signal.  For at least some kinds of black hole, physicists have proven that the only things we can know about one are its mass, spin and charge.  From those we can calculate some other things like its temperature, but black holes are actually pretty simple.”
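The temperature Sy says can be calculated from those few knowns is the Hawking temperature, which depends on the mass alone for an uncharged, non-spinning hole.  A hedged sketch in Python (the formula T = ħc³/8πGMk_B is standard; the constants are rounded):

```python
import math

# Hedged sketch: Hawking temperature of a black hole from its mass alone,
# T = hbar * c^3 / (8 * pi * G * M * k_B).  Constants are rounded.

hbar = 1.0546e-34   # reduced Planck constant, J*s
c = 2.998e8         # speed of light, m/s
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.3807e-23    # Boltzmann constant, J/K
M_sun = 1.989e30    # mass of the Sun, kg

def hawking_temperature(mass_kg):
    """Black-body temperature of the hole's Hawking radiation, in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

T = hawking_temperature(M_sun)
print(f"{T:.2e} K")   # about 6e-8 K; more massive holes are colder
```

Note the mass sits in the denominator: bigger black holes are colder, part of why black holes are “actually pretty simple” yet thermodynamically strange.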

“So?”

“So there’s a collision with Quantum Theory.  One of QT’s fundamental assumptions is that in principle we can use a particle’s current wave function to predict probabilities for its future.  But the wave function information disappears if the particle encounters an event horizon.  Things are even worse if the particle’s entangled with another one.”

“Information, entropy, elephant … it’s starting to come together.”

“That’s what he said.”

~~ Rich Olcott


At The Turn of A Card

Not much going on today.  I’m dealing myself a hand of solitaire when I hear a familiar fizzing sound.  “Hello, Anne.  Good to see you again.”

She’s freshened up that white satin outfit and is looking very good.  “Hello, Sy.  Busy?”

“Not so’s you’d notice it.  What can I do for you?”

“Can’t a girl just drop in when she wants to visit?  Playing with real cards, I see.  That’s good, but your tens and treys are frozen.”

“That’s the way the odds break sometimes.  The elephant‘s in the room.”

“Entropy again?  What’s it look like this time?”

“These cards and surprise.  How surprised would you be if I were to draw a queen from the stock pile?”

“No queens showing, so some surprised but not very surprised.”

“You know me, I’m a physicist, we put numbers to things.  So put numbers to the situation.”

<sigh>  “OK, there are 52 cards in the deck and you’ve got … 28 cards in that triangle, so there are 24 left in the stock.  Four of them have to be queens.  Four out of 24 is one out of 6.”

“Or 17%.  And the odds for the queen of hearts?”

“I’m here so it’s 100% until I leave.  Oh, I know, you’re talking about the cards.  One in 24 or 4%.  So I’d be four times as surprised at seeing the heart queen as I would at seeing any of them.  Pooh.”

“Now how about the odds of drawing all four queens?”

“Four in 24, times three in 23, times two in 22, times one in 21.  Whatever, it’s a very small number and I’d be very surprised.”
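Anne’s card arithmetic can be checked with exact fractions.  A minimal sketch in Python; note that for the all-four-queens case both the queen count and the stock shrink with each draw:

```python
from fractions import Fraction

# Anne's card arithmetic, in exact fractions: 24 cards in the stock, 4 queens.
stock, queens = 24, 4

p_any_queen = Fraction(queens, stock)      # 4/24 = 1/6, about 17%
p_heart_queen = Fraction(1, stock)         # 1/24, about 4%

# All four queens in the next four draws: the odds multiply, with one
# fewer queen and one fewer card in the stock at each step.
p_all_four = (Fraction(4, 24) * Fraction(3, 23)
              * Fraction(2, 22) * Fraction(1, 21))

print(p_any_queen, p_heart_queen, p_all_four)   # 1/6 1/24 1/10626
```

One in 10 626 is indeed “a very small number,” small enough that logarithms become the comfortable way to handle it, which is where the conversation goes next.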

“Well, here’s where we get another look at the elephant.  There’s a definition of entropy that links directly to those percentages AND can handle extremely small ones.  What do you know about logarithms?”

“A little.  I read your last series of posts.”

“Wonderful, that simplifies things.  Let’s start with a strange dissociation thought up by Claude Shannon, to whom we owe the entire field of information theory.  His crucial insight was that he had to distinguish between information and meaning.”

“How can they be different?  If I say ‘green’ that means, well, green.”

“It’s all about context.  If you’re telling me what color something is, saying ‘green’ is telling me that the thing isn’t white or red or any of the other umm, nine colors I know the names of.  But if you’re telling me someone is really inexperienced then I know not to trust them with a complicated task that has to be done right the first time.  From Shannon’s point of view, the information is the signal ‘green,’ and the meaning is set by the context.”

“You’re going somewhere with this, I suppose?”

“Mm-hm.  In Shannon’s theory, the more surprising the message is, the more information it contains.  Remember when you told me that in one of your alternate realities you’d seen me wearing a green shirt?  That was a surprise and it told me you’d visited an unusual reality, because I rarely wear green.  If you’d told me the shirt was black or grey, that would have been much less surprising and much less informative.  Shannon’s trick was in putting numbers to that.”

“You’re just dragging this out, aren’t you?”

“No-no, only two more steps to the elephant.  First step is that Shannon defined a particular signal’s information content to be proportional to the negative of the logarithm of its probability.  Suppose I’m maybe 1% likely to wear green but equally likely to wear any of the other 11 colors.  Each of those colors has a 9% probability.  log10(1%) is -2.00, so the information content is 2.00, but -log10(9%) is only 1.04.  By Shannon’s definition, when you said ‘green’ in this context you gave me nearly twice the information of any of the other color names.”

“Why’d you use base-10 logarithms?”

“Convenience.  It’s easy to figure log10(1%).  Information scientists tend to use base-2, physicists go for base-e.  Final step — Shannon took the information content of each possible signal, multiplied it by the probability of that signal, added those products together and called it the signal system’s information entropy.  For our colors it’d be (0.01×2.00)+(11×0.09×1.046)≈1.06.  Regardez, voici l’éléphant!”
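Shannon’s two steps can be sketched in a few lines of Python, using the 1%-green, 9%-everything-else numbers from the conversation and base-10 logs:

```python
import math

# Shannon's recipe for the shirt-color example: 'green' at 1%,
# each of the other 11 colors at 9%.
probs = [0.01] + [0.09] * 11
assert abs(sum(probs) - 1.0) < 1e-9   # probabilities must total 100%

# Step 1 -- information content of each signal: -log10(probability)
info = [-math.log10(p) for p in probs]
print(round(info[0], 2))   # 2.0 for 'green'
print(round(info[1], 2))   # about 1.05 for each of the other colors

# Step 2 -- entropy: probability-weighted sum of the information contents
H = sum(p * i for p, i in zip(probs, info))
print(round(H, 2))
```

The weighting by probability matters: rare, surprising signals carry lots of information but contribute only rarely to the average.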

“Ooo, French!”

“Aimeriez-vous un croissant et un café?  My treat at Al’s.”

~~ Rich Olcott

Two Sharp Dice

<further along our story arc>  “Want a refill?”

“No, I’ve had enough.  But I could go for some dessert.”

“Nothing here in the office, care for some gelato?”

We take the elevator down to Eddie’s on 2.  Things are slow.  Jeremy’s doing homework behind the gelato display.  Eddie’s at the checkout counter, rolling some dice.  He gives the eye to her white satin.  “You’ll fit right in when the theater crowd gets here, Miss.  Don’t know about you, Sy.”

“Fitting in’s not my thing, Eddie.  This is my client, Anne.  What’s with the bones?”

“Weirdest thing, Sy.  I’m getting set up for the game after closing (don’t tell nobody, OK?) but these dice gotta be bad somehow.  I roll just one, I get every number, but when I roll the two together I get nothin’ but snake-eyes and boxcars.”

I shoot Anne a look.  She shrugs.  I sing out, “Hey, Jeremy, my usual chocolate-hazelnut combo.  For the lady … I’d say vanilla and mint.”

She shoots me a look.  “How’d you know?”

I shrug.  “Lucky guess.  It’s a good evening for the elephant.”

“Hey, no livestock in here, Sy, the Health Department would throw a fit!”

“It’s an abstract elephant, Eddie.  Anne and I’ve been discussing entropy.  Which is an elephant because it’s got so many aspects no-one can agree on what it is.”

“So it’s got to do with luck?”

“With counting possibilities.  Suppose you know something happened, but there’s lots of ways it could have happened.  You don’t know which one it was.  Entropy is a way to measure what’s left to know.”

“Like what?”

“Those dice are an easy example.  You throw the pair, they land in any of 36 different ways, but you don’t know which until you look, right?”


“Yeah, sure.  So?”

“So your uncertainty number is 36.  Suppose they show 7.  There’s still half-a-dozen ways that can happen — first die shows 6, second shows 1, or maybe the first die has the 1 and the second has the 6, and so on.  You don’t know which way it happened.  Your uncertainty number’s gone down from 36 to 6.”

“Wait, but I do know something going in.  It’s a lot more likely they’ll show a 7 than snake-eyes.”

“Good point, but you’re talking probability, the ratio of uncertainty numbers.  Half-a-dozen ways to show a 7, divided by 36 ways total, means that 7 comes up seventeen throws out of a hundred.  Three times out of a hundred you’ll get snake-eyes.  Same odds for boxcars.”

“C’mon, Sy, in my neighborhood little babies know those odds.”

“But do the babies know how odds combine?  If you care about one event OR another you add the odds, like 6 times out of a hundred you get snake-eyes OR boxcars.  But if you’re looking at one event AND another one the odds multiply.  How often did you roll those dice just now?”

“Couple of dozen, I guess.”

“Let’s start with three.  Suppose you got snake-eyes AND you got snake-eyes AND you got snake-eyes.  Odds on that would be 3×3×3 out of 100×100×100 or 27 out of a million triple-throws.  Getting snake-eyes or boxcars 24 times in a row, that’s … ummm … less than one chance in a million trillion trillion sets of 24-throws.  Not likely.”
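Sy’s counting arguments are easy to verify by brute force.  A minimal sketch in Python, enumerating all 36 ways two dice can land:

```python
from fractions import Fraction
from itertools import product

# Enumerate every way two dice can land: 6 x 6 = 36 ordered outcomes.
rolls = list(product(range(1, 7), repeat=2))
assert len(rolls) == 36

def odds(total):
    """Exact probability that the two dice sum to `total`."""
    ways = sum(1 for a, b in rolls if a + b == total)
    return Fraction(ways, 36)

print(odds(7))                   # 1/6 -- "seventeen throws out of a hundred"
print(odds(2))                   # snake-eyes: 1/36, about 3 in a hundred
print(odds(2) + odds(12))        # snake-eyes OR boxcars: the odds add
print((odds(2) + odds(12))**3)   # three such throws in a row: the odds multiply
```

Exact fractions sidestep the rounding in the “3 out of a hundred” shorthand: snake-eyes-or-boxcars is really 1/18, and three in a row is 1/5832.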

“Don’t know about the numbers, Sy, but there’s something goofy with these dice.”

Anne cuts in.  “Maybe not, Eddie.  Unusual things do happen.  Let me try.”  She gets half-a-dozen 7s in a row, each time a different way.  “Now you try,” and gives him back the dice.  Now he rolls an 8, 9, 10, 11 and 12 in order.  “They’re not loaded.  You’re just living in a low-probability world.”

“Aw, geez.”

“Anyway, Eddie, entropy is a measure of residual possibilities — alternate conditions (like those ways to 7) that give identical results.  Suppose a physicist is working on a system with a defined number of possible states.  If there’s some way to calculate their probabilities, they can be plugged into a well-known formula for calculating the system’s entropy.  The remarkable thing, Anne, is that what you calculate from the formula matches up with the heat capacity entropy.”
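The “well-known formula” Sy alludes to is the Gibbs entropy, S = −k_B Σ pᵢ ln pᵢ.  A minimal sketch in Python, using the two dice as a toy system of equally likely states:

```python
import math

# The Gibbs entropy: S = -k_B * sum(p * ln p) over a system's possible states.
k_B = 1.380649e-23   # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Entropy in J/K of a system with the given state probabilities."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Toy "system": two dice, 36 equally likely microstates.
S = gibbs_entropy([1 / 36] * 36)
print(S / k_B)   # ln(36), about 3.58 -- equal odds reduce to Boltzmann's S = k ln W
```

When every state is equally likely the formula collapses to Boltzmann’s S = k ln W, with W the count of residual possibilities — Eddie’s “uncertainty number.”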

“Here’s your gelato, Mr Moire.   Sorry for the delay, but Jennie dropped by and we got to talking.”

Anne and I trade looks.  “That’s OK, Jeremy, I know how that works.”

~~ Rich Olcott

Enter the Elephant, stage right

“Anne?”

“Mm?”

“Remember when you said that other reality, the one without the letter ‘C,’  felt more probable than this one?”

“Mm-mm.”

“What tipped you off?”

“Now you’re asking?”

“I’m a physicist, physicists think about stuff.  Besides, we’ve finished the pizza.”

<sigh> “This conversation has gotten pretty improbable, if you ask me.  Oh, well.  Umm, I guess it’s two things.  The more-probable realities feel denser somehow, and more jangly. What got you on this track?”

“Conservation of energy.  Einstein’s E=mc² says your mass embodies a considerable amount of energy, but when you jump out of this reality there’s no flash of light or heat, just that fizzing sound.  When you come back, no sudden chill or things falling down on us, just the same fizzing.  Your mass-energy has to go to or come from somewhere, and I can’t think where or how.”

“I certainly don’t know, I just do it.  Do you have any physicist guesses?”

“Questions first.”

“If you must.”

“It’s what I do.  What do you perceive during a jump?  Maybe something like falling, or heat or cold?”

“There’s not much ‘during.’  It’s not like I go through a tunnel, it’s more like just turning around.  What I see goes out of focus briefly.  Mostly it’s the fizzy sound and I itch.”

“Itch.  Hmm…  The same itch every jump?”

“That’s interesting.  No, it’s not.  I itch more if I jump to a more-probable reality.”

“Very interesting.  I’ll bet you don’t get that itch if you’re doing a pure time-hop.”

“You’re right!  OK, you’re onto something, give.”

“You’ve met one of my pet elephants.”

“Wha….??”

“A deep question that physics has been nibbling around for almost two centuries.  Like the seven blind men and the elephant.  Except the physicists aren’t blind and the elephant’s pretty abstract.  Ready for a story?”

“Pour me another and I will be.”

“Here you go.  OK, it goes back to steam engines.  People were interested in getting as much work as possible out of each lump of coal they burned.  It took a couple of decades to develop good quantitative concepts of energy and work so they could grade coal in terms of energy per unit weight, but they got there.  Once they could quantify energy, they discovered that each material they measured — wood, metals, water, gases — had a consistent heat capacity.  It always took the same amount of energy to raise its temperature across a given range.  For a kilogram of water at 25°C, for instance, it takes one kilocalorie to raise its temperature to 26°C.  Lead and air take less.”

“So where’s the elephant come in?”

“I’m getting there.  We started out talking about steam engines, remember?  They work by letting steam under pressure push a piston through a cylinder.  While that’s happening, the steam cools down before it’s puffed out as that classic old-time Puffing Billy ‘CHUFF.’  Early engine designers thought the energy pushing the piston just came from trading off pressure for volume.  But a guy named Carnot essentially invented thermodynamics when he pointed out that the cooling-down was also important.  The temperature drop meant that heat energy stored in the steam must be contributing to the piston’s motion because there was no place else for it to go.”

“I want to hear about the elephant.”

“Almost there.  The question was, how to calculate the heat energy.”

“Why not just multiply the temperature change by the heat capacity?”

“That’d work if the heat capacity were temperature-independent, which it isn’t.  What we do is sum up the capacity divided by the temperature, at each intervening temperature.  Call the sum ‘elephant’ though it’s better known as Entropy.  Pressure, Volume, Temperature and Entropy define the state of a gas.  Using those state functions, all you need to know is the working fluid’s initial and final state and you can calculate your engine.  Engineers and chemists do process design and experimental analysis using tables of reported state-function values for different substances at different temperatures.”
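The summing-up Sy describes is, in the limit, the integral ΔS = ∫ C(T)/T dT.  A minimal sketch in Python, with a hypothetical heat-capacity function (roughly water-sized, invented here just to show the bookkeeping):

```python
# Entropy change from heat capacity: dS = C(T)/T dT, summed (integrated)
# across each intervening temperature.  C(T) below is hypothetical,
# chosen only to be roughly water-like in magnitude.

def entropy_change(C, T1, T2, steps=100_000):
    """Midpoint-rule integral of C(T)/T from T1 to T2 (kelvin)."""
    dT = (T2 - T1) / steps
    total = 0.0
    for i in range(steps):
        T = T1 + (i + 0.5) * dT
        total += C(T) / T * dT
    return total

C = lambda T: 4184.0 + 0.5 * (T - 298.0)   # hypothetical heat capacity, J/K

dS = entropy_change(C, 298.15, 299.15)     # warming from 25 C to 26 C
print(round(dS, 2))   # about 14 J/K
```

Because the capacity varies with temperature, the step-by-step sum is what the tables of state-function values encode, rather than a single capacity-times-range product.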

“Do they know why heat capacity changes?”

“That took a long time to work out, which is part of why entropy’s an elephant.  And you’ve just encountered the elephant’s trunk.”

“There’s more elephant?”

“And more of this.  Want a refill?”

~~ Rich Olcott