A Matter of Degree

“Wait, Sy, you said something about my matryoshka‑cascade multiverse, that the speed of light might not match between mama and baby Universes. How can that be?”

“Deep question, Susan. The answer is that we don’t know. Maybe gravitational stress at a supermassive black hole’s singularity is intense enough to birth a new Universe inside the Event Horizon, or maybe not. Suppose it does. We don’t have theories strong enough to determine whether the speed of light inside there would or would not match the one we have out here.”

“Talk about pregnant questions.” <sips latte> “Ah! Here’s another thing. Both my matryoshki and your bubbly multiverse are about spreading Universes across space. Neither one addresses the timeline splits we started talking about. Maybe I decide on noodles for lunch and another me in a different Universe opts for a sandwich, but how about one me that splits to follow parallel paths right here? Could a multiverse work that way?”

“Another deep question. Timeline splits require a five‑dimensional spacetime. Want to talk about that?”

“Just a moment. Oh, Al, can I have another mocha latte, please, and add a dash of peppermint to it.”

“That’s a change from your usual recipe, Susan.”

“Yes,” <side glance my way> “I’m splitting my timeline. Thanks, Al. Ok, Sy, let’s go for it.”

“It’s about degrees of freedom.”

“I like freedom, but I didn’t know it comes in degrees.”

“In certain contexts that’s a matter of geography, law and opinion. I’m talking Physics here. For physicists each degree of freedom in a system is a relevant variable that’s independent of other specifications. Location parameters are a prime example. On a Star Trek vessel, how does the Captain specify a heading?”

“When they know where they’re going she’ll say ‘Set coordinates for’ wherever, but for a course change she’ll say ‘some‑number MARK some‑number’. Ah, got it — that’s like latitude and longitude, two arcs along perpendicular circles. Two angles and a distance to the target make three degrees of freedom, right?”
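(Susan’s “two angles and a distance” are just spherical coordinates. A minimal Python sketch of the conversion to three Cartesian degrees of freedom — the function name and angle conventions are my own, not Starfleet’s:)

```python
import math

def mark_to_cartesian(azimuth_deg, elevation_deg, distance):
    """Convert a 'number MARK number' heading plus a distance
    into Cartesian x, y, z -- the same three degrees of freedom."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return x, y, z

print(mark_to_cartesian(0.0, 0.0, 10.0))  # straight ahead: (10.0, 0.0, 0.0)
```

Either pair of conventions works; the point is only that two angles plus one distance pin down the same point that three distances would.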

“A‑k‑a three dimensions of space. How about time?”

“All you can do is go forward, no freedom.”

“Not quite. Conceptually at least, you can go forward and back. Timewise we’re moving along a line. That’s a one‑dimensional thing. Combine time and space as Minkowski recommended and you’ve got a four‑dimensional spacetime. Relativity may serve us time at different rates but we’re still trapped on that line.”

“Ah, now I see why you said five dimensions. High school geometry — you’d need a second time dimension to angle away from the one we’re on. Ooo, if it’s an angle we could do time‑trigonometry, like the sine would measure how different two timelines get divided by how long it took to get there.”

“Cute idea, Susan, but defining time fractures in terms of time would be a challenge. I think a better metric would be probability, like what are the odds that things would be this different?”

A rustle of satin behind me and a familiar voice like molten silver. “Hello, Sy, I read your posts about multiverses so I thought I’d drop by. You’re Susan? Hi, my name’s Anne.”

“Um … hello.” Anne is kind of breath‑taking.

“Hi, Anne. It’s been a while. Funny you should show up just as we’re getting to the idea of a probability dimension.”

“Mm-hm, how ’bout that? Sorry, Susan, but time‑trig won’t work. I’ve got a better idea for you. Sy’s physicists are so used to thinking thermodynamically. Entropy’s based on probability, isn’t it, Sy? The split‑off dimension should be marked off in units of information entropy.” <giggle> “You haven’t told Susan your twenty‑dimension idea yet, have you?”

“Anne, you’ve always been too fast for me. Susan, the Physics we have so far still has about twenty fundamental constants — numbers like the speed of light — whose values we can’t explain in our best models of how things work. Think of each as a coordinate in a twenty‑plus‑four‑dimensional hyper‑Universe. The Anthropic Principle says we and my entire bubble Universe happen to be at the twenty‑way intersection where those coordinates are just right for life to exist. Each of your matryoshki Universes may or may not be there.”

“Lucky, aren’t we?”

~~ Rich Olcott

Candle, Candle, Burning Bright

<chirp, chirp> “Moire here.”

“Hi, Sy, it’s Susan Kim. I did a little research after our chat. The whale oil story isn’t quite what we’re told.”

“Funny, I’ve been reading up on whales, too. So what’s your chemical discovery?”

“What do we get from a fire, Sy?”

“Light, heat and leftovers.”

“Mm-hm, and back in 18th Century America, there was plenty of wood and coal for heat. Light was the problem. I can’t imagine young Abe Lincoln reading by the flickering light of his fireplace — he must have had excellent eyesight. If you wanted a mostly steady light you burned some kind of fat, either wax candles or oil lamps.”

“Wait, aren’t fat and wax and oil three different things?”

“Not to a chemist. Fat’s the broadest category, covers molecules and mixtures with chains of ‑CH2‑ groups that don’t dissolve in water. Maybe the chains include a few oxygen atoms but the molecules are basically hydrocarbons. Way before we knew about molecules, though, we started classifying fats by whether or not the material is solid at room temperature. Waxes are solid, oils are liquid. You’re thinking about waxy‑looking coconut oil, aren’t you?”

“Well….”

“Coconuts grow where rooms are warm so we call it an oil, OK? I think it’s fun that you can look at a molecular structure and kind of predict whether the stuff will be waxy or oily.”

“How do you do that?”

“Mmm… It helps to know that a long chain of ‑CH2‑ groups tends to be straight‑ish but if there’s an ‑O‑ link in the chain the molecule can bend and even rotate there. Also, you get a kink in the chain wherever there’s a –CH=CH– double bond. We call that a point of unsaturation.”

“Ah, there’s a word I recognize, from foodie conversations. Saturated, unsaturated, polyunsaturated — that’s about double bonds?”

“Yup. So what does your physicist intuition make of all that?”

“I’d say the linear saturated molecules ought to pack together better than the bendy unsaturated ones. Better packing means lower entropy, probably one of those solid waxes. The more unsaturation or more ‑O‑ links, the more likely something’s an oil. How’d I do?”

“Spot on, Sy. Now carry it a step further. Think of a –CH2– chain as a long methane. How do you suppose the waxes and oils compare for burning?”

“Ooo, now that’s interesting. O2 has much better access to fuel molecules if they’re in the gas phase so a good burn would be a two‑step process — first vaporization and then oxidation. Oils are already liquid so they’d go gaseous more readily than an orderly solid wax of the same molecular weight. Unless there’s something about the –O– links that ties molecules together…”

“Some kinds have hydrogen-bond bridging but most of them don’t.”

“OK. Then hmm… Are the double-bond kinks more vulnerable to oxygen attack?”

“They are, indeed, which is why going rancid is a major issue with the polyunsaturated kinds.”

“Oxidized hydrocarbon fragments can be stinky, huh? Then I’d guess that oil flames tend to be smellier than wax flames. And molecules we smell aren’t getting completely oxidized so the flame would probably be smokier, too. And sootier. Under the same conditions, of course.”

“Uh-huh. Would you be surprised if I told you that flames from waxes tend to be hotter than the ones from oils?”

“From my experience, not surprised. Beeswax candlelight is brighter and whiter than the yellow‑orange light I saw when the frying oil caught fire. Heat glow changes red to orange to yellow to white as the source gets hotter. Why would the waxes burn hotter?”

“I haven’t seen any studies on it. I like to visualize those straight chains as candles burning from the ends and staying alight longer than short oil fragments can, but that’s a guess. Ironic that a hydrogen flame is just a faint blue, even though it’s a lot hotter than any hydrocarbon flame. Carbon’s the key to flamelight. Anyway, the slaughter started when we learned a mature sperm whale’s head holds 500 gallons of waxy spermaceti that burns even brighter than beeswax.”

~~ Rich Olcott

  • Whale image adapted from a photo by Gabriel Barathieu CC BY SA 2.0

Three Ways To Get Dizzy

<FZzzzzzzzzzzzzzzzzzzzzzzttt!> “Urk … ulp … I need to sit down, quick.”

“Anne? Welcome back, the couch is over there. Goodness, you do look a little green. Can I get you something to drink?”

“A little cool water might help, thanks.”

“Here. Just sit and breathe. That wasn’t your usual fizzing sound when you visit my office. When you’re ready tell me what happened. Must have been an experience, considering some of your other superpower adventures. Where did you ‘push’ to this time?”

“Well, you know when I push forward I go into the future and when I push backward I go into the past. When I push up or down I get bigger or smaller. You figured out how pushing sideways kicks me to alternate probabilities. And then <shudder> there was that time I found a new direction to push and almost blew up the Earth.”

“Yes, that was a bad one. I’d think you’ve pretty well used up all the directions, though.”

“Not quite. This time I pushed outwards, the same in every direction.”

“Creative. And what happened?”

“Suddenly I was out in deep space, just tumbling in the blackness. There wasn’t an up or down or anything. I couldn’t even tell how big I was. I could see stars way off in the distance or maybe they were galaxies, but they were spinning all crazy. It took me a minute to realize it was me that was spinning, gyrating in several ways at once. It was scary and nauseating but I finally stopped part of it.”

“Floating in space with nothing to kill your angular momentum … how’d you manage to stabilize yourself at all?”

“Using my push superpower, of course. The biggest push resistance is against the past. I pulled pastward from just my shoulders and that stopped my nose‑diving but I was still whirling and cart‑wheeling. I tried to stop that with my feet but that only slowed me down and I was getting dizzy. My white satin had transformed into a spacesuit and I definitely didn’t want to get sick in there so I came home.”

“How’d you do that?”

“Oh, that was simple, I pulled inward. I had to um, zig‑zag? until I got just the right amount.”

“That explains the odd fizzing. I’m glad you got back. Looks like you’re feeling better now.”

“Mostly. Whew! So, Mr Physicist Sy, help me understand it all. <her voice that sounds like molten silver> Please?”

“Well. Um. There’s a couple of ways to go here. I’ll start with degrees of freedom, okay?”

“Whatever you say.”

“Right. You’re used to thinking in straight‑line terms of front/back, left/right and up/down, which makes sense if you’re on a large mostly‑flat surface like on Earth. In mathspeak each of those lines marks an independent degree of freedom because you can move along it without moving along either of the other two.”

“Like in space where I had those three ways to get dizzy.”

“Yup, three rotations at right angles to each other. Boatmen and pilots call them pitch, roll and yaw. Three angular degrees of freedom. Normal space adds three x-y-z straight‑line degrees, but you wouldn’t have been able to move along those unless you brought along a rocket or something. I guess you didn’t, otherwise you could have controlled that spinning.”

“Why would I have carried a rocket when I didn’t know where I was going? Anyhow, my push‑power can drive my straight‑line motion except I didn’t know where I was and that awful spinning had me discombobulated.”

“Frankly, I’m glad I don’t know how you feel. Anyhow, if measurable motion is defined along a degree of freedom the measurement is called a coordinate. Simple graphs have an x-coordinate and a y-coordinate. An origin plus almost any three coordinates makes a coordinate system able to locate any point in space. The Cartesian x-y-z system uses three distances, or you can have two distances and an angle, that’s cylindrical coordinates, or two angles and one distance and that’s spherical coordinates.”
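(Sy’s three coordinate systems describe the same point. A small Python sketch of the conversions — the function names and symbol choices are mine:)

```python
import math

def to_cylindrical(x, y, z):
    """Cartesian -> cylindrical: two distances (rho, z) and one angle (phi)."""
    return math.hypot(x, y), math.atan2(y, x), z

def to_spherical(x, y, z):
    """Cartesian -> spherical: one distance (r) and two angles (theta, phi).
    Assumes the point isn't at the origin (r > 0)."""
    r = math.sqrt(x * x + y * y + z * z)
    return r, math.acos(z / r), math.atan2(y, x)
```

Each function still hands back exactly three numbers — the number of degrees of freedom doesn’t change, only how you slice them into distances and angles.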

“Three angles?”

“You don’t know where you are.”

<shudder>

~~ Rich Olcott

The Latte Connection

An early taste of Spring’s in the air so Al’s set out tables in front of his coffee shop. I’m enjoying my usual black mud when the Chemistry Department’s Susan Kim passes by carrying her usual mocha latte. “Hi, Sy, mind if I take the socially distant chair at your table?”

“Be my guest, Susan. What’s going on in your world?”

“I’ve been enjoying your hysteresis series. It took me back to Physical Chemistry class. I’m intrigued by how you connected it to entropy.”

“How so?”

“I think of hysteresis as a process, but entropy is a fixed property of matter. If I’m holding twelve grams of carbon at room temperature, I know what its entropy is.”

“Mmm, sorta. Doesn’t it make a difference whether the carbon’s a 60‑carat diamond or just a pile of soot?”

“OK, I’ll give you that, the soot’s a lot more random than the diamond so its entropy is higher. The point remains, I could in principle measure a soot sample’s heat capacity at some convenient temperature and divide that by the temperature. I could repeat that at lower and lower temperatures down to near absolute zero. When I sum all those measurements I’ll have the entropy content of the sample at my starting temperature.”
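(Susan’s sum of heat‑capacity‑over‑temperature measurements is the classical Third‑Law entropy integral, S(T) = ∫ (Cp/T) dT from near absolute zero. A toy numerical sketch — the T³ heat capacity is a Debye‑style stand‑in, not real soot data:)

```python
def entropy_from_heat_capacity(temps, heat_caps):
    """Trapezoid-rule estimate of S(T) = integral of Cp/T dT
    over measured (temperature, heat-capacity) pairs, lowest T first."""
    s = 0.0
    for i in range(len(temps) - 1):
        t1, t2 = temps[i], temps[i + 1]
        c1, c2 = heat_caps[i], heat_caps[i + 1]
        s += 0.5 * (c1 / t1 + c2 / t2) * (t2 - t1)
    return s

# Debye-like low-temperature solid: Cp = a*T**3, for which S(T) = Cp(T)/3 exactly.
a = 1.0e-9
temps = [0.1 * k for k in range(1, 3001)]   # 0.1 K up to 300 K
caps = [a * t ** 3 for t in temps]
```

With the made‑up T³ data the numerical sum lands on the known closed‑form answer, which is the check that the measure‑and‑add procedure Susan describes really does accumulate the entropy.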

“A classical definition, just what I’d expect from a chemist. But suppose your soot spills out of its test tube and the breeze spreads it all over the neighborhood. More randomness, higher entropy than what you measured, right?”

“Well, yes. I wouldn’t have a clue how to calculate it, but that goes way beyond Carnot’s and Clausius’ original concept.”

“So entropy has at least a thin linkage with history and hysteresis. To you chemists, though, an element or compound is timeless — lead or water have always been lead or water, and their physical constants are, well, constant.”

“Not quite true, Sy. Not with really big molecules like proteins and DNA and rubber and some plastics. Squirt a huge protein like catalase through a small orifice and its properties change drastically. It might not promote any reaction, much less the one Nature designed it for. Which makes me think — Chemistry is all about reactions and they take time and studying what makes reactions run fast or slow is a big part of the field. So we do pay attention to time.”

“Nice play, Susan! You’re saying small molecules aren’t complex enough to retain memories but big ones are. I’ll bet big molecules probably exhibit hysteresis.”

“Sure they do. Rubber molecules are long-chain polymers. Quickly stretch a rubber band to its limit, hold it there a few seconds then let go. Some of the molecular strands lock into the stretched configuration so the band won’t immediately shrink all the way down to its original size. There’s your molecular memory.”

“And a good example it is — classic linear Physics. How much force you exert, times the distance you applied it through, equals the energy you expended. Energy’s stored in the rubber’s elasticity when you stretch it, and the energy comes back out on release.”

“Mostly right, Sy. You actually have to put in more energy than you get out — Second Law of Thermodynamics, of course — and the relationship’s not linear. <rummaging into purse> Thought I had a good fat rubber band somewhere … ah‑hah! Here, stretch this out while you hold it against your forehead. Feel it heat up briefly? Now keep checking for heat while you relax the band.”

“Hey, it got cold for a second!”

“Yep. The stretched-out configuration is less random, so its entropy and heat capacity are lower than the relaxed configuration’s. The stretched band held the same amount of heat energy, but with less heat required per degree of temperature, that energy made the band hotter. Relaxing the band let its molecules get less orderly. Heat capacity went back up; temperature went back down.”

“Mmm-HM. My hysteresis diagram’s upward branch is stretch energy input and the downward branch is elastic energy output. The energy difference is the area inside the hysteresis curve, which is what’s lost to entropy in each cycle and there we have your intriguing entropy‑hysteresis connection. Still intrigued?”
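(Sy’s “area inside the hysteresis curve” is directly computable. A sketch using the shoelace formula on a closed force‑versus‑extension loop — the sample points are invented, purely for illustration:)

```python
def loop_area(extension, force):
    """Shoelace formula: area enclosed by a closed hysteresis loop,
    i.e. the energy lost to entropy in one stretch-relax cycle."""
    n = len(extension)
    area = 0.0
    for i in range(n):
        j = (i + 1) % n
        area += extension[i] * force[j] - extension[j] * force[i]
    return abs(area) / 2.0

# Invented loop: the stretch branch (rising force) sits above the relax branch.
xs = [0.0, 1.0, 1.0]   # extension, metres
fs = [0.0, 2.0, 1.0]   # force, newtons
energy_lost = loop_area(xs, fs)   # joules per cycle
```

The upward and downward branches bound the loop; whatever their shapes, the enclosed area is the per‑cycle energy that never comes back out of the rubber.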

“Enough for another latte.”

~~ Rich Olcott

Free Energy, or Not

From: Richard Feder <rmfeder@fortleenj.com>
To: Sy Moire <sy@moirestudies.com>
Subj: Questions

What’s this about “free energy”? Is that energy that’s free to move around anywhere? Or maybe the vacuum energy that this guy said is in the vacuum of space that will transform the earth into a wonderful world of everything for free for everybody forever once we figure out how to handle the force fields and pull energy out of them?


From: Sy Moire <sy@moirestudies.com>
To: Richard Feder <rmfeder@fortleenj.com>

Subj: Re: Questions

Well, Mr Feder, as usual you have a lot of questions all rolled up together. I’ll try to take one at a time.

It’s clear you already know that to make something happen you need energy. Not a very substantial definition, but then energy is an abstract thing it took humanity a couple of hundred years to get our minds around and we’re still learning.

Physics has several more formal definitions for “energy,” all clustered around the ability to exert force to move something and/or heat something up. The “and/or” is the kicker, because it turns out you can’t do just the moving. As one statement of the Second Law of Thermodynamics puts it, “There are no perfectly efficient processes.”

For example, when your car’s engine burns a few drops of gasoline in the cylinder, the liquid becomes a 22000‑times larger volume of hot gas that pushes the piston down in its power stroke to move the car forward. In the process, though, the engine heats up (wasted energy), gases exiting the cylinder are much hotter than air temperature (more wasted energy) and there’s friction‑generated heat all through the drive train (even more waste). Improving the drive train’s lubrication can reduce friction, but there’s no way to stop energy loss into heated-up combustion product molecules.
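The Second Law also puts a hard ceiling on how much of that combustion heat could ever become motion, independent of lubrication and engineering cleverness. A sketch of the Carnot bound — the temperatures are illustrative, not measured engine data:

```python
def carnot_limit(t_hot_kelvin, t_cold_kelvin):
    """Maximum fraction of input heat any engine can convert to work
    while operating between these two temperatures (Second Law bound)."""
    return 1.0 - t_cold_kelvin / t_hot_kelvin

# Illustrative numbers: ~1800 K combustion gas, ~300 K ambient air.
print(carnot_limit(1800.0, 300.0))  # about 0.83 -- real engines do far worse
```

Every loss Sy lists (hot block, hot exhaust, drivetrain friction) eats into the gap between that ideal bound and what the wheels actually receive.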

Two hundred years of effort haven’t uncovered a usable loophole in the Second Law. However, we have been able to quantify it. Especially for practically important chemical reactions, like burning gasoline, scientists can calculate how much energy the reaction product molecules will retain as heat. The energy available to do work is what’s left.

For historical reasons, the “available to do work” part is called “free energy.” Not free like running about like ball lightning, but free in the sense of not being bound up in jiggling heated‑up molecules.

Vacuum energy is just the opposite of free — it’s bound up in the structure of space itself. We’ve known for a century that atoms waggle back and forth within their molecules. Those vibrations give rise to the infrared spectra we use for remote temperature sensing and for studying planetary atmospheres. One of the basic results of quantum mechanics is that there’s a minimum amount of motion, called zero‑point vibration, that would persist even if the molecule were frozen to absolute zero temperature.

There are other kinds of zero‑point motion. We know of two phenomena, the Casimir effect and the Lamb shift, that can be explained by assuming that the electric field and other force fields “vibrate” at the ultramicroscopic scale even in the absence of matter. Not vibrations like going up and down, but like getting more and less intense. It’s possible that the same “vibrations” spark radioactive decay and some kinds of light emission.

Visualize space being marked off with a mesh of cubes. In each cube one or more fields more‑or‑less periodically intensify and then relax. The variation strength and timing are unpredictable. Neighboring squares may or may not sync up and that’s unpredictable, too.

The activity is all governed by yet another Heisenberg’s Uncertainty Principle trade‑off. The stronger the intensification, the less certain we can be about when or where the next one will happen.

What we can say is that whether you look at a large volume of space (even an atom is ultramicroscopicly huge) or a long period of time (a second might as well be a millennium), on the average the intensity is zero. All our energy‑using techniques involve channeling energy from a high‑potential source to a low‑potential sink. Vacuum energy sources are everywhere but so are the sinks and they all flit around. Catching lightning in a jar was easy by comparison.
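(A toy illustration of that last point, emphatically not a physical model: sample many random “cell” intensities and the average tends toward zero even though individual fluctuations don’t.)

```python
import random

def mean_intensity(n_cells=100_000, seed=42):
    """Toy vacuum: each cell gets an unpredictable fluctuation around zero.
    Individually they can be large; averaged over many cells (or moments),
    they cancel out."""
    rng = random.Random(seed)
    total = sum(rng.gauss(0.0, 1.0) for _ in range(n_cells))
    return total / n_cells
```

Individual samples routinely exceed ±1, yet the mean over 100 000 of them sits within a few thousandths of zero — which is the sense in which the sources and sinks are everywhere and nowhere.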

Regards,
Sy Moire.

~~ Rich Olcott

Sisyphus on A Sand Dune

I’m walking the park’s paths on a lovely early Spring day when, “There you are, Moire. I got a question!”

“As you always do, Mr Feder. What’s your question this time?”

“OK, this guy’s saying that life is all about fighting entropy but entropy always increases anyway. I seen nothing in the news about us fighting entropy so where’s he get that? Why even bother if we’re gonna lose anyway? Where’s it coming from? Can we plug the holes?”

“That’s 4½ questions with a lot of other stuff hiding behind them. You’re going to owe me pizza at Eddie’s AND a double-dip gelato.”

“You drive a hard bargain, Moire, but you’re on.”

“Deal. Let’s start by clearing away some underbrush. You seem to have the idea that entropy’s a thing, like water, that it flows around and somehow seeps into our Universe. None of that’s true.”

“That makes no sense. How can what we’ve got here increase if it doesn’t come from somewhere?”

“Ah, I see the problem — conservation. Physicists say there are two kinds of quantities in the Universe — conserved and non‑conserved. The number of cards in a deck is a conserved quantity because it’s always 52, right?”

“Unless you’re in a game with Eddie.”

“You’ve learned that lesson, too, eh? With Eddie the system’s not closed because he occasionally adds or removes a card. Unless we catch him at it and that’s when the shouting starts. So — cards are non-conserved if Eddie’s in the game. Anyway, energy’s a conserved quantity. We can change energy from one form to another but we can’t create or extinguish energy, OK?”

“I heard about that. Sure would be nice if we could, though — electricity outta nothing would save the planet.”

“It would certainly help, and so would making discarded plastic just disappear. Unfortunately, mass is another conserved quantity unless you’re doing subatomic stuff. Physicists have searched for other conserved quantities because they make calculations simpler. Momentum’s one, if you’re careful how you define it. There’s about a dozen more. The mass of water coming out of a pipe exactly matches the mass that went in.”

“What if the pipe leaks?”

“Doesn’t matter where the water comes out. If you measure the leaked mass and the mass at the pipe’s designed exit point the total outflow equals the inflow. But that gets me to the next bit of underbrush. Energy’s conserved, that’s one of our bedrock rules, but energy always leaks and that’s another bedrock rule. The same rule also says that matter always breaks into smaller pieces if you give it a chance though that’s harder to calculate. We measure both leakages as entropy. Wherever you look, any process that converts energy or matter from one form to another diverts some fraction into bits of matter in random motion and that’s an increase of entropy. One kind of entropy, anyway.”

“Fine, but what’s all this got to do with life?”

“It’s all to get us to where we can talk about entropy in context. You’re alive, right?”

“Last I looked.”

“Ever break a bone?”

<taps his arm> “Sure, hasn’t everybody one time or another?”

“Healed up pretty well, I see. Congratulations. Right after the break that arm could have gone in lots of directions it’s not supposed to — a high entropy situation. So you wore a cast while your bone cells worked hard to knit you together again and lower that entropy. Meanwhile, the rest of your body kept those cells supplied with energy and swept away waste products. You see my point?”

“So what you’re saying is that mending a broken part uses up energy and creates entropy somewhere even though the broken part is less random. I got that.”

“Oh, it goes deeper than that. If you could tag one molecule inside a living cell you’d see it bouncing all over the place until it happens to move where something grabs it to do something useful. Entropy pushes towards chaos, but the cell’s pattern of organized activity keeps chaos in check. Like picnicking on a windy day — only constant vigilance maintains order. That’s the battle.”

“Hey, lookit, Eddie’s ain’t open. I’ll owe you.”

“Pizza AND double-dip gelato.”

~~ Rich Olcott

At The Old Curiosity Shop

An imposing knock at the door, both impetuous and imperious.  I figured it for an Internet denizen.  “C’mon in, the door’s open.”

“You’re Moire?”

“I am.  And you are..?”

“The name’s Feder, Richard Feder, from Fort Lee, NJ.  I’m a stand-in for some of your commenters.”

“Ah, the post of business past.  You have a question?”

“Yeah.  How come hot water can freeze faster than cold water?”

“That’s really two questions. The first is, ‘Can hot water freeze faster than cold water?’ and the second is, ‘How come?’  To the surprise of a lot of physicists, the experimental answer to the first question is, ‘Yes, sometimes.’  But it’s only sometimes and even that depends on how you define freeze.”

“What’s to define?  Frozen is frozen.”

“Not so fast.  Are we talking surface ice formation, or complete solidification, or maybe just descent to freezing temperature?  Three very different processes.  There’s multiple reports of anomalous behavior for each one, but many of the reports have been contested by other researchers.  Lots of explanations, too.  The situation reminds me of Anne’s Elephant.”

“Why an elephant?  And who’s Anne?”

“Remember the old story about the blind men trying to figure out an elephant?  The guy touching its trunk said it’s a snake, the one at its side said it’s a wall, the dude at its leg said it’s a tree, and so on?  The descriptions differed because each observer had limited knowledge of something complicated.  This chilled-water issue is like that — irreproducible experiments because of uncontrolled unknown variables, mostly maybes on the theory side because we’re still far from a fundamental understanding.”

“Who’s Anne?”

“Anne is … an experience.  I showed her how the notion of Entropy depends on how you look at it.  Scientists have looked at this paradoxical cooling effect pretty much every way you can think of, trying to rule out various hypotheses.  Different teams have both found and not found the anomaly working with distilled water and with tap water, large amounts and small, in the open air and in sealed containers, in glass or metal containers, with and without stirring, with various pre-washing regimens or none, using a variety of initial and final temperatures.  They’ve clocked the first appearance of surface ice and complete opacity of the bulk.  They’ve tracked temperature’s trajectory in the middle of the container or near its wall… you name it.  My favorite observation was the 20th Century’s first-published one — in 1963 Erasto Mpemba noticed the effect while preparing ice cream.”

“What flavor?  Never mind.  Is there a verdict?”

“Vaguely.  Once you get approximately the right conditions, whether or not you see the effect seems to be a matter of chance.  The more sophisticated researchers have done trials in the hundreds and then reported percentages, rather than just ‘we see it’ or not.  Which in itself is interesting.”

“How’s that?”

“Well, to begin with, the percents aren’t zero.  That answers your first question — warm water sometimes does freeze faster than cold.  Better yet, the variability tells us that the answer to your second question is at the nanoscopic level.  Macroscopic processes, even chemical ones, have statistics that go the same way all the time.  Put a lit match to gasoline in air, you’ll always get a fire.  But if you set out 100 teaspoons of water under certain conditions and 37 of them freeze and the others don’t, something very unusual must be going on that starts with just a few molecules out of the 10²³ in those teaspoons.”
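(Sy’s 37‑of‑100 figure is exactly the kind of number binomial statistics handles. A sketch of the observed rate and its sampling uncertainty — the counts are Sy’s hypothetical, not data from any real Mpemba trial:)

```python
import math

def freeze_statistics(frozen, trials):
    """Observed freeze fraction and its binomial standard error,
    treating each sample as an independent yes/no trial."""
    p = frozen / trials
    se = math.sqrt(p * (1.0 - p) / trials)
    return p, se

p, se = freeze_statistics(37, 100)   # p = 0.37, se ~ 0.048
```

A standard error near five percentage points is why the careful teams run hundreds of trials before quoting a rate — and why a nonzero rate, once established, is itself the interesting result.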

“Weird odds.”

“This experiment’s even more interesting.  You’ve got two bottles of water.  You heat up bottle A and let it cool to room temperature.  B’s been at room temperature all along.  You put ’em both in the fridge and track their temperatures.  A cools quicker.”

“That’s where I came in.”

“Both start at the same temperature, finish at the same temperature, and their Joules-per-second energy-shedding rates should be the same.  A cools in less time so A releases less heat.  Entropy change is released heat energy divided by temperature.  Somehow, bottle A went into the fridge with less entropy than B had.  Why?  We don’t really know.”

~~ Rich Olcott

  • Thanks to Ilias Tirovolas, whose paper inspired this post.

Meanwhile, back at the office

Closing time.  Anne and I stroll from Al’s coffee shop back to the Acme Building.  It’s a clear night with at least 4,500 stars, but Anne’s looking at the velvet black between them.

“What you said, Sy, about the Universe not obeying Conservation of Energy — tell me more about that.”

“Aaa-hmmm … OK.  You’ve heard about the Universe expanding, right?”

“Ye-es, but I don’t know why that happens.”

“Neither do the scientists, but there’s pretty firm evidence that it’s happening, if only at the longest scales.  Stars within galaxies get closer together as they radiate away their gravitational energy.  But the galaxies themselves are getting further apart, as far out as we can measure.”

“What’s that got to do with Conservation of Energy?”

“Well, galaxies have mass so they should be drawn together by gravity the way that gravity pulls stars together inside galaxies.  But that’s not what’s happening.  Something’s actively pushing galaxies or galaxy clusters away from each other.  Giving the something a name like ‘dark energy’ is just an accounting gimmick to pretend the First Law is still in effect at very large distances — we don’t know the energy source for the pushing, or even if there is one.  There’s a separate set of observations we attribute to ‘dark matter’ that may or may not have the same underlying cause.  That’s what I was talking about.”

We’re at the Acme Building.  I flash my badge to get us past Security and into the elevator.  As I reach out to press the ’12’ button she puts her hand on my arm.  “Sy, I want to see if I understand this entropy-elephant thing.  You said entropy started as an accounting gimmick, to help engineers keep track of fuel energy escaping into the surroundings.  Energy absorbed at one temperature they called the environment’s heat capacity.  Total energy absorbed over a range of temperatures, divided by the difference in temperature, they called change in entropy.”

The elevator lets us out on my floor and we walk to door 1217.  “You’ve got it right so far, Anne.  Then what?”

“Then the chemists realized that you can predict how lots of systems will work from only knowing a certain set of properties for the beginning and end states.  Pressure, volume, chemical composition, whatever, but also entropy.  But except for simple gases they couldn’t predict heat capacity or entropy, only measure it.”

My key lets us in.  She leans back against the door frame.  “That’s where your physicists come in, Sy.  They learned that heat in a substance is actually the kinetic energy of its molecules.  Gas molecules can move around, but that motion’s constrained in liquids and even more constrained in solids.  Going from solid to liquid and from liquid to gas absorbs heat energy in breaking those constraints.  That absorbed heat appears as increased entropy.”

She’s lounging against my filing cabinet.  “The other way that substances absorb heat is for parts of molecules to rotate and vibrate relative to other parts.  But there are levels.  Some vibrations excite more easily than others, and many rotations are easier still.  In a cold material only some motions are active.  Rising temperature puts more kinds of motion into play.  Heat energy spreads across more and more sub-molecular absorbers.”

She’s perched on the edge of my desk.  “Here’s where entropy as possibility-counting shows up.  More heat, more possibilities, more entropy.  Now we can do arithmetic and prediction instead of measuring.  Anything you can count possibilities for you can think about defining an entropy for, like information bits or black holes or socks.  But it’ll be a different entropy, with its own rules and its own range of validity.  … And…”
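Anne’s possibility-counting has a textbook formula behind it, Boltzmann’s S = k·ln W.  A quick sketch (the bit-counting example is mine, not the story’s) shows how it assigns an entropy to her “information bits”:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_possibilities):
    """S = k ln W: entropy from a count of equally likely possibilities."""
    return K_B * math.log(n_possibilities)

# One unknown coin flip: W = 2 possibilities
s_bit = boltzmann_entropy(2)          # k ln 2, the entropy of one unknown bit
# n independent bits have W = 2**n possibilities, so S = n * k ln 2:
# entropies add when possibility counts multiply
n = 8
s_byte = K_B * n * math.log(2)
print(s_bit, s_byte)
```

The logarithm is what makes entropy additive, which is exactly what lets different “entropies” share one set of accounting rules.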

She’s looming directly over me.  Her dark eyes are huge.

“And…?”

“When we first met, Sy, you asked what you could do for me.  You’ve helped me see that when I travel across time and probability I’m riding the Entropy Elephant.  I’d like to show my appreciation.  Can you think of a possibility?”

A dark night, in a city that knows how to keep its secrets.  On the 12th floor of the Acme Building, one man still tries to answer the Universe’s persistent questions — Sy Moire, Physics Eye.

~~ Rich Olcott

Thoughts of Chair-man Moire

“My apples and orange peels question, Sy, isn’t that the same as Jeremy’s?  What’s the connection between heat capacity and counting?”

“You’re right, Anne.  Hmm.  Say, Al, all your coffee shop tables came with four chairs apiece, right?”

“Yup, four-tops every one, even in the back room.”

“You neaten them all up, four to a table, in the morning?”

“The night before.  There’s never time in the morning, customers demand coffee first thing.”

“But look, we’ve got six people seated at this table.  Where’d the extra chairs come from?”

“Other tables, of course.  Is this going somewhere?”

“Almost there.  So in fact the state of the room at any time will have some random distribution of chairs to tables.  You know on the average there’ll be four at a table, but you don’t know the actual distribution until you look, right?”

“Hey, we’re counting again.  You’re gonna say that’s about entropy ’cause the difference between four at a table and some other number is all random and there’s some formula to calculate entropy from that.”

“True, Vinnie, but we’re about to take the next step.  How did these chairs wind up around this table?”

“We pulled them over, Mr. Moire.”

“My point is, Jeremy, we spent energy to get them here.  The more chairs that are out of position — ”

“The higher the entropy, but also the more energy went into the chairs.  It’s like that heat capacity thing we started with, the energy that got absorbed rather than driving the steam engine.”

“Awright, Anne!” from Jeremy <Jennie bristles a bit>, “and if all the chairs are in Al’s overnight position it’s like absolute zero.  Hey, temperature is average kinetic energy per particle so can we say that the more often a chair gets moved it’s like hotter?”

Jennie breaks in.  “Not a bit of it, Jeremy!  The whole metaphor’s daft.  We know temperature change times heat capacity equals the energy absorbed, right, and we’ve got a link between energy absorption and entropy, right, but what about if at the end of the day all the chairs accidentally wind up four at a table?  Entropy change is zero, right, but customers expended energy moving chairs about all day and Al’s got naught to set straight.”

“Science in action, I love it!  Anne and Jeremy, you two just bridged a gap it took Science a century to get across.  Carnot started us on entropy’s trail in 1824 but scientists in those days weren’t aware of matter’s atomic structure.  They knew that stuff can absorb heat but they had no inkling what did the absorbing or how that worked.  Thirty years later they understood simple gases better and figured out that average kinetic energy per particle bit.  But not until the 1920s did we have the quantum mechanics to show how parts of vibrating molecules can absorb heat energy stepwise like a table ‘absorbing’ chairs.  Only then could we do Vinnie’s state-counting to calculate entropies.”

“Yeah, more energy, spread across more steps, hiding more details we don’t know behind an average, more entropy.  But what about Jennie’s point?”

“Science is a stack of interconnected metaphors, Vinnie.  Some are better than others.  The trick is attending to the boundaries where they stop being valid.  Jennie’s absolutely correct that my four-chair argument is only a cartoon for illustrating stepwise energy accumulation.  If Al had a billion tables instead of a dozen or so, the odds on getting everything back to the zero state would disappear into rounding error.”
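Sy’s rounding-error remark can be checked.  A sketch (my own toy model, assuming each of the 4n chairs independently picks one of n tables at random) computes the probability that every table ends the day with exactly four chairs, working in log space so large table counts don’t overflow:

```python
import math

def prob_all_fours(n_tables):
    """P(every table gets exactly 4 chairs) when 4n chairs each pick a table
    uniformly at random: multinomial count (4n)! / (4!)^n ways, out of
    n^(4n) equally likely assignments."""
    chairs = 4 * n_tables
    log_p = (math.lgamma(chairs + 1)
             - n_tables * math.lgamma(5)      # lgamma(5) = ln 4!
             - chairs * math.log(n_tables))
    return math.exp(log_p)

p12 = prob_all_fours(12)     # a dozen tables: already a long shot
p100 = prob_all_fours(100)   # more tables, vanishingly smaller odds
print(p12, p100)
```

Even at a dozen tables the all-fours state is a one-in-tens-of-millions accident, and the odds collapse further as the room grows, which is Jennie’s objection and Sy’s answer in one calculation.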

“How does black hole entropy play into this, Sy?”

“Not very well, actually.  Oh, sure, the two systems have similar structures.  They’ve each got three inter-related central quantities constrained by three laws.  Here, I’ve charted them out on Old Reliable.”

“OK, their Second and Third Laws look pretty much the same, but their First Laws don’t match up.”

“Right, Al.  And even Bekenstein pointed out inconsistencies between classic thermodynamic temperature and what’s come to be called Hawking temperature.  Hawking didn’t agree.  The theoreticians are still arguing.  Here’s a funny one — if you dig deep enough, both versions of the First Law are the same, but the Universe doesn’t obey it.”

“That’s it, closing time.  Everybody out.”

~~ Rich Olcott

Taming The Elephant

Suddenly they were all on the attack.  Anne got in the first lick.  “C’mon, Sy, you’re comparing apples and orange peel.  Your hydrogen sphere would be on the inside of the black hole’s event horizon, and Jeremy’s virtual particles are on the outside.”

[If you’ve not read my prior post, do that now and this’ll make more sense.  Go ahead, I’ll wait here.]

Jennie’s turn — “Didn’t the chemists define away a whole lot of entropy when they said that pure elements have zero entropy at absolute zero temperature?”

Then Vinnie took a shot.  “If you’re counting maybe-particles per square whatever for the surface, shouldn’t you oughta count maybe-atoms or something per cubic whatever for the sphere?”

Jeremy posed the deepest questions. “But Mr Moire, aren’t those two different definitions for entropy?  What does heat capacity have to do with counting, anyhow?”

Al brought over mugs of coffee and a plate of scones.  “This I gotta hear.”

“Whew, but this is good ’cause we’re getting down to the nub.  First to Jennie’s point — Under the covers, Hawking’s evaluation is just as arbitrary as the chemists’.  Vinnie’s ‘whatever’ is the Planck length, lP = 1.616×10⁻³⁵ meter.  It’s the square root of such a simple combination of fundamental constants that many physicists think that lP² = 2.611×10⁻⁷⁰ m² is the ‘quantum of area.’  But that’s just a convenient assumption with no supporting evidence behind it.”

“Ah, so Hawking’s ABH = 4πrs² and SBH = ABH/4 formulation, with rs measured in Planck lengths, just counts the number of area-quanta on the event horizon’s surface.”

“Exactly, Jennie.  If there really is a least possible area, which a lot of physicists doubt, and if its size doesn’t happen to equal lP², then the black hole entropy gets recalculated to match.”
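Jennie’s area-quanta counting is easy to run for a real case.  A back-of-envelope sketch (the constants and the one-solar-mass example are mine) counts Planck areas on the event horizon of a Sun-mass black hole:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C_LIGHT = 2.998e8    # speed of light, m/s
L_P_SQ = 2.611e-70   # Planck length squared, m^2 (the assumed 'quantum of area')
M_SUN = 1.989e30     # solar mass, kg

def bh_entropy_in_area_quanta(mass_kg):
    """S_BH = A / (4 lP^2): a quarter of the horizon area, counted in Planck areas."""
    r_s = 2 * G * mass_kg / C_LIGHT**2   # Schwarzschild radius, about 3 km for the Sun
    area = 4 * math.pi * r_s**2          # event horizon area
    return area / (4 * L_P_SQ)

s_sun = bh_entropy_in_area_quanta(M_SUN)
print(f"{s_sun:.2e}")   # about 1e77, in units of Boltzmann's constant
```

That enormous count, in units of Boltzmann’s constant, is the 10⁷⁷-ish figure that dwarfs the classical entropy of any same-mass lump of ordinary matter.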

“So what’s wrong with cubic those-things?”

“Nothing, Vinnie, except that volumes measured in lP³ don’t apply to a black hole because the interior’s really four-dimensional with time scrambled into the distance formulas.  Besides, Hawking proved that the entropy varies with half-diameter squared, not half-diameter cubed.”

“But you could still measure your hydrogen sphere with them and that’d get rid of that 10³³ discrepancy between the two entropies.”

“Not really, Vinnie.  Old Reliable calculated solid hydrogen’s entropy for a certain mass, not a volume.”

“Hawking can make his arbitrary choice, Sy, he’s Hawking, but that doesn’t let the chemists off the scaffold.  How did they get away with arbitrarily defining a zero for entropy?”

“Because it worked, Jennie.  They were only concerned with changes — the difference between a system’s state at the end of a process, versus its state at the beginning.  It was only the entropy difference that counted, not its absolute value.”

“Hey, like altitude differences in potential energy.”

“Absolutely, Vinnie, and that’ll be important when we get to Jeremy’s question.  So, Jennie, if you’re only interested in chemical reactions and if it’s still the 19th Century and the world doesn’t know about isotopes yet, is there a problem with defining zero entropy to be at a convenient set of conditions?”

“Well, but Vinnie’s Third Law says you can never get down to absolute zero so that’s not convenient.”

“Good point, but the Ideal Gas Law and other tools let scientists extrapolate experimentally measured properties down to extremely low temperatures.  In fact, the very notion of absolute zero temperature came from experiments where the volume of a hydrogen or helium gas sample appears to decrease linearly towards zero at that temperature, at least until the sample condenses to a liquid.  With properly calibrated thermometers, physical chemists knocked themselves out measuring heat capacities and entropies at different temperatures for every substance they could lay hands on.”
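The extrapolation Sy describes is just a straight-line fit.  A sketch with made-up Charles’s-law data (ideal-gas volumes at constant pressure, my numbers, not measurements) finds where the fitted line would hit zero volume:

```python
# Volume of an ideal-gas sample at several Celsius temperatures, constant pressure.
# Charles's law: V(t) = V0 * (1 + t / 273.15), so V extrapolates to zero at -273.15 C.
temps_c = [0.0, 25.0, 50.0, 100.0]
vols_l = [22.414 * (1 + t / 273.15) for t in temps_c]   # litres, ideal-gas values

# Ordinary least-squares line V = a + b * t
n = len(temps_c)
mean_t = sum(temps_c) / n
mean_v = sum(vols_l) / n
b = (sum((t - mean_t) * (v - mean_v) for t, v in zip(temps_c, vols_l))
     / sum((t - mean_t) ** 2 for t in temps_c))
a = mean_v - b * mean_t

t_zero = -a / b   # temperature where the fitted line reaches V = 0
print(t_zero)     # close to -273.15, absolute zero on the Celsius scale
```

Real gases liquefy before getting there, which is why the 19th-Century chemists had to extrapolate rather than measure.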

“What about isotopes, Mr Moire?  Isn’t chlorine’s atomic weight something-and-a-half so there’s gotta be several kinds of chlorine atoms, so any sample you’ve got is a mixture and that’s random and that has to have a non-zero entropy even at absolute zero?”

“It’s 35.4, two stable isotopes, Jeremy, but we know how to account for entropy of mixing and anyway, the isotope mix rarely changes in chemical processes.”
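Sy’s “entropy of mixing” accounting is one line of algebra.  A sketch (isotope fractions inferred from the atomic weight, using the more precise 35.45 behind Sy’s 35.4) for natural chlorine:

```python
import math

R_GAS = 8.314  # gas constant, J/(mol K)

# Infer the Cl-35 / Cl-37 mix from the observed atomic weight:
# 35*x + 37*(1 - x) = 35.45  =>  x = (37 - 35.45) / 2
atomic_weight = 35.45
x35 = (37 - atomic_weight) / 2
x37 = 1 - x35

# Ideal entropy of mixing per mole of atoms: S_mix = -R * sum(x_i ln x_i)
s_mix = -R_GAS * (x35 * math.log(x35) + x37 * math.log(x37))
print(x35, s_mix)   # roughly 0.78, and a few J/(mol K)
```

A few joules per mole-kelvin is real but constant: since the isotope mix rarely changes in a chemical reaction, it cancels out of every entropy difference, which is why the chemists could safely ignore it.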

“But my apples and orange peels, Sy — what does the entropy elephant do about them?”

~~ Rich Olcott