The Latte Connection

An early taste of Spring’s in the air so Al’s set out tables in front of his coffee shop. I’m enjoying my usual black mud when the Chemistry Department’s Susan Kim passes by carrying her usual mocha latte. “Hi, Sy, mind if I take the socially distant chair at your table?”

“Be my guest, Susan. What’s going on in your world?”

“I’ve been enjoying your hysteresis series. It took me back to Physical Chemistry class. I’m intrigued by how you connected it to entropy.”

“How so?”

“I think of hysteresis as a process, but entropy is a fixed property of matter. If I’m holding twelve grams of carbon at room temperature, I know what its entropy is.”

“Mmm, sorta. Doesn’t it make a difference whether the carbon’s a 60‑carat diamond or just a pile of soot?”

“OK, I’ll give you that, the soot’s a lot more random than the diamond so its entropy is higher. The point remains, I could in principle measure a soot sample’s heat capacity at some convenient temperature and divide that by the temperature. I could repeat that at lower and lower temperatures down to near absolute zero. When I sum all those measurements I’ll have the entropy content of the sample at my starting temperature.”
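
[Susan’s recipe, in symbols, is the calorimetric entropy S(T) = Σ (C/T)·ΔT, summed from near absolute zero up to T. Here’s a minimal Python sketch of that running sum; the heat‑capacity numbers below are invented placeholders, not real soot data:]

```python
# Susan's recipe: accumulate heat-capacity-over-temperature slices
# from near absolute zero up to the temperature of interest.
temps = [10, 50, 100, 150, 200, 250, 298]        # kelvin
heat_caps = [0.1, 1.2, 3.0, 4.6, 5.9, 7.0, 8.0]  # J/K per sample (made up)

entropy = 0.0
for i in range(1, len(temps)):
    dT = temps[i] - temps[i - 1]
    avg = 0.5 * (heat_caps[i] / temps[i] + heat_caps[i - 1] / temps[i - 1])
    entropy += avg * dT    # trapezoid slice of S = integral of (C/T) dT

print(f"entropy at {temps[-1]} K ≈ {entropy:.2f} J/K")
```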

“A classical definition, just what I’d expect from a chemist. But suppose your soot spills out of its test tube and the breeze spreads it all over the neighborhood. More randomness, higher entropy than what you measured, right?”

“Well, yes. I wouldn’t have a clue how to calculate it, but that goes way beyond Carnot’s and Clausius’ original concept.”

“So entropy has at least a thin linkage with history and hysteresis. To you chemists, though, an element or compound is timeless — lead or water have always been lead or water, and their physical constants are, well, constant.”

“Not quite true, Sy. Not with really big molecules like proteins and DNA and rubber and some plastics. Squirt a huge protein like catalase through a small orifice and its properties change drastically. It might not promote any reaction, much less the one Nature designed it for. Which makes me think — Chemistry is all about reactions and they take time and studying what makes reactions run fast or slow is a big part of the field. So we do pay attention to time.”

“Nice play, Susan! You’re saying small molecules aren’t complex enough to retain memories but big ones are. I’ll bet big molecules exhibit hysteresis.”

“Sure they do. Rubber molecules are long-chain polymers. Quickly stretch a rubber band to its limit, hold it there a few seconds then let go. Some of the molecular strands lock into the stretched configuration so the band won’t immediately shrink all the way down to its original size. There’s your molecular memory.”

“And a good example it is — classic linear Physics. How much force you exert, times the distance you applied it through, equals the energy you expended. Energy’s stored in the rubber’s elasticity when you stretch it, and the energy comes back out on release.”

“Mostly right, Sy. You actually have to put in more energy than you get out — Second Law of Thermodynamics, of course — and the relationship’s not linear. <rummaging into purse> Thought I had a good fat rubber band somewhere … ah‑hah! Here, stretch this out while you hold it against your forehead. Feel it heat up briefly? Now keep checking for heat while you relax the band.”

“Hey, it got cold for a second!”

“Yep. The stretched-out configuration is less random so its entropy and heat capacity are lower than the relaxed configuration’s. The stretched band held the same amount of heat energy, but with less heat required per degree of temperature that energy made the band hotter. Relaxing the band let its molecules get less orderly. Heat capacity went back up, temperature went back down.”

“Mmm-HM. My hysteresis diagram’s upward branch is stretch energy input and the downward branch is elastic energy output. The energy difference is the area inside the hysteresis curve, which is what’s lost to entropy in each cycle, and there we have your intriguing entropy‑hysteresis connection. Still intrigued?”
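
[Sy’s bookkeeping translates directly into code: the area enclosed by a force‑versus‑stretch loop is the energy dissipated each cycle. A short Python sketch with made‑up rubber‑band data points:]

```python
# Energy lost per cycle = area inside the force-extension hysteresis loop,
# found here with the shoelace formula over one closed circuit.
stretch = [(0.00, 0.0), (0.02, 3.0), (0.04, 7.5), (0.06, 14.0)]  # (m, N), invented
release = [(0.06, 14.0), (0.04, 6.0), (0.02, 2.0), (0.00, 0.0)]  # lower return branch

loop = stretch + release[1:]                  # one full stretch-release circuit
area = 0.0
for (x1, f1), (x2, f2) in zip(loop, loop[1:] + loop[:1]):
    area += x1 * f2 - x2 * f1                 # shoelace cross-terms
energy_lost = abs(area) / 2                   # joules dissipated per cycle

print(f"energy lost per stretch-release cycle ≈ {energy_lost:.3f} J")
```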

“Enough for another latte.”

~~ Rich Olcott

Elephant And Pengy

(a hat-tip to Mo Willems, whose Elephant and Piggie books helped my grandkids discover reading)

“Hey, Sy, how come my magnetized nail’s hysteresis loop is so wide? It makes sense that the end‑case magnetizing happens because all the iron atoms get lined up in one direction or the other. But why ain’t the blue up‑curve right on top of the down‑curve?”

“Why do you think it should be, Vinnie?”

“Well, the red curve’s different because you got the outside field herding the iron atoms into domains where they all point the same way and that makes the nail’s magnetism grow from zero, and then the domains that agree with the outside magnetic field eat up the other domains until like I said they saturate. But on both sides of the blue loop the domains already exist, right, so the herding’s all done. Up or down it’s only domains growing and shrinking. Seems to me that the curves oughta be the same.”

“They are, near as I could draw them. You’re just not looking at them right. Rotate it 180°, see how they match up.”

“How ’bout that, they do, mostly. What’s going on?”

“You picked up that the vertical axis represents strength and direction, but you missed that the horizontal axis also represents strength and direction. Neither axis starts at zero, they’re both centered on zero. The driver is the outside magnetic field. No strength in the middle, increasing north‑bound strength to the right, south‑bound strength to the left. Start at the head‑end‑north corner and go down branch 2. The north‑bound driver strength decreases. That relaxes some of those north‑pointing domains and the nail’s net magnetism decreases just a bit. When the outside field’s strength gets down to neutral, about at the upper arrow, the nail’s still strongly magnetized. Most of the domains remember which way they were pointing. That’s the history that makes this hysteresis. The domains stay there until the outside field gets strong enough south‑bound to make a difference. That grows the south‑bound domains at the expense of northbound ones. All that goes on until we get to saturation at head‑end‑south corner and then we run exactly the reverse sequence. For most materials, the two extreme fields have the same strength, just opposite directions.”
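
[A toy Python model of Vinnie’s loop, offered with no claim to real micromagnetics: each branch is a shifted tanh with invented saturation, coercive‑field and width values. It reproduces the 180° rotational symmetry Sy points out:]

```python
import math

Ms, Hc, w = 1.0, 2.0, 1.0     # saturation, coercive field, width (all made up)

def branch_up(H):             # driver sweeping from south-saturation to north
    return Ms * math.tanh((H - Hc) / w)

def branch_down(H):           # driver sweeping from north-saturation to south
    return Ms * math.tanh((H + Hc) / w)

for H in range(-6, 7, 2):
    print(f"H={H:+d}  up={branch_up(H):+.2f}  down={branch_down(H):+.2f}")
# branch_up(H) == -branch_down(-H): rotate the loop 180 degrees
# and the two branches land on each other, as Sy says.
```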

“Wait, you said ‘for most materials.’ Different materials have different widths on that picture?”

“Good catch. Yes, there are ‘hard’ ones like rare earth magnets. They have a really wide hysteresis loop you can’t demagnetize without a really strong field. That’s good for where you want a permanent magnet you won’t have to recalibrate, like on a spacecraft bound for Jupiter. You’d want a ‘soft’ magnet with a narrow hysteresis loop for something like a transformer core that has to switch polarity sixty times a second.”

<longish contemplative silence> “Sy, I just got a great idea! And it uses that entropy elephant stuff you wrote about.”

“All right, out with it.”

“OK. When the nail is magnetized, it’s got all or at least most of its iron atoms pointing in the same direction, right? And when the outside field demagnetizes it, the atoms point all over the place, right? So the not‑magnetized nail has randomness, that’s entropy, and the magnetized one doesn’t. Where did the entropy come from? Gotta be from the outside, right? Can we use this to like suck entropy out of things?”

“Right, right and sorta right. I’m not happy with the idea of pumping entropy around. What’s really in play is energy, sometimes as magnetic field energy and sometimes as heat. You’ve got the core idea for a magnetic refrigerator. Put a field‑magnetized transfer material in contact with what you want to cool, then turn off the outside field. Heat from the target flows into the material, jiggles the atoms and scrambles the magnetization. Break the contact, cool and re‑magnetize the material and repeat. The idea’s been around since the late 1800s. The problem has been finding the right material to make it work. The best stuff has a tall, narrow hysteresis loop so it can be strongly magnetized yet forget it easily. Researchers have finally found some good candidates.”

“Too late to the party, huh?”

“Sorry.”

~~ Rich Olcott

A Beetled Brow

Vinnie’s brow was wrinkling so hard I could hear it over the phone. “Boltzmann, Boltzmann, where’d I hear that name before? … Got it! That’s one of those constants, ain’t it, Sy? Molecules or temperature or something?”

“The second one, Vinnie. Avogadro was the molecule counter. Good memory. Come to think of it, both Boltzmann and Avogadro bridged gaps that Loschmidt worked on.”

“Loschmidt’s thing was the paradox, right, between Newton saying events can back up and thermodynamics saying no, they can’t. You said Boltzmann’s Statistical Mechanics solved that, but I’m still not clear how.”

“Let me think of an example. … Ah, you’ve got those rose bushes in front of your place. I’ll bet you’ve also put up a Japanese beetle trap to protect them.”

“Absolutely. Those bugs would demolish my flowers. The trap’s lure draws them away to my back yard. Most of them stay there ’cause they fall into the trap’s bag and can’t get out.”

“Glad it works so well for you. OK, Newton would look at individual beetles. He’d see right off that they fly mostly in straight lines. He’d measure the force of the wind and write down an equation for how the wind affects a beetle’s flight path. If the wind suddenly blew in the opposite direction, that’d be like the clock running backwards. His same equation would predict the beetle’s new flight path under the changed conditions. You with me?”

“Yeah, no problem.”

“Boltzmann would look at the whole swarm. He’d start by evaluating the average point‑to‑point beetle flight, which he’d call ‘mean free path.’ He’d probably focus on the flight speed and in‑the‑air time fraction. With those, if you tell him how many beetles you’ve got he could generate predictions like inter‑beetle separation and how long it’d take an incoming batch of beetles to cross your yard. However, predicting where a specific beetle will land next? Can’t do that.”
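
[A taste of that swarm‑level arithmetic in Python. Every input below is invented; the point is that a few averages go in and bulk predictions come out, with no individual beetle tracked:]

```python
# Boltzmann-style bookkeeping for the swarm.
n_beetles = 200            # beetles in the yard (made up)
yard_volume = 10 * 10 * 3  # m^3: a 10 m x 10 m yard, 3 m of air above it
flight_speed = 1.5         # m/s average airspeed (made up)
airborne_fraction = 0.2    # fraction of time a beetle is in the air (made up)

density = n_beetles / yard_volume
mean_separation = density ** (-1 / 3)     # typical neighbor-to-neighbor distance
effective_speed = flight_speed * airborne_fraction
crossing_time = 10 / effective_speed      # time to work across the 10 m yard

print(f"mean inter-beetle separation ≈ {mean_separation:.2f} m")
print(f"yard-crossing time ≈ {crossing_time:.0f} s")
```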

“Who cares about one beetle?”

“Well, another beetle might. … Just thought of a way that Statistical Mechanics could actually be useful in this application. Once Boltzmann has his numbers for an untreated area, you could put in a series of checkpoints with different lures. Then he could develop efficiency parameters just by watching the beetle flying patterns. No need to empty traps. Anyhow, you get the idea.”

Japanese Beetle, photo by David Cappaert, Bugwood.org
under Creative Commons BY 3.0

“Hey, I feel good emptying that trap, I’m like standing up for my roses. Anyway, so how does Avagadro play into this?”

“Indirectly and he was half a century earlier. In 1805 Gay‑Lussac showed that if you keep the pressure and temperature constant, it takes two volumes of hydrogen to react with one volume of oxygen to produce one volume of water vapor. Better, the whole‑number‑ratio rule seemed to hold generally. Avogadro concluded that the only way Gay‑Lussac’s rule could be general is if at any temperature and pressure, equal volumes of every kind of gas held the same number of molecules. He didn’t know what that number was, though.”

“HAW! Avogadro’s number wasn’t a number yet.”

“Yeah, it took a while to figure out. Then in 1865, Loschmidt and a couple of others started asking, ‘How big is a gas molecule?’ Some gases can be compressed to the liquid state. The liquids have a definite volume, so the scientists knew molecules couldn’t be infinitely small. Loschmidt put numbers to it. Visualize a huge box of beetles flying around, bumping into each other. Each beetle, or molecule, ‘occupies’ a cylinder one beetle wide and the length of its mean free path between collisions. So you’ve got three volumes — the beetles, the total of all the cylinders, and the much larger box. Loschmidt used ratios between the volumes, plus density data, to conclude that air molecules are about a nanometer wide. Good within a factor of three. As a side result he calculated the number of gas molecules per unit volume at any temperature and pressure. That’s now called Loschmidt’s Number. If you know the molecular weight of the gas, then arithmetic gives you Avogadro’s number.”
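
[Loschmidt’s volume‑ratio argument compresses into a few lines. The Python below uses rough illustrative inputs, not his actual 1865 figures; the relation d ≈ 8·ε·λ comes from comparing the molecule volume, the swept cylinders and the box:]

```python
import math

rho_gas = 1.2        # kg/m^3, air at room conditions
rho_liquid = 870.0   # kg/m^3, rough guess for liquefied air (illustrative)
mfp = 1.4e-7         # m, mean free path from viscosity data (illustrative)

eps = rho_gas / rho_liquid   # fraction of the gas volume that is molecule
diameter = 8 * eps * mfp     # Loschmidt's d ≈ 8·ε·λ (really 6·√2 ≈ 8.5)
n_per_m3 = 1 / (math.sqrt(2) * math.pi * diameter**2 * mfp)

print(f"molecular diameter ≈ {diameter*1e9:.1f} nm")   # ~1.5 nm, right order
print(f"molecules per m³ ≈ {n_per_m3:.1e}")
# The density estimate lands well below the modern Loschmidt number,
# 2.69e25 per m³, much as Loschmidt's own first pass did.
```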

“Thinking about a big box of flying, rose‑eating beetles creeps me out.”

  • Thanks to Oriole Hart for the story‑line suggestion.

~~ Rich Olcott

Bridging A Paradox

<chirp, chirp> “Moire here.”

“Hi, Sy. Vinnie. Hey, I’ve been reading through some of your old stuff—”

“That bored, eh?”

“You know it. Anyhow, something just don’t jibe, ya know?”

“I’m not surprised but I don’t know. Tell me about it.”

“OK, let’s start with your Einstein’s Bubble piece. You got this electron goes up‑and‑down in some other galaxy and sends out a photon and it hits my eye and an atom in there absorbs it and I see the speck of light, right?”

“That’s about the size of it. What’s the problem?”

“I ain’t done yet. OK, the photon can’t give away any energy on the way here ’cause it’s quantum and quantum energy comes in packages. And when it hits my eye I get the whole package, right?”

“Yes, and?”

“And so there’s no energy loss and that means 100% efficient and I thought thermodynamics says you can’t do that.”

“Ah, good point. You’ve just described one version of Loschmidt’s Paradox. A lot of ink has gone into the conflict between quantum mechanics and relativity theory, but Herr Johann Loschmidt found a fundamental conflict between Newtonian mechanics, which is fundamental, and thermodynamics, which is also fundamental. He wasn’t talking photons, of course — it’d be another quarter-century before Planck and Einstein came up with that notion — but his challenge stood on your central issue.”

“Goody for me, so what’s the central issue?”

“Whether or not things can run in reverse. A pendulum that swings from A to B also swings from B to A. Planets go around our Sun counterclockwise, but Newton’s math would be just as accurate if they went clockwise. In all his equations and everything derived from them, you can replace +t with ‑t to make time run backwards and everything looks dandy. That even carries over to quantum mechanics — an excited atom relaxes by emitting a photon that eventually excites another atom, but then the second atom can play the same game by tossing a photon back the other way. That works because photons don’t dissipate their energy.”

“I get your point, Newton-style physics likes things that can back up. So what’s Loschmidt’s beef?”

“Ever see a fire unburn? Down at the microscopic level where atoms and photons live, processes run backwards all the time. Melting and freezing and chemical equilibria depend upon that. Things are different up at the macroscopic level, though — once heat energy gets out or randomness creeps in, processes can’t undo by themselves as Newton would like. That’s why Loschmidt stood the Laws of Thermodynamics up against Newton’s Laws. The paradox isn’t Newton’s fault — the very idea of energy was just being invented in his time and of course atoms and molecules and randomness were still centuries away.”

“Micro, macro, who cares about the difference?”

“The difference is that the micro level is usually a lot simpler than the macro level. We can often use measured or calculated micro‑level properties to predict macro‑level properties. Boltzmann started a whole branch of Physics, Statistical Mechanics, devoted to carrying out that strategy. For instance, if we know enough about what happens when two gas molecules collide we can predict the speed of sound through the gas. Our solid‑state devices depend on macro‑level electric and optical phenomena that depend on micro‑level electron‑atom interactions.”

“Statistical?”

“As in, ‘we don’t know exactly how it’ll go but we can figure the odds…’ Suppose we’re looking at air molecules and the micro process is a molecule moving. It could go left, right, up, down, towards or away from you like the six sides of a die. Once it’s gone left, what are the odds it’ll reverse course?”

“About 16%, like rolling a die to get a one.”

“You know your odds. Now roll that die again. What’s the odds of snake‑eyes?”

“16% of 16%, that’s like 3 outa 100.”

“There’s a kajillion molecules in the room. Roll the die a kajillion times. What are the odds all the air goes to one wall?”

“So close to zero it ain’t gonna happen.”

“And Boltzmann’s Statistical Mechanics explained why not.”

“Knowing about one molecule predicts a kajillion. Pretty good.”
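
[Vinnie’s die arithmetic, run out to bigger numbers in a quick Python sketch:]

```python
p = 1 / 6                                  # one molecule reversing course
print(f"one reversal: {p:.0%}")            # ~17%
print(f"snake-eyes:   {p**2:.1%}")         # ~2.8%, Vinnie's '3 outa 100'
for n in (10, 100, 1000):
    print(f"{n} molecules all going one way: {p**n:.3g}")
# Already at n=1000 the probability underflows toward zero; for a
# kajillion molecules it's 'so close to zero it ain't gonna happen'.
```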

San Francisco’s Golden Gate Bridge, looking South
Photo by Rich Niewiroski Jr. / CC BY 2.5

~~ Rich Olcott

Free Energy, or Not

From: Richard Feder <rmfeder@fortleenj.com>
To: Sy Moire <sy@moirestudies.com>
Subj: Questions

What’s this about “free energy”? Is that energy that’s free to move around anywhere? Or maybe the vacuum energy that this guy said is in the vacuum of space that will transform the earth into a wonderful world of everything for free for everybody forever once we figure out how to handle the force fields and pull energy out of them?


From: Sy Moire <sy@moirestudies.com>
To: Richard Feder <rmfeder@fortleenj.com>

Subj: Re: Questions

Well, Mr Feder, as usual you have a lot of questions all rolled up together. I’ll try to take one at a time.

It’s clear you already know that to make something happen you need energy. Not a very substantial definition, but then energy is an abstract thing it took humanity a couple of hundred years to get our minds around and we’re still learning.

Physics has several more formal definitions for “energy,” all clustered around the ability to exert force to move something and/or heat something up. The “and/or” is the kicker, because it turns out you can’t do just the moving. As one statement of the Second Law of Thermodynamics puts it, “There are no perfectly efficient processes.”

For example, when your car’s engine burns a few drops of gasoline in the cylinder, the liquid becomes a 22000‑times larger volume of hot gas that pushes the piston down in its power stroke to move the car forward. In the process, though, the engine heats up (wasted energy), gases exiting the cylinder are much hotter than air temperature (more wasted energy) and there’s friction‑generated heat all through the drive train (even more waste). Improving the drive train’s lubrication can reduce friction, but there’s no way to stop energy loss into heated-up combustion product molecules.

Two hundred years of effort haven’t uncovered a usable loophole in the Second Law. However, we have been able to quantify it. Especially for practically important chemical reactions, like burning gasoline, scientists can calculate how much energy the reaction product molecules will retain as heat. The energy available to do work is what’s left.

For historical reasons, the “available to do work” part is called “free energy.” Not free like running about like ball lightning, but free in the sense of not being bound up in jiggling heated‑up molecules.
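
[In symbols, that last paragraph is the standard Gibbs relation

    ΔG = ΔH − T·ΔS

where ΔH is the reaction’s total energy change, T·ΔS is the portion retained as heat in the jiggling molecules at temperature T, and ΔG is the “free” remainder available to do work.]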

Vacuum energy is just the opposite of free — it’s bound up in the structure of space itself. We’ve known for a century that atoms waggle back and forth within their molecules. Those vibrations give rise to the infrared spectra we use for remote temperature sensing and for studying planetary atmospheres. One of the basic results of quantum mechanics is that there’s a minimum amount of motion, called zero‑point vibration, that would persist even if the molecule were frozen to absolute zero temperature.

There are other kinds of zero‑point motion. We know of two phenomena, the Casimir effect and the Lamb shift, that can be explained by assuming that the electric field and other force fields “vibrate” at the ultramicroscopic scale even in the absence of matter. Not vibrations like going up and down, but like getting more and less intense. It’s possible that the same “vibrations” spark radioactive decay and some kinds of light emission.

Visualize space being marked off with a mesh of cubes. In each cube one or more fields more‑or‑less periodically intensify and then relax. The variation strength and timing are unpredictable. Neighboring cubes may or may not sync up and that’s unpredictable, too.

The activity is all governed by yet another Heisenberg Uncertainty Principle trade‑off. The stronger the intensification, the less certain we can be about when or where the next one will happen.

What we can say is that whether you look at a large volume of space (even an atom is ultramicroscopically huge) or a long period of time (a second might as well be a millennium), on the average the intensity is zero. All our energy‑using techniques involve channeling energy from a high‑potential source to a low‑potential sink. Vacuum energy sources are everywhere but so are the sinks and they all flit around. Catching lightning in a jar was easy by comparison.

Regards,
Sy Moire.

~~ Rich Olcott

Sisyphus on A Sand Dune

I’m walking the park’s paths on a lovely early Spring day when, “There you are, Moire. I got a question!”

“As you always do, Mr Feder. What’s your question this time?”

“OK, this guy’s saying that life is all about fighting entropy but entropy always increases anyway. I seen nothing in the news about us fighting entropy so where’s he get that? Why even bother if we’re gonna lose anyway? Where’s it coming from? Can we plug the holes?”

“That’s 4½ questions with a lot of other stuff hiding behind them. You’re going to owe me pizza at Eddie’s AND a double-dip gelato.”

“You drive a hard bargain, Moire, but you’re on.”

“Deal. Let’s start by clearing away some underbrush. You seem to have the idea that entropy’s a thing, like water, that it flows around and somehow seeps into our Universe. None of that’s true.”

“That makes no sense. How can what we’ve got here increase if it doesn’t come from somewhere?”

“Ah, I see the problem — conservation. Physicists say there are two kinds of quantities in the Universe — conserved and non‑conserved. The number of cards in a deck is a conserved quantity because it’s always 52, right?”

“Unless you’re in a game with Eddie.”

“You’ve learned that lesson, too, eh? With Eddie the system’s not closed because he occasionally adds or removes a card. Unless we catch him at it and that’s when the shouting starts. So — cards are non-conserved if Eddie’s in the game. Anyway, energy’s a conserved quantity. We can change energy from one form to another but we can’t create or extinguish energy, OK?”

“I heard about that. Sure would be nice if we could, though — electricity outta nothing would save the planet.”

“It would certainly help, and so would making discarded plastic just disappear. Unfortunately, mass is another conserved quantity unless you’re doing subatomic stuff. Physicists have searched for other conserved quantities because they make calculations simpler. Momentum’s one, if you’re careful how you define it. There’s about a dozen more. The mass of water coming out of a pipe exactly matches the mass that went in.”

“What if the pipe leaks?”

“Doesn’t matter where the water comes out. If you measure the leaked mass and the mass at the pipe’s designed exit point the total outflow equals the inflow. But that gets me to the next bit of underbrush. Energy’s conserved, that’s one of our bedrock rules, but energy always leaks and that’s another bedrock rule. The same rule also says that matter always breaks into smaller pieces if you give it a chance though that’s harder to calculate. We measure both leakages as entropy. Wherever you look, any process that converts energy or matter from one form to another diverts some fraction into bits of matter in random motion and that’s an increase of entropy. One kind of entropy, anyway.”

“Fine, but what’s all this got to do with life?”

“It’s all to get us to where we can talk about entropy in context. You’re alive, right?”

“Last I looked.”

“Ever break a bone?”

<taps his arm> “Sure, hasn’t everybody one time or another?”

“Healed up pretty well, I see. Congratulations. Right after the break that arm could have gone in lots of directions it’s not supposed to — a high entropy situation. So you wore a cast while your bone cells worked hard to knit you together again and lower that entropy. Meanwhile, the rest of your body kept those cells supplied with energy and swept away waste products. You see my point?”

“So what you’re saying is that mending a broken part uses up energy and creates entropy somewhere even though the broken part is less random. I got that.”

“Oh, it goes deeper than that. If you could tag one molecule inside a living cell you’d see it bouncing all over the place until it happens to move where something grabs it to do something useful. Entropy pushes towards chaos, but the cell’s pattern of organized activity keeps chaos in check. Like picnicking on a windy day — only constant vigilance maintains order. That’s the battle.”

“Hey, lookit, Eddie’s ain’t open. I’ll owe you.”

“Pizza AND double-dip gelato.”

~~ Rich Olcott

At The Old Curiosity Shop

An imposing knock at the door, both impetuous and imperious.  I figured it for an Internet denizen.  “C’mon in, the door’s open.”

“You’re Moire?”

“I am.  And you are..?”

“The name’s Feder, Richard Feder, from Fort Lee, NJ.  I’m a stand-in for some of your commenters.”

“Ah, the post of business past.  You have a question?”

“Yeah.  How come hot water can freeze faster than cold water?”

“That’s really two questions. The first is, ‘Can hot water freeze faster than cold water?’ and the second is, ‘How come?’  To the surprise of a lot of physicists, the experimental answer to the first question is, ‘Yes, sometimes.’  But it’s only sometimes and even that depends on how you define freeze.”

“What’s to define?  Frozen is frozen.”

“Not so fast.  Are we talking surface ice formation, or complete solidification, or maybe just descent to freezing temperature?  Three very different processes.  There’s multiple reports of anomalous behavior for each one, but many of the reports have been contested by other researchers.  Lots of explanations, too.  The situation reminds me of Anne’s Elephant.”

“Why an elephant?  And who’s Anne?”

“Remember the old story about the blind men trying to figure out an elephant?  The guy touching its trunk said it’s a snake, the one at its side said it’s a wall, the dude at its leg said it’s a tree, and so on?  The descriptions differed because each observer had limited knowledge of something complicated.  This chilled-water issue is like that — irreproducible experiments because of uncontrolled unknown variables, mostly maybes on the theory side because we’re still far from a fundamental understanding.”

“Who’s Anne?”

“Anne is … an experience.  I showed her how the notion of Entropy depends on how you look at it.  Scientists have looked at this paradoxical cooling effect pretty much every way you can think of, trying to rule out various hypotheses.  Different teams have both found and not found the anomaly working with distilled water and with tap water, large amounts and small, in the open air and in sealed containers, in glass or metal containers, with and without stirring, with various pre-washing regimens or none, using a variety of initial and final temperatures.  They’ve clocked the first appearance of surface ice and complete opacity of the bulk.  They’ve tracked temperature’s trajectory in the middle of the container or near its wall… you name it.  My favorite observation was the 20th Century’s first-published one — in 1963 Erasto Mpemba noticed the effect while preparing ice cream.”

“What flavor?  Never mind.  Is there a verdict?”

“Vaguely.  Once you get approximately the right conditions, whether or not you see the effect seems to be a matter of chance.  The more sophisticated researchers have done trials in the hundreds and then reported percentages, rather than just ‘we see it’ or not.  Which in itself is interesting.”

“How’s that?”

“Well, to begin with, the percents aren’t zero.  That answers your first question — warm water sometimes does freeze faster than cold.  Better yet, the variability tells us that the answer to your second question is at the nanoscopic level.  Macroscopic processes, even chemical ones, have statistics that go the same way all the time.  Put a lit match to gasoline in air, you’ll always get a fire.  But if you set out 100 teaspoons of water under certain conditions and 37 of them freeze and the others don’t, something very unusual must be going on that starts with just a few molecules out of the 10²³ in those teaspoons.”

“Weird odds.”

“This experiment’s even more interesting.  You’ve got two bottles of water.  You heat up bottle A and let it cool to room temperature.  B’s been at room temperature all along.  You put ’em both in the fridge and track their temperatures.  A cools quicker.”

“That’s where I came in.”

“Both start at the same temperature, finish at the same temperature, and their Joules-per-second energy-shedding rates should be the same.  A cools in less time so A releases less heat.  Entropy change is released heat energy divided by temperature.  Somehow, bottle A went into the fridge with less entropy than B had.  Why?  We don’t really know.”
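
[The bottle bookkeeping, made explicit with invented numbers:]

```python
# Both bottles shed heat at the same rate into the same fridge;
# bottle A reaches fridge temperature sooner. Entropy change = Q / T.
T_fridge = 277.0                 # K, temperature at which heat is released
rate = 2.0                       # J/s per bottle, assumed equal (made up)
time_A, time_B = 3000.0, 3600.0  # s to finish cooling; A is quicker (made up)

Q_A, Q_B = rate * time_A, rate * time_B
dS_A, dS_B = Q_A / T_fridge, Q_B / T_fridge

print(f"bottle A: released {Q_A:.0f} J, ΔS ≈ {dS_A:.1f} J/K")
print(f"bottle B: released {Q_B:.0f} J, ΔS ≈ {dS_B:.1f} J/K")
# Same start and end states, but A sheds less entropy, so A must have
# gone into the fridge carrying less entropy. Why it did is the mystery.
```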

~~ Rich Olcott

  • Thanks to Ilias Tirovolas, whose paper inspired this post.

Meanwhile, back at the office

Closing time.  Anne and I stroll from Al’s coffee shop back to the Acme Building.  It’s a clear night with at least 4,500 stars, but Anne’s looking at the velvet black between them.

“What you said, Sy, about the Universe not obeying Conservation of Energy — tell me more about that.”

“Aaa-hmmm … OK.  You’ve heard about the Universe expanding, right?”

“Ye-es, but I don’t know why that happens.”

“Neither do the scientists, but there’s pretty firm evidence that it’s happening, if only at the longest scales.  Stars within galaxies get closer together as they radiate away their gravitational energy.  But the galaxies themselves are getting further apart, as far out as we can measure.”

“What’s that got to do with Conservation of Energy?”

“Well, galaxies have mass so they should be drawn together by gravity the way that gravity pulls stars together inside galaxies.  But that’s not what’s happening.  Something’s actively pushing galaxies or galaxy clusters away from each other.  Giving the something a name like ‘dark energy’ is just an accounting gimmick to pretend the First Law is still in effect at very large distances — we don’t know the energy source for the pushing, or even if there is one.  There’s a separate set of observations we attribute to a ‘dark energy’ that may or may not have the same underlying cause.  That’s what I was talking about.”

We’re at the Acme Building.  I flash my badge to get us past Security and into the elevator.  As I reach out to press the ’12’ button she puts her hand on my arm.  “Sy, I want to see if I understand this entropy-elephant thing.  You said entropy started as an accounting gimmick, to help engineers keep track of fuel energy escaping into the surroundings.  Energy absorbed per degree of temperature rise they called the environment’s heat capacity.  Heat absorbed at each temperature, divided by that temperature and summed across the whole range, they called the change in entropy.”

The elevator lets us out on my floor and we walk to door 1217.  “You’ve got it right so far, Anne.  Then what?”

“Then the chemists realized that you can predict how lots of systems will work from only knowing a certain set of properties for the beginning and end states.  Pressure, volume, chemical composition, whatever, but also entropy.  But except for simple gases they couldn’t predict heat capacity or entropy, only measure it.”

My key lets us in.  She leans back against the door frame.  “That’s where your physicists come in, Sy.  They learned that heat in a substance is actually the kinetic energy of its molecules.  Gas molecules can move around, but that motion’s constrained in liquids and even more constrained in solids.  Going from solid to liquid and from liquid to gas absorbs heat energy in breaking those constraints.  That absorbed heat appears as increased entropy.”

She’s lounging against my filing cabinet.  “The other way that substances absorb heat is for parts of molecules to rotate and vibrate relative to other parts.  But there are levels.  Some vibrations excite easier than others, and many rotations are even easier.  In a cold material only some motions are active.  Rising temperature puts more kinds of motion into play.  Heat energy spreads across more and more sub-molecular absorbers.”

She’s perched on the edge of my desk.  “Here’s where entropy as possibility-counting shows up.  More heat, more possibilities, more entropy.  Now we can do arithmetic and prediction instead of measuring.  Anything you can count possibilities for you can think about defining an entropy for, like information bits or black holes or socks.  But it’ll be a different entropy, with its own rules and its own range of validity.  … And…”
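
[The possibility‑counting Anne describes is Boltzmann’s S = k·ln W: count the number of ways W a state can happen, take the logarithm, and you have an entropy for anything countable.]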

She’s looming directly over me.  Her dark eyes are huge.

“And…?”

“When we first met, Sy, you asked what you could do for me.  You’ve helped me see that when I travel across time and probability I’m riding the Entropy Elephant.  I’d like to show my appreciation.  Can you think of a possibility?”

A dark night, in a city that knows how to keep its secrets.  On the 12th floor of the Acme Building, one man still tries to answer the Universe’s persistent questions — Sy Moire, Physics Eye.

~~ Rich Olcott

Thoughts of Chair-man Moire

“My apples and orange peels question, Sy, isn’t that the same as Jeremy’s?  What’s the connection between heat capacity and counting?”

“You’re right, Anne.  Hmm.  Say, Al, all your coffee shop tables came with four chairs apiece, right?”

“Yup, four-tops every one, even in the back room.”

“You neaten them all up, four to a table, in the morning?”

“The night before.  There’s never time in the morning, customers demand coffee first thing.”

“But look, we’ve got six people seated at this table.  Where’d the extra chairs come from?”

“Other tables, of course.  Is this going somewhere?”

“Almost there.  So in fact the state of the room at any time will have some random distribution of chairs to tables.  You know on the average there’ll be four at a table, but you don’t know the actual distribution until you look, right?”

“Hey, we’re counting again.  You’re gonna say that’s about entropy ’cause the difference between four at a table and some other number is all random and there’s some formula to calculate entropy from that.”

“True, Vinnie, but we’re about to take the next step.  How did these chairs wind up around this table?”

“We pulled them over, Mr. Moire.”

“My point is, Jeremy, we spent energy to get them here.  The more chairs that are out of position — ”

“The higher the entropy, but also the more energy went into the chairs.  It’s like that heat capacity thing we started with, the energy that got absorbed rather than driving the steam engine.”

“Awright, Anne!” from Jeremy <Jennie bristles a bit>, “and if all the chairs are in Al’s overnight position it’s like absolute zero.  Hey, temperature is average kinetic energy per particle so can we say that the more often a chair gets moved it’s like hotter?”

Jennie breaks in.  “Not a bit of it, Jeremy!  The whole metaphor’s daft.  We know temperature change times heat capacity equals the energy absorbed, right, and we’ve got a link between energy absorption and entropy, right, but what about if at the end of the day all the chairs accidentally wind up four at a table?  Entropy change is zero, right, but customers expended energy moving chairs about all day and Al’s got naught to set straight.”

“Science in action, I love it!  Anne and Jeremy, you two just bridged a gap it took Science a century to get across.  Carnot started us on entropy’s trail in 1824 but scientists in those days weren’t aware of matter’s atomic structure.  They knew that stuff can absorb heat but they had no inkling what did the absorbing or how that worked.  Thirty years later they understood simple gases better and figured out that average kinetic energy per particle bit.  But not until the 1920s did we have the quantum mechanics to show how parts of vibrating molecules can absorb heat energy stepwise like a table ‘absorbing’ chairs.  Only then could we do Vinnie’s state-counting to calculate entropies.”
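
[Vinnie’s state‑counting, cartooned in Python: identical chairs distributed among distinguishable tables, then a Boltzmann‑style logarithm.  Al’s real floor plan is not consulted; the numbers are invented:]

```python
from math import factorial, log

def distributions(chairs, tables):
    # stars-and-bars count: ways to split identical chairs among distinct tables
    return factorial(chairs + tables - 1) // (factorial(chairs) * factorial(tables - 1))

for tables in (3, 12, 50):
    W = distributions(4 * tables, tables)   # four chairs per table on average
    print(f"{tables} tables: W = {W:.3e}, ln W = {log(W):.1f}")
# ln W grows with the size of the room. The tidy all-fours arrangement is
# just one state out of W, which is why Jennie's accidental reset becomes
# hopeless once Al owns a billion tables.
```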

“Yeah, more energy, spread across more steps, hiding more details we don’t know behind an average, more entropy.  But what about Jennie’s point?”

“Science is a stack of interconnected metaphors, Vinnie.  Some are better than others.  The trick is attending to the boundaries where they stop being valid.  Jennie’s absolutely correct that my four-chair argument is only a cartoon for illustrating stepwise energy accumulation.  If Al had a billion tables instead of a dozen or so, the odds on getting everything back to the zero state would disappear into rounding error.”

“How does black hole entropy play into this, Sy?”

(Old Reliable’s chart: the classical thermodynamic entropy laws beside their black‑hole counterparts)

“Not very well, actually.  Oh, sure, the two systems have similar structures.  They’ve each got three inter-related central quantities constrained by three laws.  Here, I’ve charted them out on Old Reliable.”

“OK, their Second and Third Laws look pretty much the same, but their First Laws don’t match up.”

“Right, Al.  And even Bekenstein pointed out inconsistencies between classic thermodynamic temperature and what’s come to be called Hawking temperature.  Hawking didn’t agree.  The theoreticians are still arguing.  Here’s a funny one — if you dig deep enough, both versions of the First Law are the same, but the Universe doesn’t obey it.”

“That’s it, closing time.  Everybody out.”

~~ Rich Olcott

Taming The Elephant

Suddenly they were all on the attack.  Anne got in the first lick.  “C’mon, Sy, you’re comparing apples and orange peel.  Your hydrogen sphere would be on the inside of the black hole’s event horizon, and Jeremy’s virtual particles are on the outside.”

[If you’ve not read my prior post, do that now and this’ll make more sense.  Go ahead, I’ll wait here.]

Jennie’s turn — “Didn’t the chemists define away a whole lot of entropy when they said that pure elements have zero entropy at absolute zero temperature?”

Then Vinnie took a shot.  “If you’re counting maybe-particles per square whatever for the surface, shouldn’t you oughta count maybe-atoms or something per cubic whatever for the sphere?”

Jeremy posed the deepest questions. “But Mr Moire, aren’t those two different definitions for entropy?  What does heat capacity have to do with counting, anyhow?”

Al brought over mugs of coffee and a plate of scones.  “This I gotta hear.”

“Whew, but this is good ’cause we’re getting down to the nub.  First to Jennie’s point — under the covers, Hawking’s evaluation is just as arbitrary as the chemists’.  Vinnie’s ‘whatever’ is the Planck length, lP = 1.616×10⁻³⁵ meter.  It’s the square root of such a simple combination of fundamental constants that many physicists think that lP² = 2.611×10⁻⁷⁰ m² is the ‘quantum of area.’  But that’s just a convenient assumption with no supporting evidence behind it.”

“Ah, so Hawking’s A_BH = 4πr_s² and S_BH = A_BH/4 formulation, with r_s measured in Planck lengths, just counts the number of area‑quanta on the event horizon’s surface.”

“Exactly, Jennie.  If there really is a least possible area, which a lot of physicists doubt, and if its size doesn’t happen to equal lP², then the black hole entropy gets recalculated to match.”
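
[Jennie’s formulas with real constants make a quick Python check.  For a one‑solar‑mass black hole the count comes out near 10⁷⁷ in units of Boltzmann’s constant:]

```python
import math

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI values
lP2 = hbar * G / c**3                        # Planck area, ~2.611e-70 m^2
M_sun = 1.989e30                             # kg

r_s = 2 * G * M_sun / c**2                   # Schwarzschild radius, ~2.95 km
A_BH = 4 * math.pi * r_s**2                  # event-horizon area
S_BH = A_BH / (4 * lP2)                      # entropy in units of k_B

print(f"r_s ≈ {r_s/1e3:.2f} km, A ≈ {A_BH:.2e} m², S ≈ {S_BH:.2e} k_B")
```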

“So what’s wrong with cubic those-things?”

“Nothing, Vinnie, except that volumes measured in lP³ don’t apply to a black hole because the interior’s really four-dimensional with time scrambled into the distance formulas.  Besides, Hawking proved that the entropy varies with half-diameter squared, not half-diameter cubed.”

“But you could still measure your hydrogen sphere with them and that’d get rid of that 10³³ discrepancy between the two entropies.”

“Not really, Vinnie.  Old Reliable calculated solid hydrogen’s entropy for a certain mass, not a volume.”

“Hawking can make his arbitrary choice, Sy, he’s Hawking, but that doesn’t let the chemists off the scaffold.  How did they get away with arbitrarily defining a zero for entropy?”

“Because it worked, Jennie.  They were only concerned with changes — the difference between a system’s state at the end of a process, versus its state at the beginning.  It was only the entropy difference that counted, not its absolute value.”

“Hey, like altitude differences in potential energy.”

“Absolutely, Vinnie, and that’ll be important when we get to Jeremy’s question.  So, Jennie, if you’re only interested in chemical reactions and if it’s still in the 19th Century and the world doesn’t know about isotopes yet, is there a problem with defining zero entropy to be at a convenient set of conditions?”

“Well, but Vinnie’s Second Law says you can never get down to absolute zero so that’s not convenient.”

“Good point, but the Ideal Gas Law and other tools let scientists extrapolate experimentally measured properties down to extremely low temperatures.  In fact, the very notion of absolute zero temperature came from experiments where the volume of a  hydrogen or helium gas sample appears to decrease linearly towards zero at that temperature, at least until the sample condenses to a liquid.  With properly calibrated thermometers, physical chemists knocked themselves out measuring heat capacities and entropies at different temperatures for every substance they could lay hands on.”
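
[That extrapolation in a few lines of Python, with synthetic ideal‑gas volumes standing in for the old laboratory data:]

```python
# Fit a straight line to volume-vs-temperature data, then ask where
# the volume would reach zero. Data here are ideal-gas values for
# 1 mol at 1 atm, generated rather than measured.
temps_C = [0, 20, 40, 60, 80, 100]
volumes_L = [22.41 * (t + 273.15) / 273.15 for t in temps_C]

n = len(temps_C)
mean_t = sum(temps_C) / n
mean_v = sum(volumes_L) / n
slope = sum((t - mean_t) * (v - mean_v)
            for t, v in zip(temps_C, volumes_L)) / sum((t - mean_t)**2 for t in temps_C)
intercept = mean_v - slope * mean_t

print(f"V reaches zero at ≈ {-intercept/slope:.2f} °C")   # ≈ -273.15 °C
```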

“What about isotopes, Mr Moire?  Isn’t chlorine’s atomic weight something-and-a-half so there’s gotta be several kinds of chlorine atoms, so any sample you’ve got is a mixture and that’s random and that has to have a non-zero entropy even at absolute zero.”

“It’s 35.4, two stable isotopes, Jeremy, but we know how to account for entropy of mixing and anyway, the isotope mix rarely changes in chemical processes.”

“But my apples and orange peels, Sy — what does the entropy elephant do about them?”

~~ Rich Olcott