Up, Down And Between

Vinnie finishes his double‑pepperoni pizza. “Sy, these enthalpies got a pressure‑volume part and a temperature‑heat capacity part, but seems to me the most important part is the chemical energy.”

I’m still working on my slice (cheese and sausage). “That’s certainly true from a fuel engineering perspective, Vinnie. Here’s a clue. Check the values in this table for 0°C, also known as 273K.”

“Waitaminute! That line says the enthalpy’s exactly zero under the book’s conditions. We talked about zeros a long time ago. All measurements have error. Nothing’s exactly zero unless it’s defined that way or it’s Absolute Zero temperature and we’ll never get there. Is this another definition thing?”

“More of a convenience thing. The altimeters in those planes you fly, do they display the distance to Earth’s center?”

“Nope, altitude above sea level, if they’re calibrated right.”

“But the other would work, too, say as a percentage of the average radius?”

“Not really. Earth’s fatter at the Equator than it is at the poles. You’d always have to correct for latitude. And the numbers would be clumsy, always some fraction of a percent of whatever the average is—”

“6371 kilometers.”

“Yeah, that. Try working with fractions of a part per thousand when you’re coming in through a thunderstorm. Give me kilometers or feet above sea level and I’m a lot happier.”

“But say you’re landing in Denver, 1.6 kilometers above sea level.”

“It’s a lot easier to subtract 1.6 from baseline altitude in kilometers than to subtract 0.00025 from 1.00‑something and get the decimals right. Sea‑level calibrations are a lot easier to work with.”

“So now you know why the book shows zero enthalpy for water at 273K.”

“You’re saying there’s not really zero chemical energy in there, it’s just a convenient place to start counting?”

“That’s exactly what I’m saying. Chemical energy is just another form of potential energy. Zeroes on a potential scale are arbitrary. What’s important is the difference between initial and final states. Altitude’s about gravitational potential relative to the ground; chemists care about chemical potential relative to a specific reaction’s final products. Both concerns are about where you started and where you stop.”

“Gimme a chemical f’rinstance.”

<reading off of Old Reliable> “Reacting 1 gram of oxygen gas and 0.13 gram of hydrogen gas slowly in a catalytic fuel cell at 298K and atmospheric pressure produces 1.13 grams of liquid water and releases 17.9 kilojoules of energy. Exploding the same gas mix at the same pressure in a piston also yields 17.9 kilojoules once you cool everything back down to 298K. Different routes, same results.”
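Old Reliable's arithmetic can be roughed out from the standard enthalpy of formation of liquid water, about −285.8 kJ/mol at 298 K. This is my own sketch, not Old Reliable's actual code:

```python
# Sketch (not Old Reliable's code): check the fuel-cell figures from the
# standard enthalpy of formation of liquid water, -285.8 kJ/mol at 298 K.
M_O2, M_H2, M_H2O = 32.00, 2.016, 18.02   # molar masses, g/mol
dHf_H2O = -285.8                          # kJ/mol, liquid H2O

mol_O2 = 1.0 / M_O2          # one gram of oxygen gas
mol_H2O = 2 * mol_O2         # stoichiometry: 2 H2 + O2 -> 2 H2O
grams_H2 = mol_H2O * M_H2    # hydrogen consumed
grams_H2O = mol_H2O * M_H2O  # water produced
kJ_released = -dHf_H2O * mol_H2O

print(round(grams_H2, 2), round(grams_H2O, 2), round(kJ_released, 1))
```

Mass is conserved along the way: 1 gram of oxygen plus 0.13 gram of hydrogen comes out as 1.13 grams of water.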

Meanwhile, Jeremy’s wandered over from his gelato stand. “Excuse me, Mr Moire. I read your Crazy Theory about how mammals like to keep their body temperature in the range near water’s minimum Specific Heat, um Heat Capacity, but now I’m confused.”

“What’s the confusion, Jeremy?”

“Well, what you told me before made sense, about how increased temperature activates higher‑energy kinds of molecular waggling to absorb the heat. But that means Heat Capacity always ought to increase with increasing temperature, right?”

“Good thinking. So your problem is…?”

“Your graph shows that if water’s cold, warming it decreases its Heat Capacity. Do hotter water molecules waggle less?”

“No, it’s a context thing. Gas and liquid are different contexts. Each molecule in a gas is all by itself, most of the time, so its waggling is determined only by its internal bonding and mass configuration. Put that molecule into a liquid or solid, it’s subject to what its neighbors are doing. Water’s particularly good at intermolecular interactions. You know about the hexagonal structure locked into ice and snowflakes. When water ice melts but it’s still at low temperature, much of the hexagonal structure hangs around in a mushy state. A loose structure’s whole‑body quivering can absorb heat energy without exciting waggles in its constituent molecules. Raising the temperature disrupts that floppy structure. That’s most of the fall on the Heat Capacity curve.”

“Ah, then the Sensitivity decrease on the high‑temperature side has to do with blurry structure bits breaking down to tinier pieces whose extra waggling modes soak up more energy for each degree of warming. Thanks, Mr Moire.”

“Don’t mention it.”

~~ Rich Olcott

Early Days in The Sunshine

“Wait, Sy. From what you just said about rocket fuel, its enthalpic energy content changes if I move it. On the ground it’s ‘chemical energy plus thermal plus Pressure times Volume.’ Up in space, though, the pressure part’s zero. So how come the CRC Handbook people decided it’s worthwhile to publish pages and pages of specific heat and enthalpy tables if it’s all ‘it depends’?”

“We know the dependencies, Vinnie. The numbers cover a wide temperature range but they’re all at atmospheric pressure. ‘Pressure times Volume’ makes it easy to adjust for pressure change — just do that multiplication and add the result to the other terms. It’s trickier when the pressure varies between here and there but we’ve got math to handle that. The ‘thermal’ part’s also not a big problem because if you know something’s specific heat you know how its energy content changes with temperature change and vice‑versa.”

<checking a chart on his phone> “This says water’s specific heat number changes with temperature. They’re all about 1.0 but some are a little higher and some a little lower. Graph ’em out, looks like there’s a pattern there.”

<tapping on Old Reliable’s screen> “Good eye. High at the extreme temperatures, lower near — that’s interesting.”

“What’s that?”

“The range where the curve is flattest, 35 to 40°C. Sound familiar?”

“Yeah, my usual body temperature’s in there, toward the high side if I’ve got a fever. What’s that mean?”

“That’s so far out of my field all I’ve got is guesses. Hold on … there, I’ve added a line for 1/SH.”

“What’s that get you?”

“A different perspective. Specific Heat is the energy change when one gram of something changes temperature by one degree. This new line, I’ve called it Sensitivity, is how many degrees one unit of heat energy will warm the gram. Interesting that both curves flatten out in exactly the temperature range that mammals like us try to maintain. The question is, why do mammals prefer that range?”

“And your answer is?”

“A guess. Remember, I’m not a biologist or a biochemist and I haven’t studied how biomolecules interact with water.”

“I get that we should file this under Crazy Theories. Out with it.”

“Okay. Suppose it’s early days in mammalian evolution. You’re one of those early beasties. You’re not cold-blooded like a reptile, you’re equipped with a thermostat for your warm blood. Maybe you shiver if you’re cold, pant if you’re hot, doesn’t matter. What does matter is, your thermostat has a target temperature. Suppose your target’s on the graph’s coolish left side where water’s sensitivity rises rapidly. You’re sunning yourself on a flat rock, all parts of you getting the same calories per hour.”

“That’s on the sunward side. Shady side not so much.”

“Good point. I’ll get to that. On the sunward side you’re absorbing energy and getting warm, but the warmer you get the more your heat sensitivity rises. Near your target point your tissues warm up say 0.4 degree per unit of sunlight, but after some warming those tissues are heating by 0.6 degrees for the same energy input.”

“I recognize positive feedback when I see it, Sy. Every minute on that rock drives me further away from my target temperature. Whoa! But on the shady side I don’t have that problem.”

“That’s even messier. You’ve got a temperature disparity between the two sides and it’s increasing. Can your primitive circulatory system handle that? Suppose you’re smart enough to scurry out of the sunlight. You’ve still got a problem. There’s more to you than your skin. You’ve got muscles and those muscles have cells and those cells do biochemistry. Every chemical reaction inside you gives off at least a little heat for more positive feedback.”

“What if my thermostat’s set over there on the hot side?”

“You’d be happy in the daytime but you’d have a problem at night. For every degree you chill below comfortable, you need to generate a greater amount of energy to get back up to your target setting.”

“Smart of evolution to set my thermostat where water’s specific heat changes least with temperature.”

“That’s my guess.”

~~ Rich Olcott

Two Against One, And It’s Not Even Close


I’m on a brisk walk across campus when I hear Vinnie yell from Al’s coffee shop. “Hey! Sy! Me and Al got this argument going you gotta settle.”

“Happy to be a peacemaker, but it’ll cost you a mug of Al’s coffee and a strawberry scone.”

“Coffee’s no charge, Sy, but the scone goes on Vinnie’s tab. What’s your pleasure?”

“It’s morning, Al, time for black mud. What’s the argument, Vinnie?”

“Al read in one of his astronomy magazines that the Moon’s drifting away from us. Is that true, and if it is, how’s it happen? Al thinks Jupiter’s gravity’s lifting it but I think it’s because of Solar winds pushing it. So which is it?”

“Here you go, Sy, straight from the bottom of the pot.”

“Perfect, Al, thanks. Yes, it’s true. The drift rate is about 1¼ nanometers per second, 1½ inches per year. As to your argument, you’re both wrong.”

“Huh?”
 “Aw, c’mon!”

“Al, let’s put some numbers to your hypothesis. <pulling out Old Reliable and screen‑tapping> I’m going to compare Jupiter’s pull on the Moon to Earth’s when the two planets are closest together. OK?”

“I suppose.”

“Alright. Newton’s Law tells us the pull is proportional to the mass. Jupiter’s mass is about 320 times Earth’s, which is pretty impressive, right? But the attraction drops with the square of the distance. The Moon is 1¼ lightseconds from Earth. At closest approach, Jupiter is almost 2100 lightseconds away, 1680 times farther than the Moon. We need to divide the 320 mass factor by a 1680‑squared distance factor, and that makes <key taps> Jupiter’s pull on the Moon only 0.011 percent of Earth’s. It’ll be <taps> half that when Jupiter’s on the other side of the Sun. Not much competition, eh?”
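Sy's percentage is easy to reproduce. Here's a quick sketch using his rounded inputs (320 for the mass ratio, 1680 for the distance ratio):

```python
# Sketch with Sy's rounded inputs: Jupiter's pull on the Moon vs Earth's.
mass_ratio = 320        # Jupiter's mass in Earth masses
distance_ratio = 1680   # Jupiter's distance vs the Moon's, both in light-seconds

# Newton: pull scales with mass, falls off as distance squared
pull_vs_earth = mass_ratio / distance_ratio**2
print(f"{100 * pull_vs_earth:.3f} percent")   # ~0.011 percent
```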

“Yeah, but a little bit at a time, it adds up.”

“We’re not done yet. The Moon feels the big guy’s pull on both sides of its orbit around Earth. On the side where the Moon’s moving away from Jupiter, you’re right, Jupiter’s gravity slows the Moon down, a little. But on the moving-toward-Jupiter side, the motion’s sped up. Put it all together, Jupiter’s teeny pull cancels itself out over every month’s orbiting.”

“Gotcha, Al. So what about my theory, Sy?”

“Basically the same logic, Vinnie. The Solar wind varies, thanks to the Sun’s variable activity, but satellite measurements put its pressure somewhere around a nanopascal, a nanonewton per square meter. Multiply that by the Moon’s cross‑sectional area and we get <tap, tap> a bit less than ten thousand newtons of force on the Moon. Meanwhile, Newton’s Law says the Earth’s pull on the Moon comes to <tapping>
  G×(Earth’s mass)×(Moon’s mass)/(Earth-Moon distance)²
and that comes to 2×10²⁰ newtons. Earth wins by a 10¹⁶‑fold landslide. Anyway, the pressure slows the Moon for only half of each month and speeds it up the other half so we’ve got another cancellation going on.”
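Both forces can be checked with textbook constants. A sketch (my inputs, not necessarily Old Reliable's):

```python
import math

# Sketch: solar-wind push vs Earth's gravitational pull on the Moon.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_earth = 5.97e24    # kg
m_moon = 7.35e22     # kg
d = 3.84e8           # mean Earth-Moon distance, m
r_moon = 1.74e6      # Moon's radius, m
P_wind = 1e-9        # solar wind pressure, ~1 nanopascal

F_gravity = G * m_earth * m_moon / d**2   # Newton's law of gravitation
F_wind = P_wind * math.pi * r_moon**2     # pressure times cross-section
print(f"{F_gravity:.1e} N vs {F_wind:.1e} N, ratio {F_gravity / F_wind:.0e}")
```

Gravity lands near 2×10²⁰ newtons against roughly 10⁴ newtons of wind pressure, about sixteen orders of magnitude apart.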

“So what is it then?”
 “So what is it then?”

“Tides. Not just ocean tides, rock tides in Earth’s fluid outer mantle. Earth bulges, just a bit, toward the Moon. But Earth also rotates, so the bulge circles the planet every day.”

“Reminds me of the wave in the Interstellar movie, but why don’t we see it?”

“The movie’s wave was hundreds of times higher than ours, Al. It was water, not rock, and the wave‑raiser was a huge black hole close by the planet. The Moon’s tidal pull on Earth produces only a one‑meter variation on a 6,400,000‑meter radius. Not a big deal to us. Of course, it makes a lot of difference to the material that’s being kneaded up and down. There’s a lot of friction in those layers.”

“Friction makes heat, Sy. Rock tides oughta heat up the planet, right?”

“Sure, Vinnie, the process does generate heat. Force times distance equals energy. Raising the Moon by 1¼ nanometers per second against a force of 2×10²⁰ newtons gives us <tapping furiously> an energy transfer rate of about 4×10⁻¹⁴ joule per second per kilogram of Earth’s 6×10²⁴‑kilogram mass. It takes about a thousand joules to heat a kilogram of rock by one kelvin so we’re looking at a temperature rise near 4×10⁻¹⁷ kelvin per second. Not significant.”
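The heating estimate chains three divisions. A sketch with round Newton's-law inputs (mine, not Old Reliable's exact figures):

```python
# Sketch with round inputs: how fast do rock tides warm the Earth?
F = 2e20         # Earth-Moon gravitational force, newtons
v = 1.25e-9      # Moon's recession rate, ~1.25 nm per second
m_earth = 6e24   # Earth's mass, kg
c_rock = 1000    # rough heat capacity of rock, J per (kg K)

power = F * v               # watts spent raising the Moon
per_kg = power / m_earth    # watts per kilogram of Earth
warming = per_kg / c_rock   # kelvins per second
print(f"{power:.1e} W, {per_kg:.1e} W/kg, {warming:.1e} K/s")
```

Even summed over a year the warming is around 10⁻⁹ kelvin, which backs up the "not significant" verdict.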

“No blaming climate change on the Moon, huh?”

~~ Rich Olcott

Free Energy, or Not

From: Richard Feder <rmfeder@fortleenj.com>
To: Sy Moire <sy@moirestudies.com>
Subj: Questions

What’s this about “free energy”? Is that energy that’s free to move around anywhere? Or maybe the vacuum energy that this guy said is in the vacuum of space that will transform the earth into a wonderful world of everything for free for everybody forever once we figure out how to handle the force fields and pull energy out of them?


From: Sy Moire <sy@moirestudies.com>
To: Richard Feder <rmfeder@fortleenj.com>

Subj: Re: Questions

Well, Mr Feder, as usual you have a lot of questions all rolled up together. I’ll try to take one at a time.

It’s clear you already know that to make something happen you need energy. Not a very substantial definition, but then energy is an abstract idea that humanity took a couple of hundred years to get its mind around, and we’re still learning.

Physics has several more formal definitions for “energy,” all clustered around the ability to exert force to move something and/or heat something up. The “and/or” is the kicker, because it turns out you can’t do just the moving. As one statement of the Second Law of Thermodynamics puts it, “There are no perfectly efficient processes.”

For example, when your car’s engine burns a few drops of gasoline in the cylinder, the liquid becomes a 22000‑times larger volume of hot gas that pushes the piston down in its power stroke to move the car forward. In the process, though, the engine heats up (wasted energy), gases exiting the cylinder are much hotter than air temperature (more wasted energy) and there’s friction‑generated heat all through the drive train (even more waste). Improving the drive train’s lubrication can reduce friction, but there’s no way to stop energy loss into heated-up combustion product molecules.

Two hundred years of effort haven’t uncovered a usable loophole in the Second Law. However, we have been able to quantify it. Especially for practically important chemical reactions, like burning gasoline, scientists can calculate how much energy the reaction product molecules will retain as heat. The energy available to do work is what’s left.

For historical reasons, the “available to do work” part is called “free energy.” Not free like running about like ball lightning, but free in the sense of not being bound up in jiggling heated‑up molecules.

Vacuum energy is just the opposite of free — it’s bound up in the structure of space itself. We’ve known for a century that atoms waggle back and forth within their molecules. Those vibrations give rise to the infrared spectra we use for remote temperature sensing and for studying planetary atmospheres. One of the basic results of quantum mechanics is that there’s a minimum amount of motion, called zero‑point vibration, that would persist even if the molecule were frozen to absolute zero temperature.

There are other kinds of zero‑point motion. We know of two phenomena, the Casimir effect and the Lamb shift, that can be explained by assuming that the electric field and other force fields “vibrate” at the ultramicroscopic scale even in the absence of matter. Not vibrations like going up and down, but like getting more and less intense. It’s possible that the same “vibrations” spark radioactive decay and some kinds of light emission.

Visualize space being marked off with a mesh of cubes. In each cube one or more fields more‑or‑less periodically intensify and then relax. The variation strength and timing are unpredictable. Neighboring cubes may or may not sync up and that’s unpredictable, too.

The activity is all governed by yet another Heisenberg Uncertainty Principle trade‑off. The stronger the intensification, the less certain we can be about when or where the next one will happen.

What we can say is that whether you look at a large volume of space (even an atom is ultramicroscopically huge) or a long period of time (a second might as well be a millennium), on the average the intensity is zero. All our energy‑using techniques involve channeling energy from a high‑potential source to a low‑potential sink. Vacuum energy sources are everywhere but so are the sinks and they all flit around. Catching lightning in a jar was easy by comparison.

Regards,
Sy Moire.

~~ Rich Olcott

The Battle of The Entropies

(the coffee-shop saga continues)  “Wait on, Sy, a black hole is a hollow sphere?”

I hadn’t noticed her arrival but there was Jennie, standing by Vinnie’s table and eyeing Jeremy who was still eyeing Anne in her white satin.

“That’s not quite what I said, Jennie.  Old Reliable’s software and I worked up a hollow-shell model and to my surprise it’s consistent with one of Stephen Hawking’s results.  That’s a long way from saying that’s what a black hole is.”

“But you said some physicists say that.  Have they aught to stand on?”

“Sort of.  It’s a perfect case of ‘depends on where you’re standing.’”

Vinnie looked up.  “It’s frames again, ain’t it?”

“With black holes it’s always frames, Vinnie.  Hey, Jeremy, is a black hole something you could stand on?”

“Nosir, we said the hole’s event horizon is like Earth’s orbit, just a mathematical marker.  Except for the gravity and the three Perils Jennie and you and me talked about, I’d slide right through without feeling anything weird, right?”

“Good memory and just so.  In your frame of reference there’s nothing special about that surface — you wouldn’t experience scale changes in space or time when you encounter it.  In other frames, though, it’s special.  Suppose we’re standing a thousand miles away from a solar-size black hole and Jeremy throws a clock and a yardstick into it.  What would we see?”

“This is where those space compression and time dilation effects happen, innit?”

“You bet, Jennie.  Do you remember the formula?”

“I wrote it in my daybook … Ah, here it is —
  f = 1 / √(1 − D/2d)
My notes say D is the black hole’s diameter and d is another object’s distance from its center.  One second in the falling object’s frame would look like f seconds to us.  But one mile would look like 1/f miles.  The event horizon is where d equals the half-diameter and f goes infinite.  The formula only works where the object stays outside the horizon.”
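Jennie's factor is easy to play with numerically. A sketch (the function name is mine, not from her daybook):

```python
import math

# Sketch of the dilation factor f = 1 / sqrt(1 - D/(2*d)),
# valid only while the object stays outside the horizon (d > D/2).
def dilation_factor(D, d):
    """Seconds we see per one second of the falling object's time."""
    return 1.0 / math.sqrt(1.0 - D / (2.0 * d))

print(round(dilation_factor(1.0, 1000.0), 4))   # far away: barely above 1
print(round(dilation_factor(1.0, 0.50005), 1))  # just outside the horizon: huge
```

As d closes in on D/2 the factor blows up, which is exactly the clock-freezing effect Jeremy describes next.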

“And as your clock approaches the horizon, Jeremy…?”

“You’ll see my clock go slower and slower until it sto —.  Oh.  Oh!  That’s why those physicists think all the infalling mass is at the horizon, the stuff falls towards it forever and never makes it through.”

“Exactly.”

“Hey, waitaminute!  If all that mass never gets inside, how’d the black hole get started in the first place?”

“That’s why it’s only some physicists, Vinnie.  The rest don’t think we understand the formation process well enough to make guesses in public.”

“Wait, that formula’s crazy, Sy.  If something ever does get to where d is less than D/2, then what’s inside the square root becomes negative.  A clock would show imaginary time and a yardstick would go imaginary, too.  What’s that about?”

“Good eye, Anne, but no worries, the derivation of that formula explicitly assumes a weak gravitational field.  That’s not what we’ve got inside or even close to the event horizon.”

“Mmm, OK, but I want to get back to the entropy elephant.  Does black hole entropy have any connection to the other kinds?”

“Structural, mostly.  The numbers certainly don’t play well together.  Here’s an example I ran up recently on Old Reliable.  Say we’ve got a black hole twice the mass of the Sun, and it’s at the Hawking temperature for its mass, about 31 billionths of a kelvin.  Just for grins, let’s say it’s made of solid hydrogen.  Old Reliable calculated two entropies for that thing, one based on classical thermodynamics and the other based on the Bekenstein-Hawking formulation.”

<Old Reliable displays the two entropy calculations>

“Wow, Old Reliable looks up stuff and takes care of unit conversions automatically?”

“Slick, eh, Jeremy?  That calculation up top for S_chem is classical chemical thermodynamics.  A pure sample of any element at absolute zero temperature is defined to have zero entropy.  Chemical entropy is the running sum, as the sample warms up, of each little bit of absorbed heat divided by the temperature where it’s absorbed.  The Hawking temperature is so close to zero I could treat heat capacity as a constant.

“In the middle section I calculated the object’s surface area in square Planck-lengths l_P², and in the bottom section I used Hawking’s formula to convert area to B-H entropy, S_BH.  They disagree by a factor of 10³³.”
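The Bekenstein-Hawking number can be roughed out directly: horizon area divided by four Planck areas, in units of Boltzmann's constant. My sketch, not Old Reliable's actual run:

```python
import math

# Sketch: Bekenstein-Hawking entropy of a 2-solar-mass black hole,
# S_BH = (horizon area) / (4 Planck areas), in units of Boltzmann's constant.
G, c, l_P = 6.674e-11, 2.998e8, 1.616e-35   # SI constants
M = 2 * 1.989e30                            # two solar masses, kg

r_s = 2 * G * M / c**2          # Schwarzschild radius, ~5.9 km
area = 4 * math.pi * r_s**2     # horizon area, m^2
S_BH = area / (4 * l_P**2)      # entropy in units of k_B
print(f"r_s = {r_s:.2e} m, S_BH = {S_BH:.1e} k_B")
```

The result lands around 4×10⁷⁷ in units of k_B, an absurdly large number compared with anything classical thermodynamics produces for an ordinary cold lump of matter.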

A moment of shocked silence, and then…

~~ Rich Olcott

Enter the Elephant, stage right

“Anne?”

“Mm?”

“Remember when you said that other reality, the one without the letter ‘C,’  felt more probable than this one?”

“Mm-mm.”

“What tipped you off?”

“Now you’re asking?”

“I’m a physicist, physicists think about stuff.  Besides, we’ve finished the pizza.”

<sigh> “This conversation has gotten pretty improbable, if you ask me.  Oh, well.  Umm, I guess it’s two things.  The more-probable realities feel denser somehow, and more jangly. What got you on this track?”

“Conservation of energy.  Einstein’s E=mc² says your mass embodies a considerable amount of energy, but when you jump out of this reality there’s no flash of light or heat, just that fizzing sound.  When you come back, no sudden chill or things falling down on us, just the same fizzing.  Your mass-energy has to go to or come from somewhere.  I can’t think where or how.”

“I certainly don’t know, I just do it.  Do you have any physicist guesses?”

“Questions first.”

“If you must.”

“It’s what I do.  What do you perceive during a jump?  Maybe something like falling, or heat or cold?”

“There’s not much ‘during.’  It’s not like I go through a tunnel, it’s more like just turning around.  What I see goes out of focus briefly.  Mostly it’s the fizzy sound and I itch.”

“Itch.  Hmm…  The same itch every jump?”

“That’s interesting.  No, it’s not.  I itch more if I jump to a more-probable reality.”

“Very interesting.  I’ll bet you don’t get that itch if you’re doing a pure time-hop.”

“You’re right!  OK, you’re onto something, give.”

“You’ve met one of my pet elephants.”

“Wha….??”

“A deep question that physics has been nibbling around for almost two centuries.  Like the seven blind men and the elephant.  Except the physicists aren’t blind and the elephant’s pretty abstract.  Ready for a story?”

“Pour me another and I will be.”

“Here you go.  OK, it goes back to steam engines.  People were interested in getting as much work as possible out of each lump of coal they burned.  It took a couple of decades to develop good quantitative concepts of energy and work so they could grade coal in terms of energy per unit weight, but they got there.  Once they could quantify energy, they discovered that each material they measured — wood, metals, water, gases — had a consistent heat capacity.  It always took the same amount of energy to raise its temperature across a given range.  For a kilogram of water at 25°C, for instance, it takes one kilocalorie to raise its temperature to 26°C.  Lead and air take less.”

“So where’s the elephant come in?”

“I’m getting there.  We started out talking about steam engines, remember?  They work by letting steam under pressure push a piston through a cylinder.  While that’s happening, the steam cools down before it’s puffed out as that classic old-time Puffing Billy ‘CHUFF.’  Early engine designers thought the energy pushing the piston just came from trading off pressure for volume.  But a guy named Carnot essentially invented thermodynamics when he pointed out that the cooling-down was also important.  The temperature drop meant that heat energy stored in the steam must be contributing to the piston’s motion because there was no place else for it to go.”

“I want to hear about the elephant.”

“Almost there.  The question was, how to calculate the heat energy.”

“Why not just multiply the temperature change by the heat capacity?”

“That’d work if the heat capacity were temperature-independent, which it isn’t.  What we do is sum up the heat absorbed at each intervening temperature, dividing each little increment by the temperature where it’s absorbed.  Call the sum ‘elephant’ though it’s better known as Entropy.  Pressure, Volume, Temperature and Entropy define the state of a gas.  Using those state functions all you need to know is the working fluid’s initial and final state and you can calculate your engine.  Engineers and chemists do process design and experimental analysis using tables of reported state function values for different substances at different temperatures.”
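That bookkeeping — each bit of heat weighted by one over its temperature — can be sketched numerically. A toy midpoint-rule sum (my code; a constant heat capacity stands in for the real temperature-dependent one):

```python
import math

# Toy sketch of the entropy sum: dS = C(T)/T * dT over a warming range.
def entropy_change(C_of_T, T_start, T_end, steps=100_000):
    dT = (T_end - T_start) / steps
    total = 0.0
    for i in range(steps):
        T = T_start + (i + 0.5) * dT   # midpoint rule
        total += C_of_T(T) / T * dT
    return total

# With constant C this reproduces the textbook C * ln(T2/T1):
# 1 kg of water (C ~ 4184 J/K) warmed from 298 K to 299 K.
S = entropy_change(lambda T: 4184.0, 298.0, 299.0)
print(round(S, 3))
```

Swapping the lambda for a measured C(T) table is exactly how the tabulated state-function values get built.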

“Do they know why heat capacity changes?”

“That took a long time to work out, which is part of why entropy’s an elephant.  And you’ve just encountered the elephant’s trunk.”

“There’s more elephant?”

“And more of this.  Want a refill?”

~~ Rich Olcott