# Two Against One, And It’s Not Even Close

I’m on a brisk walk across campus when I hear Vinnie yell from Al’s coffee shop. “Hey! Sy! Me and Al got this argument going you gotta settle.”

“Happy to be a peacemaker, but it’ll cost you a mug of Al’s coffee and a strawberry scone.”

“Coffee’s no charge, Sy, but the scone goes on Vinnie’s tab. What’s your pleasure?”

“It’s morning, Al, time for black mud. What’s the argument, Vinnie?”

“Al read in one of his astronomy magazines that the Moon’s drifting away from us. Is that true, and if it is, how’s it happen? Al thinks Jupiter’s gravity’s lifting it but I think it’s because of Solar winds pushing it. So which is it?”

“Here you go, Sy, straight from the bottom of the pot.”

“Perfect, Al, thanks. Yes, it’s true. The drift rate is about 1¼ nanometers per second, 1½ inches per year. As to your argument, you’re both wrong.”

“Huh?”
“Aw, c’mon!”

“Al, let’s put some numbers to your hypothesis. <pulling out Old Reliable and screen‑tapping> I’m going to compare Jupiter’s pull on the Moon to Earth’s when the two planets are closest together. OK?”

“I suppose.”

“Alright. Newton’s Law tells us the pull is proportional to the mass. Jupiter’s mass is about 320 times Earth’s, which is pretty impressive, right? But the attraction drops with the square of the distance. The Moon is 1¼ lightseconds from Earth. At closest approach, Jupiter is almost 2100 lightseconds away, 1680 times farther than the Moon. We need to divide the 320 mass factor by a 1680‑squared distance factor, and that makes <key taps> Jupiter’s pull on the Moon only 0.011 percent of Earth’s. It’ll be <taps> half that when Jupiter’s on the other side of the Sun. Not much competition, eh?”
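Sy’s percentage is easy to reproduce; a few lines of Python redo the arithmetic (the 320, 1¼ and 2100 figures are his rounded inputs, not precise ephemeris values):

```python
# Check Sy's ratio: Jupiter's pull on the Moon vs Earth's pull on the Moon.
# In Newton's law F = G*M*m/r^2, both G and the Moon's mass cancel in the
# ratio, leaving (mass factor) / (distance factor)^2.
mass_ratio = 320        # Jupiter's mass in Earth masses (rounded)
moon_dist = 1.25        # Earth-Moon distance, light-seconds
jupiter_near = 2100     # Earth-Jupiter distance at closest approach, light-seconds

distance_factor = jupiter_near / moon_dist       # the 1680 in Sy's speech
pull_ratio = mass_ratio / distance_factor**2     # Jupiter's pull / Earth's pull
print(f"distance factor: {distance_factor:.0f}")
print(f"Jupiter's pull is {100 * pull_ratio:.3f}% of Earth's")
```

Because the ratio is dimensionless, light-seconds work as well as meters here; only the units have to match top and bottom.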

“Yeah, but a little bit at a time, it adds up.”

“We’re not done yet. The Moon feels the big guy’s pull on both sides of its orbit around Earth. On the side where the Moon’s moving away from Jupiter, you’re right, Jupiter’s gravity slows the Moon down, a little. But on the moving-toward-Jupiter side, the motion’s sped up. Put it all together, Jupiter’s teeny pull cancels itself out over every month’s orbiting.”

“Gotcha, Al. So what about my theory, Sy?”

“Basically the same logic, Vinnie. The Solar wind varies, thanks to the Sun’s variable activity, but satellite measurements put its pressure somewhere around a nanopascal, a nanonewton per square meter. Multiply that by the Moon’s cross‑sectional area and we get <tap, tap> a bit less than ten thousand newtons of force on the Moon. Meanwhile, Newton’s Law says the Earth’s pull on the Moon comes to <tapping>
G×(Earth’s mass)×(Moon’s mass)/(Earth-Moon distance)²
and that comes to 2×10²⁰ newtons. Earth wins by a 10¹⁶‑fold landslide. Anyway, the pressure slows the Moon for only half of each month and speeds it up the other half so we’ve got another cancellation going on.”
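The same showdown can be rerun in SI units from standard constants (the one‑nanopascal wind pressure is an order‑of‑magnitude stand‑in for a quantity that varies a lot with solar activity):

```python
import math

G = 6.674e-11        # gravitational constant, N*m^2/kg^2
M_EARTH = 5.97e24    # Earth's mass, kg
M_MOON = 7.35e22     # Moon's mass, kg
R_ORBIT = 3.84e8     # Earth-Moon distance, m
R_MOON = 1.74e6      # Moon's radius, m
P_WIND = 1e-9        # solar-wind pressure, Pa (order of magnitude only)

f_wind = P_WIND * math.pi * R_MOON**2           # pressure times cross-section
f_gravity = G * M_EARTH * M_MOON / R_ORBIT**2   # Newton's law
print(f"solar-wind push: {f_wind:.1e} N")
print(f"Earth's pull:    {f_gravity:.1e} N")
print(f"gravity wins by a factor of {f_gravity / f_wind:.0e}")
```

The push comes out a bit under ten thousand newtons, the pull near 2×10²⁰, a mismatch of roughly sixteen orders of magnitude.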

“So what is it then?”
“So what is it then?”

“Tides. Not just ocean tides, rock tides in Earth’s fluid outer mantle. Earth bulges, just a bit, toward the Moon. But Earth also rotates, so the bulge circles the planet every day.”

“Reminds me of the wave in the Interstellar movie, but why don’t we see it?”

“The movie’s wave was hundreds of times higher than ours, Al. It was water, not rock, and the wave‑raiser was a huge black hole close by the planet. The Moon’s tidal pull on Earth produces only a one‑meter variation on a 6,400,000‑meter radius. Not a big deal to us. Of course, it makes a lot of difference to the material that’s being kneaded up and down. There’s a lot of friction in those layers.”

“Friction makes heat, Sy. Rock tides oughta heat up the planet, right?”

“Sure, Vinnie, the process does generate heat. Force times distance equals energy. Raising the Moon by 1¼ nanometers per second against a force of 2×10²⁰ newtons gives us <tapping furiously> an energy transfer rate of 4×10⁻¹⁴ joules per second per kilogram of Earth’s 6×10²⁴‑kilogram mass. It takes about a thousand joules to heat a kilogram of rock by one kelvin so we’re looking at a temperature rise near 4×10⁻¹⁷ kelvins per second. Not significant.”
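A quick recomputation of the heating estimate, taking the Earth‑Moon pull of about 2×10²⁰ newtons from Newton’s law and treating “force times distance” as a rough upper bound on the heat generated:

```python
# Recheck Sy's tidal-heating estimate from first principles.
force = 2.0e20      # Earth-Moon gravitational pull, N (Newton's law)
drift = 1.25e-9     # Moon's recession rate, m/s (1.25 nm/s)
m_earth = 6.0e24    # Earth's mass, kg
c_rock = 1000.0     # rough heat capacity of rock, J/(kg*K)

power = force * drift       # energy transfer rate, W
per_kg = power / m_earth    # watts per kilogram of Earth
dT_dt = per_kg / c_rock     # kelvins per second
print(f"power:   {power:.1e} W")
print(f"heating: {dT_dt:.1e} K/s")
```

Even over a billion years (about 3×10¹⁶ seconds) that rate adds only on the order of a kelvin, which is why Sy calls it insignificant.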

“No blaming climate change on the Moon, huh?”

~~ Rich Olcott

# Free Energy, or Not

From: Richard Feder <rmfeder@fortleenj.com>
To: Sy Moire <sy@moirestudies.com>
Subj: Questions

What’s this about “free energy”? Is that energy that’s free to move around anywhere? Or maybe the vacuum energy that this guy said is in the vacuum of space that will transform the earth into a wonderful world of everything for free for everybody forever once we figure out how to handle the force fields and pull energy out of them?

From: Sy Moire <sy@moirestudies.com>
To: Richard Feder <rmfeder@fortleenj.com>

Subj: Re: Questions

Well, Mr Feder, as usual you have a lot of questions all rolled up together. I’ll try to take one at a time.

It’s clear you already know that to make something happen you need energy. Not a very substantial definition, but then energy is an abstract thing that took humanity a couple of hundred years to get our minds around, and we’re still learning.

Physics has several more formal definitions for “energy,” all clustered around the ability to exert force to move something and/or heat something up. The “and/or” is the kicker, because it turns out you can’t do just the moving. As one statement of the Second Law of Thermodynamics puts it, “There are no perfectly efficient processes.”

For example, when your car’s engine burns a few drops of gasoline in the cylinder, the liquid becomes a 22000‑times larger volume of hot gas that pushes the piston down in its power stroke to move the car forward. In the process, though, the engine heats up (wasted energy), gases exiting the cylinder are much hotter than air temperature (more wasted energy) and there’s friction‑generated heat all through the drive train (even more waste). Improving the drive train’s lubrication can reduce friction, but there’s no way to stop energy loss into heated-up combustion product molecules.

Two hundred years of effort haven’t uncovered a usable loophole in the Second Law. However, we have been able to quantify it. Especially for practically important chemical reactions, like burning gasoline, scientists can calculate how much energy the reaction product molecules will retain as heat. The energy available to do work is what’s left.

For historical reasons, the “available to do work” part is called “free energy.” Not free like running about like ball lightning, but free in the sense of not being bound up in jiggling heated‑up molecules.

Vacuum energy is just the opposite of free — it’s bound up in the structure of space itself. We’ve known for a century that atoms waggle back and forth within their molecules. Those vibrations give rise to the infrared spectra we use for remote temperature sensing and for studying planetary atmospheres. One of the basic results of quantum mechanics is that there’s a minimum amount of motion, called zero‑point vibration, that would persist even if the molecule were frozen to absolute zero temperature.

There are other kinds of zero‑point motion. We know of two phenomena, the Casimir effect and the Lamb shift, that can be explained by assuming that the electric field and other force fields “vibrate” at the ultramicroscopic scale even in the absence of matter. Not vibrations like going up and down, but like getting more and less intense. It’s possible that the same “vibrations” spark radioactive decay and some kinds of light emission.

Visualize space being marked off with a mesh of cubes. In each cube one or more fields more‑or‑less periodically intensify and then relax. The variations’ strength and timing are unpredictable. Neighboring cubes may or may not sync up, and that’s unpredictable, too.

The activity is all governed by yet another Heisenberg Uncertainty Principle trade‑off. The stronger the intensification, the less certain we can be about when or where the next one will happen.

What we can say is that whether you look at a large volume of space (even an atom is ultramicroscopically huge) or a long period of time (a second might as well be a millennium), on the average the intensity is zero. All our energy‑using techniques involve channeling energy from a high‑potential source to a low‑potential sink. Vacuum energy sources are everywhere but so are the sinks and they all flit around. Catching lightning in a jar was easy by comparison.

Regards,
Sy Moire.

~~ Rich Olcott

# The Battle of The Entropies

(the coffee-shop saga continues)  “Wait on, Sy, a black hole is a hollow sphere?”

I hadn’t noticed her arrival, but there was Jennie, standing by Vinnie’s table and eyeing Jeremy, who was still eyeing Anne in her white satin.  “That’s not quite what I said, Jennie.  Old Reliable’s software and I worked up a hollow-shell model, and to my surprise it’s consistent with one of Stephen Hawking’s results.  That’s a long way from saying that’s what a black hole is.”

“But you said some physicists say that.  Have they aught to stand on?”

“Sort of.  It’s a perfect case of ‘depends on where you’re standing.'”

Vinnie looked up.  “It’s frames again, ain’t it?”

“With black holes it’s always frames, Vinnie.  Hey, Jeremy, is a black hole something you could stand on?”

“Nosir, we said the hole’s event horizon is like Earth’s orbit, just a mathematical marker.  Except for the gravity and  the  three  Perils  Jennie and you and me talked about, I’d slide right through without feeling anything weird, right?”

“Good memory and just so.  In your frame of reference there’s nothing special about that surface — you wouldn’t experience scale changes in space or time when you encounter it.  In other frames, though, it’s special.  Suppose we’re standing a thousand miles away from a solar-size black hole and Jeremy throws a clock and a yardstick into it.  What would we see?”

“This is where those space compression and time dilation effects happen, innit?”

“You bet, Jennie.  Do you remember the formula?”

“I wrote it in my daybook … Ah, here it is — f = 1/√(1 − (D/2)/d), where D is the black hole’s diameter and d is another object’s distance from its center.  One second in the falling object’s frame would look like f seconds to us.  But one mile would look like 1/f miles.  The event horizon is where d equals the half-diameter and f goes infinite.  The formula only works where the object stays outside the horizon.”
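Jennie’s daybook formula is the Schwarzschild dilation factor, f = 1/√(1 − (D/2)/d); here’s a short sketch of how f blows up near the horizon, using an illustrative 6‑kilometer diameter close to a one‑solar‑mass hole’s actual 5.9 km:

```python
import math

def dilation_factor(d, D):
    """Time-dilation factor f for an object at distance d from the center
    of a black hole of diameter D; valid only outside the horizon, d > D/2."""
    if d <= D / 2:
        raise ValueError("formula only applies outside the event horizon")
    return 1.0 / math.sqrt(1.0 - (D / 2) / d)

D = 6.0  # km, illustrative stand-in for a solar-size black hole
for d in (30.0, 6.0, 3.3, 3.03):
    print(f"d = {d:6.2f} km  ->  f = {dilation_factor(d, D):7.2f}")
```

The distant observer sees f creep upward slowly, then diverge as d approaches D/2, which is exactly Jeremy’s “slower and slower until it sto —” clock.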

“And as your clock approaches the horizon, Jeremy…?”

“You’ll see my clock go slower and slower until it sto —.  Oh.  Oh!  That’s why those physicists think all the infalling mass is at the horizon, the stuff falls towards it forever and never makes it through.”

“Exactly.”

“Hey, waitaminute!  If all that mass never gets inside, how’d the black hole get started in the first place?”

“That’s why it’s only some physicists, Vinnie.  The rest don’t think we understand the formation process well enough to make guesses in public.”

“Wait, that formula’s crazy, Sy.  If something ever does get to where d is less than D/2, then what’s inside the square root becomes negative.  A clock would show imaginary time and a yardstick would go imaginary, too.  What’s that about?”

“Good eye, Anne, but no worries, the derivation of that formula explicitly assumes a weak gravitational field.  That’s not what we’ve got inside or even close to the event horizon.”

“Mmm, OK, but I want to get back to the entropy elephant.  Does black hole entropy have any connection to the other kinds?”

“Structural, mostly.  The numbers certainly don’t play well together.  Here’s an example I ran up recently on Old Reliable.  Say we’ve got a black hole twice the mass of the Sun, and it’s at the Hawking temperature for its mass, about 31 billionths of a kelvin.  Just for grins, let’s say it’s made of solid hydrogen.  Old Reliable calculated two entropies for that thing, one based on classical thermodynamics and the other based on the Bekenstein-Hawking formulation.”

“Wow, Old Reliable looks up stuff and takes care of unit conversions automatically?”

“Slick, eh, Jeremy?  That calculation up top for S_chem is classical chemical thermodynamics.  A pure sample of any element at absolute zero temperature is defined to have zero entropy.  Chemical entropy is cumulative heat capacity as the sample warms up.  The Hawking temperature is so close to zero I could treat heat capacity as a constant.

“In the middle section I calculated the object’s surface area in square Planck‑lengths l_P², and in the bottom section I used Hawking’s formula to convert area to B-H entropy, S_BH.  They disagree by a factor of 10³³.”
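The Bekenstein-Hawking side of Old Reliable’s comparison can be reproduced from standard constants; the chemical side needs solid‑hydrogen heat‑capacity tables, so it’s left out here:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
HBAR = 1.055e-34   # reduced Planck constant, J*s
K_B = 1.381e-23    # Boltzmann constant, J/K
M_SUN = 1.989e30   # solar mass, kg

M = 2 * M_SUN                   # Sy's two-solar-mass black hole
r_s = 2 * G * M / C**2          # Schwarzschild radius, m
area = 4 * math.pi * r_s**2     # horizon area, m^2
l_p2 = HBAR * G / C**3          # Planck length squared, m^2
S_BH = K_B * area / (4 * l_p2)  # Bekenstein-Hawking entropy, J/K
T_H = HBAR * C**3 / (8 * math.pi * G * M * K_B)   # Hawking temperature, K

print(f"Schwarzschild radius: {r_s / 1000:.1f} km")
print(f"Hawking temperature:  {T_H:.1e} K")
print(f"S_BH = {S_BH:.1e} J/K  ({S_BH / K_B:.1e} in units of k)")
```

The horizon area divided by 4 l_P² gives the entropy in units of Boltzmann’s constant, about 4×10⁷⁷ here, which is why a 10³³‑fold disagreement with any chemical estimate is still a shock.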

A moment of shocked silence, and then…

~~ Rich Olcott

# Enter the Elephant, stage right

“Anne?”

“Mm?”

“Remember when you said that other reality, the one without the letter ‘C,’  felt more probable than this one?”

“Mm-mm.”

“What tipped you off?”

“Now you’re asking?”

“I’m a physicist, physicists think about stuff.  Besides, we’ve finished the pizza.”

<sigh> “This conversation has gotten pretty improbable, if you ask me.  Oh, well.  Umm, I guess it’s two things.  The more-probable realities feel denser somehow, and more jangly. What got you on this track?”

“Conservation of energy.  Einstein’s E=mc² says your mass embodies a considerable amount of energy, but when you jump out of this reality there’s no flash of light or heat, just that fizzing sound.  When you come back, no sudden chill or things falling down on us, just the same fizzing.  Your mass-energy has to go to or come from somewhere.  I can’t think where or how.”

“I certainly don’t know, I just do it.  Do you have any physicist guesses?”

“Questions first.”

“If you must.”

“It’s what I do.  What do you perceive during a jump?  Maybe something like falling, or heat or cold?”

“There’s not much ‘during.’  It’s not like I go through a tunnel, it’s more like just turning around.  What I see goes out of focus briefly.  Mostly it’s the fizzy sound and I itch.”

“Itch.  Hmm…  The same itch every jump?”

“That’s interesting.  No, it’s not.  I itch more if I jump to a more-probable reality.”

“Very interesting.  I’ll bet you don’t get that itch if you’re doing a pure time-hop.”

“You’re right!  OK, you’re onto something, give.”

“You’ve met one of my pet elephants.”

“Wha….??”

“A deep question that physics has been nibbling around for almost two centuries.  Like the seven blind men and the elephant.  Except the physicists aren’t blind and the elephant’s pretty abstract.  Ready for a story?”

“Pour me another and I will be.”

“Here you go.  OK, it goes back to steam engines.  People were interested in getting as much work as possible out of each lump of coal they burned.  It took a couple of decades to develop good quantitative concepts of energy and work so they could grade coal in terms of energy per unit weight, but they got there.  Once they could quantify energy, they discovered that each material they measured — wood, metals, water, gases — had a consistent heat capacity.  It always took the same amount of energy to raise its temperature across a given range.  For a kilogram of water at 25°C, for instance, it takes one kilocalorie to raise its temperature to 26°C.  Lead and air take less.”

“So where’s the elephant come in?”

“I’m getting there.  We started out talking about steam engines, remember?  They work by letting steam under pressure push a piston through a cylinder.  While that’s happening, the steam cools down before it’s puffed out as that classic old-time Puffing Billy ‘CHUFF.’  Early engine designers thought the energy pushing the piston just came from trading off pressure for volume.  But a guy named Carnot essentially invented thermodynamics when he pointed out that the cooling-down was also important.  The temperature drop meant that heat energy stored in the steam must be contributing to the piston’s motion because there was no place else for it to go.”

“I want to hear about the elephant.”

“Almost there.  The question was, how to calculate the heat energy.”

“Why not just multiply the temperature change by the heat capacity?”

“That’d work if the heat capacity were temperature-independent, which it isn’t.  What we do is sum up, at each intervening temperature, the increment of heat divided by that temperature.  Call the sum ‘elephant’ though it’s better known as Entropy.  Pressure, Volume, Temperature and Entropy define the state of a gas.  Using those state functions all you need to know is the working fluid’s initial and final state and you can calculate your engine.  Engineers and chemists do process design and experimental analysis using tables of reported state function values for different substances at different temperatures.”
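Sy’s sum-it-up recipe is the entropy integral ΔS = ∫ C(T)/T dT; here’s a sketch with a made‑up heat‑capacity curve standing in for the real tables:

```python
# Entropy change as the running sum of (heat capacity / temperature) * dT.
def heat_capacity(temp):
    """Hypothetical C(T) in J/(kg*K); illustration only, not real data."""
    return 4000.0 + 0.5 * temp

def entropy_change(t1, t2, steps=10000):
    """Midpoint-rule sum of C(T)/T across [t1, t2], in J/(kg*K)."""
    dt = (t2 - t1) / steps
    total = 0.0
    for i in range(steps):
        t = t1 + (i + 0.5) * dt     # midpoint of this temperature slice
        total += heat_capacity(t) / t * dt
    return total

dS = entropy_change(298.15, 299.15)   # warming a kilogram from 25 °C to 26 °C
print(f"dS = {dS:.2f} J/(kg*K)")
```

Because C(T) sits inside the sum, a temperature-dependent heat capacity is no harder to handle than a constant one, which is the point Sy is making.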

“Do they know why heat capacity changes?”

“That took a long time to work out, which is part of why entropy’s an elephant.  And you’ve just encountered the elephant’s trunk.”

“There’s more elephant?”

“And more of this.  Want a refill?”

~~ Rich Olcott