# Baseball And The Virtual Particle

Al was pouring my mugful of his morning blend (“If it doesn’t wake you up we’ll call the doctor”) when Jeremy stepped up to the counter.  “Hi, Mr Moire.  I’m still trying to get my head around that virtual particle thing.  Hi, Al, a large decaf, please, double sugar, three creamers.  It looks like the shorter amount of time you give a particle to happen, the bigger it can get, but that doesn’t make sense because I’d think the longer you wait the more likely it’s gonna happen.  Thanks, Al.”

“Take a breath to blow on that coffee, Jeremy, or you’ll burn your tongue.  Hmm…  Word is your batting average is running about 250 these days.  That right?”

“Yessir.  I didn’t know you’re keeping track.”

“Keeping my ears open is part of my job.  So you’re hitting about once every four at-bats.  That gives Coach an estimate of when you’ll get your next hit.  What’s your slugging average?”

“What’s a slugging average?”

“Your total number of bases earned on hits, divided by your at-bats, times a thousand ’cause sports writers don’t do decimal points.  You get one count in the numerator for a single, two for a double and so on.”

“Lemme think.  If I’m doing 250 overall and about half are singles and the other half are doubles that’d give me an SA of … about 375.”
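(A quick sanity check on Jeremy’s arithmetic, sketched in Python.  The numbers are the hypothetical ones from the conversation, not real stats, and the function name is mine:)

```python
def slugging_average(bases_per_hit, at_bats):
    """Total bases earned on hits, divided by at-bats, times 1000.

    bases_per_hit lists the bases from each hit: 1 for a single,
    2 for a double, 3 for a triple, 4 for a home run.
    """
    return round(1000 * sum(bases_per_hit) / at_bats)

# 40 at-bats at a .250 clip gives 10 hits: half singles, half doubles.
hits = [1] * 5 + [2] * 5
print(slugging_average(hits, 40))   # -> 375, matching Jeremy's estimate

# The extremes Mr Moire mentions: all homers vs. all whiffs.
print(slugging_average([4] * 40, 40))   # -> 4000
print(slugging_average([], 40))         # -> 0
```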

“Pretty good.  So does that number tell Coach anything about when to expect another double?”

“Mmm, no, but what does that have to do with my virtual particle question?”

“In each case you’ve got a pair of statistics that tell you some things and hide other things.  Batting averages and your wait-time notion are about when to expect an event of some sort to occur.  You could hit another single or you could tag a homer — all Coach knows is that you should be able to get on base about once every four at-bats.”

“They’re the flip side, sort of.  You could think of the SA as batting potential.  If you hit homers all the time your SA would be 4000.  If you whiff every pitch your SA would be zero.  Anything between those extremes tells Coach something about your productivity but nothing about when you’re going to produce.  Energy uncertainty works the same way for virtual particles.  If you’re doing long-duration energy evaluations you can be pretty sure that any single measurement will be close to the long-term average.  You might possibly see a significant deviation from that average but only if you check just the right brief interval.”

“And for the particles in that empty space?”

“If you’re looking long-term, no particles.  That’s what ‘empty’ means.  When there’s definitely nothing in a volume of space it makes sense to say its energy is zero because particles have mass and therefore embody energy.  But a particle might show up and go away after a very brief interval without significantly affecting that long-term average.  Quantum theory doesn’t say it will show up, just that it might.”

“So does it?”

“Oh yes, in space, in the lab and in commerce.  One explanation for your cell phone’s NFC function hinges on virtual radio-frequency photons being exchanged between devices.”

“Wait.  If a virtual particle shows up in that empty space, then it’s not empty any more and its energy isn’t zero any more, is it?”

“You’ve just discovered one aspect of zero-point energy, the quantum prediction that every system, even empty space, contains a non-zero minimum amount of energy.  People have thought about tapping that energy to power perpetual motion machines.”

“That’d be cool — the ultimate renewable.”

“Wouldn’t it, though?  But no can do, for a couple of reasons.  Virtual particles, by their nature, are random phenomena.  You can’t depend upon what kind of particle might show up, or when, or how long it might hang around.  It’s not like NFC where antennas generate the particles.  The other issue is that ‘minimum’ means minimum.  If you could pull energy out of that space you’d lower its energy content and drop it below the minimum… What’s the grin about?”

“Just wondering how they’d score hitting a virtual ball that disappears before the fielder catches it.”

~~ Rich Olcott

# And now for some completely different dimensions

Terry Pratchett wrote that Knowledge = Power = Energy = Matter = Mass.  Physicists don’t agree because the units don’t match up.

Physicists check equations with a powerful technique called “Dimensional Analysis,” though it’s only loosely related to the “travel in space and time” kinds of dimension we discussed earlier.

It all started with Newton’s mechanics, his study of how objects affect the motion of other objects.  His vocabulary list included words like force, momentum, velocity, acceleration, mass, …, all concepts that seem familiar to us but which Newton either originated or fundamentally re-defined. As time went on, other thinkers added more terms like power, energy and action.

They’re all linked mathematically by various equations, but also by three fundamental dimensions: length (L), time (T) and mass (M). (There are a few others, like electric charge and temperature, that apply to problems outside of mechanics proper.)

Velocity, for example.  (Strictly speaking, velocity is speed in a particular direction but here we’re just concerned with its magnitude.)   You can measure it in miles per hour or millimeters per second or parsecs per millennium — in each case it’s length per time.  Velocity’s dimension expression is L/T no matter what units you use.

Momentum is the product of mass and velocity.  A 6,000-lb Escalade SUV doing 60 miles an hour has twice the momentum of a 3,000-lb compact car traveling at the same speed.  (Insurance companies are well aware of that fact and charge accordingly.)  In terms of dimensions, momentum is M*(L/T) = ML/T.

Acceleration is how rapidly velocity changes — a car clocked at “zero to 60 in 6 seconds” accelerated an average of 10 miles per hour per second.  Time’s in the denominator twice (who cares what the units are?), so the dimensional expression for acceleration is L/T².

Physicists and chemists and engineers pay attention to these dimensional expressions because they have to match up across an equal sign.  Everyone knows Einstein’s equation, E = mc².  The c is the velocity of light.  As a velocity its dimension expression is L/T.  Therefore, the expression for energy must be M*(L/T)² = ML²/T².  See how easy?

Now things get more interesting.  Newton’s original Second Law calculated force on an object by how rapidly its momentum changed: (ML/T)/T.  Later on (possibly influenced by his feud with Leibniz about who invented calculus), he changed that to mass times acceleration, M*(L/T²).  Conceptually they’re different but dimensionally they’re identical — both expressions for force work out to ML/T².

Something seductively similar seems to apply to Heisenberg’s Area.  As we’ve seen, it’s the product of uncertainties in position (L) and momentum (ML/T) so the Area’s dimension expression works out to L*(ML/T) = ML²/T.

There is another way to get the same dimension expression, but things aren’t as nice there as they look at first glance.  Action is the amount of energy expended in a process, multiplied by the duration of that process.  If you take the product of energy and time the dimensions work out as (ML²/T²)*T = ML²/T, just like Heisenberg’s Area.
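The bookkeeping above is mechanical enough that a few lines of code can do it.  Here’s a minimal sketch (the `Dim` class and variable names are mine, not standard library machinery): a dimension is just a triple of exponents for M, L and T, multiplication adds exponents, and division subtracts them.

```python
class Dim:
    """A mechanics dimension as (mass, length, time) exponents."""
    def __init__(self, M=0, L=0, T=0):
        self.e = (M, L, T)
    def __mul__(self, other):          # multiplying quantities adds exponents
        return Dim(*(a + b for a, b in zip(self.e, other.e)))
    def __truediv__(self, other):      # dividing quantities subtracts them
        return Dim(*(a - b for a, b in zip(self.e, other.e)))
    def __eq__(self, other):
        return self.e == other.e
    def __repr__(self):
        return "M^%d L^%d T^%d" % self.e

M, L, T = Dim(M=1), Dim(L=1), Dim(T=1)

velocity     = L / T                     # L/T
momentum     = M * velocity              # ML/T
acceleration = velocity / T              # L/T²
energy       = M * velocity * velocity   # ML²/T², from E = mc²

# Newton's two forms of force agree dimensionally:
assert momentum / T == M * acceleration  # both ML/T²

# Heisenberg's Area (position × momentum) matches action (energy × time):
assert L * momentum == energy * T        # both ML²/T
```

Both asserts pass, which is exactly the seductive coincidence the text describes: the dimensions match even though the physics, as the next paragraph argues, does not.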

It’s so tempting to think that energy and time negotiate precision like position and momentum do.  But they don’t.  In quantum mechanics, time is a driver, not a result.  If you tell me when an event happens (the t-coordinate), I can maybe calculate its energy and such.  But if you tell me the energy, I can’t give you a time when it’ll happen.  The situation reminds me of geologists trying to predict an earthquake.  They’ve got lots of statistics on tremor size distribution and can even give you average time between tremors of a certain size, but when will the next one hit?  Lord only knows.

File the detailed reasoning under “Arcane” — in technicalese, there are operators for position, momentum and energy but there’s no operator for time.  If you’re curious, John Baez’s paper has all the details.  Be warned, it contains equations!

Trust me — if you’ve spent a couple of days going through a long derivation, totting up the dimensions on either side of equations along the way is a great technique for reassuring yourself that you probably didn’t do something stupid back at hour 14.  Or maybe to detect that you did.

~~ Rich Olcott

# The Universe and Werner H.

Heisenberg’s Area (about 10⁻³⁴ joule-second) is small, one ten-millionth of the explosive action in a single molecule of TNT.  OK, that’s maybe important for sub-atomic physics, but it’s way too small to have any implications for anything bigger, right?  Well, it could be responsible for shaping our Universe.

Quick recap: The Heisenberg Uncertainty Principle (HUP) says that certain quantities (for instance, position and momentum) are linked in a remarkable way.  We can’t measure either of them perfectly accurately, but we can make repeated more-or-less sloppy measurements that give us average values.  The linkage is in that sloppiness.  Each repeated measurement lands somewhere in a range of values around the average.  HUP says that even with very careful measurement the product of those two spans must be greater than Heisenberg’s Area.

So now let’s head out to empty space, shall we?  I mean, really empty space, out there between the galaxies, where there’s only about one hydrogen atom per cubic meter.

Here’s a good cubic meter … sure enough, it’s got exactly one hydrogen atom in it.

For practice using Heisenberg’s Area, what can we say about the atom? (If you’re checking my math it’ll help to know that the Area, h/4π, can also be expressed as 0.5×10⁻³⁴ kg·m²/s; the mass of one hydrogen atom is 1.7×10⁻²⁷ kg; and the speed of light is 3×10⁸ m/s.)  On average the atom’s position is at the cube’s center.  Its position range is one meter wide.  Whatever the atom’s average momentum might be, our measurements would land somewhere within a momentum range of (0.5×10⁻³⁴ kg·m²/s) / (1 m) = 0.5×10⁻³⁴ kg·m/s.  A moving particle’s momentum is its mass times its velocity, so the velocity range is (0.5×10⁻³⁴ kg·m/s) / (1.7×10⁻²⁷ kg) = 0.3×10⁻⁷ m/s.

With really good tools we could determine the atom’s velocity within plus or minus 0.000 000 03 m/s.  Pretty good.

Now zoom in.  Dial that one-meter cube down a billion-fold to a nanometer (10⁻⁹ meters, which is still about ten times the atom’s width).  Yeah, the atom’s still in the box, but now its velocity range is 30 m/s.  The atom could be just hanging out at the center, or it could zoom out of the cube an instant after we looked — we just can’t tell which.
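Both box sizes run through the same two-step calculation, so here’s a sketch of it in Python (the function name is mine; the constants are the rounded values quoted above):

```python
H_AREA = 0.5e-34   # Heisenberg's Area, h/(4*pi), in kg·m²/s (rounded)
M_H    = 1.7e-27   # mass of one hydrogen atom, kg

def velocity_spread(box_width_m):
    """Minimum velocity uncertainty for one H atom confined to a box.

    Momentum spread comes from the uncertainty principle; dividing by
    the mass converts it to a velocity spread.
    """
    momentum_spread = H_AREA / box_width_m   # kg·m/s
    return momentum_spread / M_H             # m/s

print(velocity_spread(1.0))    # one-meter cube: ~3e-8 m/s
print(velocity_spread(1e-9))   # one-nanometer cube: ~30 m/s
```

Shrinking the box a billion-fold inflates the velocity spread a billion-fold, which is the whole point of the zoom-in.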

All of which illuminates the contrast between physics Newton-style and the physics that has bloomed since Einstein’s 1905 “miracle year.”  If Newton were in charge of the Universe, Heisenberg’s Area would be zero.  We could determine that atom’s position and momentum with complete accuracy.  In fact in principle we could accurately determine everything’s position and momentum and then calculate where everything would be at any time in the future.  But he isn’t and it’s not and we can’t.

Theorists and experimenters use the word “measurement” in different ways. A measurement done by a theoretician is generally based on fundamental constants and an elaborate mathematical structure. If the measurement is a quantum mechanical result, part of that structure is our familiar bell-shaped curve.  It’s an explicit recognition that way down in the world of the very small, we can’t know what’s really going on.  Most calculations have to be statistical, predicting an average and an expected range about that average. That prediction may or may not pan out, depending on what the experimentalists find.

By contrast, when experimenters measure something, even as an average of multiple tests, it’s an estimate of the real distribution.  The research group (usually it’s a group these days) reports a distribution that they claim overlaps well with a real one out there in the Universe.  Then another group dives in to prove they or the theoreticians or both are wrong.  That’s how Science works.

So there could be a collection of bell-curves gathered about the experimental result. Remember those extra dimensions we discussed earlier?  One theory that’s been floated is that along those extra dimensions the fundamental constants like h might take on different values.  Maybe further along “Dimension W” the value of h is bigger than it is in our Universe, and quantum effects are even more important than they are here.

Now how can we test that?

BTW, Heisenberg will be 114 on Dec 5.  Alles Gute zum Geburtstag, Werner!

~~ Rich Olcott

# Heisenberg’s Area

Unlike politicians, scientists want to know what they’re talking about when they use a technical word like  “Uncertainty.”  When Heisenberg laid out his Uncertainty Principle, he wasn’t talking about doubt.  He was talking about how closely experimental results can cluster together, and he was putting that in numbers.

Think of Robin Hood competing for the Golden Arrow.  For the showmanship of the thing, Robin wasn’t just trying to hit the target, he wanted his arrow to split the Sheriff’s.  If the Sheriff’s shot was in the second ring (moderate accuracy, from the target’s point of view), then Robin’s had to hit exactly the same off-center location (still moderate accuracy but great precision).  The Heisenberg Uncertainty Principle (HUP) is all about precision (a.k.a. range of variation).

We’ve all encountered exams that were graded “on the curve.”  But what curve is that?  I can say from personal experience that it’s extraordinarily difficult to create an exam where  the average grade is 75.  I want to give everyone the chance to show what they’ve learned.  Each student probably learned only part of what’s in the unit, but I won’t know which part until after the exam is graded.  The only way to be fair is to ask about everything in the unit.  Students complained that my tests were really hard because to get 100 they had to know it all.

Translating test scores to grades for a small class was straightforward.  I would plot how many papers got between 95 and 100, how many got 90-95, etc, and look at the graph.  Nearly always it looked like the top example.  There are a few people who clearly have the material down pat; they clearly earned an “A.”  Then there’s a second group who didn’t do as well as the A’s but did significantly better than the rest of the class — they earned a “B.”  At the other end there’s a (hopefully small) group of students who are floundering.  Long-term I tried to give them extra help but short-term I had no choice but to give them an “F.”

With a large class those distinctions get blurred and all I saw (usually) was a single broad range of scores, the well-known “bell-shaped curve.”  If the test was easy the bell was centered around a high score.  If the test was hard that center was much lower.  What’s interesting, though, is that the width of that bell for a given class stayed pretty much the same.  The curve’s width is described by a number called the standard deviation (SD), proportional to the width at half-height.  If a student asked, “What’s my score?” I could look at the curve for that exam and say there’s a 68% chance that the score was within one SD of the average, and a 95% chance that it was within two SD’s.
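Those 68% and 95% figures aren’t arbitrary; they fall straight out of the normal distribution.  A quick check, using nothing but the standard library’s error function (the helper name is mine):

```python
import math

def within_k_sd(k):
    """Probability a normally-distributed value lands within k SDs of the mean.

    For a standard normal distribution, P(|x| < k*sigma) = erf(k / sqrt(2)).
    """
    return math.erf(k / math.sqrt(2))

print(round(within_k_sd(1), 3))   # one SD:  ~0.683
print(round(within_k_sd(2), 3))   # two SDs: ~0.954
```

So “68% within one SD, 95% within two” is just the bell curve doing what bell curves do, whatever the exam’s average happens to be.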

The same bell-shape also shows up in research situations where a scientist wants to measure some real-world number, be it an asteroid’s weight or elephant gestation time.  He can’t know the true value, so instead he makes many replicate measurements or pays close attention to many pregnant elephants.  He summarizes his results by reporting the average of all the measurements and also the SD calculated from those measurements.  Just as for the exams, there’s a 95% chance that the true value is within two SD’s of the average.  The scientist would say that the SD represents the uncertainty of the measured average.

Which is what Heisenberg’s inequality is about.  He wrote that the product of two paired uncertainties (like position and momentum) must be at least h/4π, where h is that teeny “quantum of action.”  There’s a trade-off.  We can refine our measurement of one variable but we’ll lose precision on the other.  If we plot results for one member of the pair against results for the other, there’s no linkage between their average values.  However, there will be a rectangle in the middle representing the combined uncertainty.

Heisenberg tells us that the minimum area of that rectangle is a constant.

It’s a very small rectangle, area = h/4π = 0.5×10⁻³⁴ joule-sec, but it’s significant on the scale of atoms — and maybe on the scale of the Universe (see next week).

~~ Rich Olcott

A kite floating on the breeze.  Optimal work-life balance.  Smoothly functioning free markets.  The Heisenberg Uncertainty Principle.  Why would an alien from another planet recognize the last one but maybe not the others?

The kite is a physical object, intentionally built by humans to human scale.  The next two are idealized theoretical constructs, goals to be approached but rarely achieved.  The Heisenberg Uncertainty Principle (HUP) is fundamental to how the Universe works.

The first three are each in a dynamic equilibrium that is constantly buffeted by competing forces.  The HUP, by contrast, comes straight out of the deep math that describes where those forces come from.  Kites and work stress and markets may be peculiar to Earth, but the HUP is in play on every planet and star.

In the last post we saw that thanks to the HUP we can precisely identify an oboe’s pitch if it plays forever.  We can know precisely when a pitchless cymbal crashed.  But it’s mathematically impossible to get both exact pitch and exact time for the same sound.  Thank goodness, we can have imprecise knowledge of both quantities and actually play some music.

We determine a pitch (cycles per second) by counting sound waves passing during a given duration — and that limits our knowledge.  We can’t know that a wave has passed unless we see at least two peaks.  Our observation period must be at least long enough to see two peaks.  To put it the other way, the pitch must be high enough to give us at least two peaks during the time we’re watching.  This isn’t quantum mechanics, it’s just arithmetic, but it’s basic to physics.

Mathematically the HUP is as simple as Einstein’s E=mc² equation, except the HUP is an inequality:

[A-uncertainty] x [B-uncertainty] ≥ h / 4π

where A and B are two paired quantities like pitch and duration.

(That h is Planck’s constant, “the quantum of action,” 6.6×10⁻³⁴ joule-sec.  That’s a very small number indeed but it shows up everywhere in quantum physics.  To put h in scale, one gram of TNT packs 4184 joules of explosive energy.  TNT has a detonation velocity of 6900 meters/sec and density of 1.60 gram/cm³, so we can figure a 1-gram cube of the stuff would burn for 1.2 microseconds and generate a total action of about 5×10⁻³ joule-sec.  Divide that by Avogadro’s number to get that one molecule of TNT is good for 10⁻²⁶ joule-sec.  That’s about 10 million times h.  So, yeah, h is small.)
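The TNT arithmetic above is easy to lose track of, so here’s the same back-of-envelope chain in Python (variable names are mine; the figures are the ones quoted in the parenthetical, including its shortcut of dividing a 1-gram sample’s action by Avogadro’s number):

```python
ENERGY   = 4184.0     # joules of explosive energy per gram of TNT
VELOCITY = 6900.0     # detonation velocity, m/s
DENSITY  = 1.60       # g/cm³
AVOGADRO = 6.022e23
H        = 6.6e-34    # Planck's constant, joule·s

# Side of a 1-gram cube: volume is 1/density cm³; convert cm to m.
side_m    = (1.0 / DENSITY) ** (1 / 3) / 100.0   # ~0.0086 m
burn_time = side_m / VELOCITY                    # ~1.2 microseconds
action    = ENERGY * burn_time                   # ~5e-3 joule·s

per_molecule = action / AVOGADRO                 # ~1e-26 joule·s
print(burn_time, action, per_molecule / H)       # ratio ~ 10 million
```

The final ratio comes out around 1.3×10⁷, confirming the “about 10 million times h” figure.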

Back to the HUP inequality.  A and B are our paired quantities.  The standard examples that everyone’s heard of are position and momentum, as in the old physicist joke, “I haven’t a clue where I’m going, but I know how fast I’m getting there.”  For things that are tied to a central attractor like an atomic nucleus, A and B would be angular position and angular momentum.  If you’re into solid-state physics you may have run into another example — the number of electrons in a superconducting current is paired with a metric that reflects the degree of order in the conducting medium.  One more pair is energy and time, but that’s a story for another week.

But what’s in the HUP inequality isn’t A and B, but rather our uncertainty about each.  A billiard ball might be on the lip of the near cup or it can be all the way across the table — HUP won’t care.  What’s important to HUP is whether the ball is here plus/minus one inch, or here plus/minus a millionth of an inch.  Similarly, HUP doesn’t care how fast the ball is going, but it does care whether the speed is plus/minus one inch per second or plus/minus one millionth of an inch per second.  HUP tells us that we can know one of the pair precisely and the other not at all, or that we can know both imprecisely.  Furthermore, even the imprecision has a limit.

We can’t simultaneously know both A and B more precisely than that little teeny h, but some physicists believe h may have been big enough to launch our Universe.

Next week — HUP, two, three, four

~~ Rich Olcott

# Don’t blame Heisenberg

There was the time I discovered that a chemical compound I’d made was destroyed by the light of the spectrometer I was using to study it.  The NYT just ran an article about how biologists have a new-tech problem studying animals in the field because a camera drone can scare the critters away (or provoke an attack).  A teacher can’t shut down an ongoing bullying campaign because student chatter stops when they see him coming.  What’s the common thread in these situations?

You probably thought “Heisenberg,” but please don’t dis the poor guy for them.  You may have seen the for-real Heisenberg Uncertainty Principle in action, but only if you’re a physicist or a music-reading percussionist.  Rather, the incidents in the first paragraph are all examples of the Observer Effect, which is completely separate from the work of Werner H.

The confusion arises because the Observer Effect is often used in classroom explanations of the Heisenberg Uncertainty Principle (the HUP).  The Observer Effect could well apply pretty much anywhere there’s an observer and an observee (see photo), which is why research psychologists and police interrogators use one-way mirrors.

By contrast, the HUP is in play in only a few circumstances, chiefly audio and physics labs.  The key is that word uncertainty, because the HUP is all about the limits of our knowledge.  It says that there are certain pairs of quantities where we must trade off knowledge of one against knowledge of the other.  The more precisely we know the value of one, the more uncertain we are about the other one’s value.

Let’s start with sound.  Did you know that sheet music for a drummer doesn’t really use a “proper” staff with keys and all?  Oh, sure, they use a staff, sort of, but the “notes” indicate strokes rather than tones.  Here’s one variant of many notations out there.

Suppose an oboist plays a tone for you, that nice, long “A” that the orchestra tunes to.  (It’s generally the oboe playing that note, by the way, for two reasons.  First, the oboe uses very little air to produce its sound, so the oboist can hold that note much longer than a flautist or trumpeter could.  More important, though, is that the oboe simply isn’t adjustable — everyone else perforce has to re-tune to match up.)  The primary component of that “A” sound should be a wave of 440 cycles per second.

Now suppose the oboist plays that “A” in shorter and shorter bursts — half-note, quarter-note, etc., down to where all that comes out is a blip.  His fingering and embouchure don’t change, so he’s still playing an “A.” However, when the emitted sound wave is very short we can no longer identify the pitch because there aren’t enough cycles there.  We need at least 2 cycles in a known time period to be able to say how many cycles per second the tone has.

Now the oboist switches up an octave (880 cycles per second) with the same burst length.  That gives us twice as many cycles in the blip and we can identify the new pitch.  However, if he cuts the note’s length in half once more, then again we don’t have enough cycles to count.  The shorter the note, the more precisely we know when it sounded, but the less precisely we know what note it was.
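The two-peak rule of thumb from the last few paragraphs is plain arithmetic, so it can be sketched in a few lines of Python (the function names and the particular burst lengths are mine, chosen to mirror the oboist’s experiment):

```python
def cycles_in_burst(freq_hz, duration_s):
    """Number of full wave cycles that fit in a burst of sound."""
    return freq_hz * duration_s

def pitch_identifiable(freq_hz, duration_s, min_cycles=2):
    """The text's rule of thumb: at least two full cycles to name a pitch."""
    return cycles_in_burst(freq_hz, duration_s) >= min_cycles

print(pitch_identifiable(440, 0.005))     # long-ish "A": 2.2 cycles -> True
print(pitch_identifiable(440, 0.0025))    # half-length blip: 1.1 cycles -> False
print(pitch_identifiable(880, 0.0025))    # octave up, same blip: 2.2 -> True
print(pitch_identifiable(880, 0.00125))   # halved once more: 1.1 -> False
```

Shortening the burst or lowering the pitch trades away frequency knowledge for timing knowledge, which is the whole pitch-versus-duration bargain in miniature.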

A cymbal crash is basically the limiting case.  It has no distinct pitch (or the physicist would say it has a huge number of pitches that all die away after a few cycles).  Rather than tell the percussionist to play an unidentifiably short note, the composer says, “T’heck with it!” and writes an “X” somewhere on the staff.

And vice-versa — at the start of the oboist’s note the sound contained a mixture of other frequencies.  The interlopers eventually died out as the note proceeded.  There will be another mixing when the oboist runs out of breath.  We can only have a really pure tone if the note never starts and never ends — the poor oboist plays that one note forever.

Thanks to Heisenberg, we can be confident that even Bach’s well-tempered clavier was imprecise.

Next week — more fun with Heisenberg.

~~ Rich Olcott