Not much going on today. I’m dealing myself a hand of solitaire when I hear a familiar fizzing sound. “Hello, Anne. Good to see you again.”
She’s freshened up that white satin outfit and is looking very good. “Hello, Sy. Busy?”
“Not so’s you’d notice it. What can I do for you?”
“Can’t a girl just drop in when she wants to visit? Playing with real cards, I see. That’s good, but your tens and treys are frozen.”
“That’s the way the odds break sometimes. The elephant’s in the room.”
“Entropy again? What’s it look like this time?”
“These cards and surprise. How surprised would you be if I were to draw a queen from the stock pile?”
“No queens showing, so somewhat surprised but not very surprised.”
“You know me, I’m a physicist, we put numbers to things. So put numbers to the situation.”
<sigh> “OK, there are 52 cards in the deck and you’ve got … 28 cards in that triangle, so there are 24 left in the stock. Four of them have to be queens. Four out of 24 is one out of 6.”
“Or 17%. And the odds for the queen of hearts?”
“I’m here so it’s 100% until I leave. Oh, I know, you’re talking about the cards. One in 24 or 4%. So I’d be four times as surprised at seeing the heart queen as I would at seeing any of them. Pooh.”
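Worked out explicitly, assuming (since no queens are showing) that all four queens sit somewhere in the 24-card stock:

\[
P(\text{any queen}) = \frac{4}{24} = \frac{1}{6} \approx 17\%,
\qquad
P(\text{queen of hearts}) = \frac{1}{24} \approx 4\%
\]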
“Now how about the odds of drawing all four queens?”
“Four in 24, times three in 23, times two in 22, times one in 21. Whatever, it’s a very small number and I’d be very surprised.”
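Spelled out the same way, for drawing the four queens one after another from that 24-card stock:

\[
P(\text{all four queens}) = \frac{4}{24}\cdot\frac{3}{23}\cdot\frac{2}{22}\cdot\frac{1}{21} = \frac{1}{10\,626} \approx 0.009\%
\]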
“Well, here’s where we get another look at the elephant. There’s a definition of entropy that links directly to those percentages AND can handle extremely small ones. What do you know about logarithms?”
“A little. I read your last series of posts.”
“Wonderful, that simplifies things. Let’s start with a strange dissociation thought up by Claude Shannon, to whom we owe the entire field of information theory. His crucial insight was that he had to distinguish between information and meaning.”
“How can they be different? If I say ‘green’ that means, well, green.”
“It’s all about context. If you’re telling me what color something is, saying ‘green’ tells me that the thing isn’t white or red or any of the other, umm, nine colors I know the names of. But if you call someone ‘green,’ you’re telling me they’re really inexperienced and I know not to trust them with a complicated task that has to be done right the first time. From Shannon’s point of view, the information is the signal ‘green,’ and the meaning is set by the context.”
“You’re going somewhere with this, I suppose?”
“Mm-hm. In Shannon’s theory, the more surprising the message is, the more information it contains. Remember when you told me that in one of your alternate realities you’d seen me wearing a green shirt? That was a surprise and it told me you’d visited an unusual reality, because I rarely wear green. If you’d told me the shirt was black or grey, that would have been much less surprising and much less informative. Shannon’s trick was in putting numbers to that.”
“You’re just dragging this out, aren’t you?”
“No-no, only two more steps to the elephant. First step is that Shannon defined a particular signal’s information content to be proportional to the negative of the logarithm of its probability. Suppose I’m maybe 1% likely to wear green but equally likely to wear each of the other 11 colors. Each of those colors has a 9% probability. log10(1%) is –2.00, so the information content is 2.00, but –log10(9%) is only about 1.05. By Shannon’s definition, when you said ‘green’ in this context you gave me nearly double the information of any other color name.”
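In symbols, with the base-10 logarithms used in this example:

\[
I(\text{green}) = -\log_{10}(0.01) = 2.00,
\qquad
I(\text{other color}) = -\log_{10}(0.09) \approx 1.05
\]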
“Why’d you use base-10 logarithms?”
“Convenience. It’s easy to figure log10(1%). Information scientists tend to use base-2, physicists go for base-e. Final step — Shannon took the information content of each possible signal, multiplied it by the probability of that signal, added those products together and called it the signal system’s information entropy. For our colors it’d be (0.01×2.00)+(11×0.09×1.05), which comes to about 1.06. Regardez, voici l’éléphant!”
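Here’s that sum as a minimal Python sketch; the function name shannon_entropy and the twelve-color probability list are just this conversation’s illustration, not anything of Shannon’s:

    import math

    def shannon_entropy(probabilities, base=10):
        """Sum of p times (-log p) over every possible signal."""
        return sum(-p * math.log(p, base) for p in probabilities if p > 0)

    # 'green' at 1%, each of the other eleven colors at 9%
    colors = [0.01] + [0.09] * 11
    print(shannon_entropy(colors))           # about 1.06 in base-10 units (hartleys)
    print(shannon_entropy(colors, base=2))   # about 3.51 in base-2 units (bits)

Changing the base only rescales the answer; the comparison between signal systems comes out the same either way.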
“Ooo, French!”
“Aimeriez-vous un croissant et un café? My treat at Al’s.”
~~ Rich Olcott