This post was originally an email sent to the Better Questions Email List. For more like it, please sign up – it’s free.

—
How scared should I be?
—

For the past few weeks we’ve been discussing risk and uncertainty.

Risk is a situation in which we know the odds, know the payoffs, and can plan accordingly.

Uncertainty is a situation where we don’t know the odds, or are unsure about the payoffs.

Nearly every difficult decision falls into one of these two categories.

Understanding the difference is critical to making wise decisions…

But we screw it up.

All the time.

In fact, we make this error so often that it has a special name:

The Illusion of Certainty.

The illusion of certainty comes in two flavors:

The zero-risk illusion (where risk is mistaken for certainty)…

And the calculable-risk illusion (where uncertainty is mistaken for risk).

These are the two primary ways our maps stop reflecting the territory…

The engine of our mistakes.

Let’s start with the zero-risk illusion.

We encounter the zero-risk illusion whenever we enter a risk situation (with calculable odds of winning or losing)…

But believe that we know for sure what will happen.

A simple example:

You come across a man on the street. He waves you down.

He holds a single playing card in his hand. He shows it to you – an Ace of Spades.

“I’m running a memory experiment,” he explains.

He turns the card face-down in his palm.

“I’ll give you $50,” he says,

“if you can name the card in my hand.”

You ponder this.

He showed you the card; you can easily recall it was the Ace of Spades.

Winning the $50 seems like a sure thing.

You tell the man your guess.

He turns the card over, revealing… a Joker.

What happened?

Well, as it turns out, the man is a magician, and you’re on one of those TV shows where people embarrass the overly credulous.

Instead of $50, your co-workers make fun of you for several weeks as your beet-red face is beamed across the country for all to see.

This was a situation of risk that was mistaken for certainty.

You didn’t know he was a magician, and so assumed the card in his palm would be the card he showed you.

You fell victim to the zero-risk illusion.

That might seem a bit far-fetched, though, so let’s examine another scenario where this illusion occurs.

You go for your annual checkup and get the usual series of blood tests.

Your doctor enters the room carrying a clipboard. She looks very concerned. She stares at you and sighs.

“I’m sorry,” she says, “but you have Barrett’s Syndrome. It’s an incredibly rare condition characterized by having a gigantic brain and devastatingly high levels of attractiveness…

…There is no known cure.”

Is the room spinning? you think. Your skin feels flushed.

“What’s the prognosis, doc?”

She looks you right in the eyes. She appears both empathetic and strong. She’s good at this, you think.

“The average life expectancy of someone with Barrett’s Syndrome is…
…8 months.”

Some of you may remember this scene from an earlier email (titled “8 Months to Live”) in which we discussed the importance of individual vs. group indexing.

For the moment, though, let’s forget that discussion.

What if you had asked a different question?

Let’s go in a new direction this time.

We rejoin our scene, already in progress:

“The average life expectancy of someone with Barrett’s Syndrome is…
…8 months.”

You pause. You furrow your brow.

“Not good, doc. Can I ask – how sure are we? How good is this test?”

The doctor nods, as if she understands why you would ask, but that look of sympathy-slash-pity hasn’t left her face.

“I’m sorry, I know you’re looking for some hope right now…but the test is extremely accurate. It detects the disease nearly 100% of the time when it’s present, and the false positive rate is only 5 in 100,000.”

“Hmmmm. Doesn’t sound good for me, I guess. Let me ask you, doc – exactly how many people have this syndrome? Maybe I can join a support group.”

“The disease is very rare. Only 0.01% of the population – about 10 in every 100,000 people – has Barrett’s Syndrome.”

She clearly expects you to be resigned to your fate.

Instead, you are… smiling?

How scared should you be?

We place so much trust in technology that it can lead us straight into the zero-risk illusion.

Because we believe medical science to be accurate, receiving a positive test result for something as devastating as Barrett’s Syndrome can cause extreme anxiety.

But let’s think about those numbers a bit.

Let’s start with a random selection of 100,000 people.

Based on what we know about Barrett’s Syndrome, how many people in this population should have the disease?

Based on that 0.01% figure, we’d expect ten people in a population of 100,000 to have Barrett’s Syndrome.

Because the test detects the disease nearly 100% of the time, we’d expect all ten of those people to test positive.

Our false positive rate was 5 out of 100,000, which means that in our group of 100,000 we should also expect 5 people who don’t have the disease to test positive.

That means we have 10 real positives… and 5 false ones.

So if you test positive for Barrett’s Syndrome, the odds that you actually have the disease are 2-to-1 – about a 67% chance.

Not great, but not certain, either.
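If you’d like to see that arithmetic spelled out, here’s a minimal Python sketch using the story’s made-up numbers (the names and structure are mine, purely for illustration):

```python
# Base-rate arithmetic for the (fictional) Barrett's Syndrome test.
# Every number comes from the story above; none of this is real medical data.

population = 100_000
true_cases = 10        # 0.01% prevalence: 10 people per 100,000
false_positives = 5    # false positive rate: 5 per 100,000
sensitivity = 1.0      # the test catches ~100% of real cases

true_positives = true_cases * sensitivity  # all 10 sick people test positive

# Chance you actually have the disease, given a positive result:
p_sick_given_positive = true_positives / (true_positives + false_positives)

print(f"Positive tests per {population:,} people: {true_positives + false_positives:.0f}")
print(f"P(sick | positive test) = {p_sick_given_positive:.0%}")  # 67% – 2-to-1 odds
```

Even a “nearly 100% accurate” test leaves a one-in-three chance that you’re perfectly fine.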

This scenario, while hypothetical, plays out every day across the country.

Routine medical screenings with seemingly low false-positive rates produce far more wrong diagnoses than you might expect – simply because of the scale at which they’re administered.

In this situation, we face a real risk – 2-to-1 odds of having Barrett’s Syndrome. But that risk seems like certainty.

The other form of the illusion of certainty is the calculable-risk illusion…and it’s the one which feels most appropriate to our current global situation.

The calculable-risk illusion occurs when we fool ourselves into thinking that we know the odds.

We trick ourselves into believing we know more than we do, that we’re making a rational calculation…

When, in reality, no “rational calculation” is possible.

Donald Rumsfeld put this quite famously:

Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know.

We also know there are known unknowns; that is to say we know there are some things we do not know.

But there are also unknown unknowns—the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.

The calculable-risk illusion occurs when we mistake unknown unknowns for known unknowns.

This – to use the framework we’ve been developing over the past few weeks – is when we need to leave the world of statistical analysis and enter the world of game theory….

To stop playing the game and start playing the players.

Failure to do so can have extremely dire consequences.

My favorite example of this is called the “Turkey Problem,” and comes from Nassim Taleb’s Antifragile, which I’ll quote here:

A turkey is fed for a thousand days by a butcher; every day confirms to its staff of analysts that butchers love turkeys “with increased statistical confidence.”

The butcher will keep feeding the turkey until a few days before Thanksgiving. Then comes that day when it is really not a very good idea to be a turkey.

So with the butcher surprising it, the turkey will have a revision of belief—right when its confidence in the statement that the butcher loves turkeys is maximal and “it is very quiet” and soothingly predictable in the life of the turkey.

“The absence of evidence is not evidence of absence.” The turkey predicts with “increasing certainty” that the butcher will continue to feed it, because it has no evidence it will ever be otherwise.

Just because negative events happen less often does not mean they are less dangerous; in fact, it is usually the opposite. Don’t look at evidence, which is based on the past; look at potential. And don’t just look for “evidence this theory is wrong” – is there any evidence that it’s right?

The turkey is not wrong in saying that its life is quiet and peaceful; after all, all historical evidence tells it this is so.

Its error was mistaking uncertainty for risk – believing it understood the odds.
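To make the turkey’s mistake concrete, here’s a toy sketch (my illustration – Taleb gives no code) of how confidence built from past frequency alone climbs toward certainty right up until the day it fails:

```python
# A toy model of the turkey problem (my illustration, not Taleb's code).
# The turkey estimates tomorrow's odds of being fed purely from past
# frequency, here via Laplace's rule of succession: (n + 1) / (n + 2)
# after n consecutive days of being fed.

def p_fed_tomorrow(days_fed: int) -> float:
    """Estimated probability of being fed tomorrow, given an unbroken streak."""
    return (days_fed + 1) / (days_fed + 2)

for day in (1, 10, 100, 1000):
    print(f"Day {day:>4}: P(fed tomorrow) ≈ {p_fed_tomorrow(day):.3f}")

# Day 1001 is Thanksgiving. The model said ~0.999; the outcome was the axe.
# The statistics were fine – the mistake was treating an uncertain world
# (unknown unknowns) as if it were a calculable risk (known unknowns).
```

The estimate is never more confident than on the morning of day 1,000 – which is exactly when it is most wrong.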

If you’ve seen me rant on Twitter or in my various Live Streams about “epistemic humility…”

This is why.

It is not purely a moral concern:

Humility in the face of a complex universe is a means of survival.

Because:

There are known knowns…

And known unknowns…

But it’s in the Unknown Unknowns…

…that the universe hides its hatchets.
