A friend is convinced shark attacks occur a lot more often than in the past.
Maybe the cause is global warming. Or a decline in food sources. Or changes in migratory patterns. While unsure of the reason, he's convinced it's not safe to go in the water. And that someone, somewhere, needs to get their butts in gear and do something about those (darned) sharks.
So for fun I sent him a report from University of Florida researchers showing that 2017 was an "average" year for shark attacks, with 88 unprovoked attacks and 5 fatalities.
His response? Not only did he not believe it, he got mad. "That report is a bunch of bull---," he said. "How dare they manipulate the data to suit their own ends?" (He's a good guy, but recognizing irony is not his strong suit.)
Not long after, he sent me news of a man who was killed in a shark attack off Cape Cod.
"See?!" he said. "Now who's right?!"
In one way, he is right. Shark attacks do happen. Sharks kill people every year. But that doesn't mean the rate of shark attacks is increasing.
So why does he think so? Part of the problem lies in our access to information. When something happens--especially something we care about or focus on--we know. My friend actively looks for news about sharks. He knows whenever an attack has occurred.
Which makes confirmation bias even more prevalent.
Confirmation bias is our tendency to look for and favor data that backs up what we already believe--and to avoid or look poorly on data that goes against what we already believe.
If I think people love my new book, I'll pay close attention to great feedback...and ignore any negative feedback. If he thinks shark attacks are up, he'll look for reports of attacks...and ignore historical data that disproves his theory.
Confirmation bias starts with forming a hypothesis--shark attacks are up, people love my book--and then seeking out data to support that hypothesis. The more strongly you feel about your hypothesis, the more likely you are to fall prey to confirmation bias.
That's why my friend won't go in the water, even though his chances of getting killed by a shark are around 1 in 264 million. (Statistically, he's more likely to be struck by lightning or get injured by a toilet.)
Why does feeling strongly about a hypothesis make us more likely to fall prey to confirmation bias?
For one thing, confirmation bias makes us feel smart. If I think doing 100,000 pushups in a year is a good idea and then I see an incredibly fit guy or gal doing pushups, that confirms my hypothesis--even though there are countless ways to get stronger and fitter. If I believe in pushups and I see someone fit doing pushups...boom: I'm right.
That makes me feel smart.
And that also makes me feel good.
In their book, Denying to the Grave: Why We Ignore the Facts That Will Save Us, Jack and Sara Gorman describe research that suggests we get a rush of dopamine--the neurotransmitter that makes us feel good--when we find information that supports a belief.
When my friend hears about a shark attack, he gets a rush of dopamine. In a perverse way, it feels good, both intellectually and physically.
It's almost like he can't help it. It's almost like we can't help it. Our egos make us want to think we're right...and our bodies want to feel we're right.
Which makes confirmation bias really hard to avoid, since, as the Gormans write, "It feels good to 'stick to our guns' even if we are wrong."
That's why, if you want to make smarter decisions, you shouldn't look for ways to prove to yourself that you're right. Don't look for reasons to stick to your guns.
Realize that your mind--and your body--will sometimes betray you.
Look at all the data. Then make a decision.
And then be willing to revisit that decision when you get new data.
We all want to be right--and sometimes the easiest way is to stop getting at least a few things wrong.