In experiments where subjects are asked to pick an answer they are 99% confident in, they turn out to be wrong 25% or more of the time. Overconfidence like this undermines effective decision-making and has real-world consequences. Studies have shown, for instance, that when doctors say they are "completely confident" in a diagnosis, they are wrong about 40% of the time.
It is part of human nature to be overconfident, because believing that we understand what's going on gives us a feeling of security. Being confident comforts us with the (usually incorrect) conviction that we are in control of our own environment.
According to Chip and Dan Heath, in their book Decisive: How to Make Better Choices in Life and Work, overconfidence ranks as one of the four principal "villains" of good decision-making, along with narrow framing, short-term emotion, and the confirmation bias.
Overconfidence can be compensated for simply by imagining realistic future scenarios in which your decision turns out to have been wrong. Imagine, for instance, that a year or more after your decision you look back and realize that your plans have been completely undone, and you're now kicking yourself for having made such a foolish choice. Or imagine that the reverse happens - that you are unexpectedly elated with the decision, and perhaps wish you had gone even further with it. Now try to write the stories behind these two futures - how would each one have come about?
Narrow framing occurs whenever you box yourself into a set of options or possibilities that is unnecessarily limited. Should our company buy this other firm or not? Should I attend university out of state or in state? Should our family book a holiday trip or stay at home? The most immediate cure for narrow framing is to widen the set of options being considered - to change the parameters of the problem itself. Figure out how to state the problem in a larger context. Rather than asking "Should we buy a new car or not?" you could ask yourself "What's the best way right now to spend this amount of money making my family better off?"
Short-term emotion, a third villain, can steer any decision in a terribly wrong direction because of the immediacy of the reward - the "heat of the moment." The only real solution is to step back from the emotion of the moment and consider the decision from a distance. One sure-fire way to do this, in order to take a longer-term view, is what Suzy Welch has called the "10-10-10" exercise. How will the decision you're about to make feel in 10 minutes? How will it feel 10 months from now? And in 10 years, what will you remember about it, and what will matter most after a decade has elapsed?
Confirmation bias, a well-known affliction in all human deliberation, is the fourth villain, according to the Heaths. It is our natural tendency to look for evidence that confirms our existing beliefs, rather than looking specifically for evidence that might disconfirm them or indicate that our thinking is wrong. Decision-making analysts consider the confirmation bias to be one of the most difficult human thinking flaws to deal with, and it afflicts even very smart, highly sophisticated thinkers. We are wired as social animals, and much of our analytical thinking likely evolved to help us persuade others to do what we want them to do. So when we think we are analyzing a problem objectively, what our brains are actually doing is searching for ways to buttress the argument for our desired conclusion. And in the end, we are not just trying to persuade others; we are also persuading ourselves.
The only reliable cure for the confirmation bias is having the self-discipline to think more like a scientist. The scientific method is based on the principle of "falsifiability." A theory about how the world works can only be considered truly scientific if there are some conditions or events that would prove it to be false. Falsifiability is what separates scientific reasoning from religious belief, superstition, or prejudice.
As recently as the 1500s, many believed that heavy objects fell faster than lighter ones. Galileo showed this was false - reputedly by dropping two spheres of different weights from the Leaning Tower of Pisa and observing that they hit the ground simultaneously. Without air resistance, even a feather falls at the same rate, as Apollo 15 astronaut David Scott famously demonstrated on the surface of the moon, where there is gravity but no air.
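The physics behind this observation fits in one line: the gravitational force on an object is proportional to its mass, so the mass cancels when computing the acceleration, and every object falls at the same rate. A minimal sketch of the cancellation:

```latex
% Force of gravity on an object of mass m, with gravitational acceleration g:
F = mg
% Newton's second law gives the object's acceleration; the mass m cancels:
a = \frac{F}{m} = \frac{mg}{m} = g
% So a is the same for the sphere and the feather
% (about 9.8 m/s^2 on Earth, about 1.6 m/s^2 on the moon).
```

On Earth, air resistance masks this for light objects like feathers, which is why the moon made such a clean stage for Scott's demonstration.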
In practice, overcoming the confirmation bias means avoiding coming to any conscious conclusion prematurely. With the increasingly prevalent practice of "evidence-based medicine," for instance, doctors are encouraged to try not to come to any conclusion at all about a patient's diagnosis before objectively reviewing the evidence itself, including any literature or relevant research.
In a group or a corporate setting, one effective way to overcome the confirmation bias is to assign different group members to argue conflicting points of view. Warren Buffett's practice, when considering a merger or some other deal, has sometimes involved hiring two different sets of bankers: one set earns a high fee if the deal is consummated, while the other earns a high fee if it is not.
If you want to make better decisions, you could do worse than focusing on these four afflictions - overconfidence, narrow framing, short-term emotion, and the confirmation bias - and taking the steps necessary to minimize them.