We all suffer from varying amounts of ignorance. The standard approach to moving forward is to learn, both from theory and, more practically, from your mistakes. What entrepreneur can expect a business to grow without taking some lumps and getting smarter from the experience?
The problem is actually learning from those mistakes. It turns out that in addition to the Dunning-Kruger effect, which shows that people are bad at realizing what they do and don't know, there's another shared cognitive impairment: hindsight bias. According to Nick Chater, a professor of behavioral science at Warwick University in the U.K. who appeared on the BBC Radio program The Human Zoo, we all suffer from an inclination toward revisionist intellectual history.
Under hindsight bias, we ascribe to ourselves an ability to forecast the future far beyond our natural gifts of divination. We look back at the past and, no matter what we thought of things at the time, suddenly, after the fact, we're sure we saw it coming. A perfect example is the legion of talking heads who mutter about politics in the run-up to an election. After the public has rendered its collective decision and the outcome is different from what the wizened visages knew would happen, these same floating noggins suddenly talk as though they saw it coming all along.
"One of the things wrong with it is it makes us misunderstand the past," said Chater, "and perhaps, thereby, misunderstand the way the future will work." He referred to a speech given by a British politician that was held up, after an election, as a clear precipitant of disaster. However, when the speech occurred, there was relatively little harsh criticism. "Of course, if we misunderstand the past, that's going to change where we put our emphasis in thinking about future political campaigns, for example," Chater said.
To put it a bit differently, we look at the outcome of events and then switch our allegiance to arguments that seem to explain the results. But every outcome is a matter of probability. Short of dropping a hammer in a positive gravity field in the middle of an isolated place with barriers to prevent any interference, certainty is almost never in the cards.
Peter Ayton, a professor of psychology at the City University of London, said another name for this is the "I knew it all along" effect, or creeping determinism. There's a lack of surprise--the sort of sage nodding you'll see among many in business, government, and media, looking back on something that happened. "Once you see what happened, you thought you kind of knew what was going to happen," Ayton said. "It's very resistant to elimination because, you know, you're unable to see how likely it would have been without knowing what the outcome was."
Researchers Neal Roese of the Kellogg School of Management at Northwestern University and Kathleen Vohs of the Carlson School of Management at the University of Minnesota reviewed the literature on the subject and found there are three issues thought to be at the root of hindsight bias:
- We forget what we felt and fall prey to memory distortion.
- We believe in the inevitability of events. We become ancient Greeks and look to the three Fates as having decided everything in advance.
- We also believe in foreseeability, that we're capable of predicting the future.
The danger of hindsight bias is that, if you think you already knew what was going to happen, there's nothing left to learn. The opportunity to profit from mistakes goes sailing out the window.
So how do you get around hindsight bias and open the possibility of learning from the past and getting smarter in the future? Roese and Vohs suggest actively considering the opposite--"consider and explain how outcomes that didn't happen could have happened." By considering information we would have thrown out because it didn't conform to the outcome, we gain a better view of the past.