I once worked at a plant where a sinkhole formed in the parking lot. We decided to bring in a geologist to test for sub-surface issues under the building and recommend fixes. 

In the middle of the geologist's highly technical presentation that laid out his findings, our CEO jumped up and scrawled a few lines on the map, nodded decisively, and said, "There. Clearly that's all we'll need to fix."

When the geologist started to respond, the CEO said, "Thanks for your time." He nodded dismissively towards the door and turned to face the rest of us.

"Next topic," he barked.

Now, our CEO was a smart guy. He knew a lot about a lot of things. Yet he knew nothing about sinkholes or geology, nothing about the chemical dissolution of carbonate rock or the suffosion processes that cause the ground to give way. What he did know was that he didn't want to pay to fix a massive sinkhole problem, so clearly the problem must be minor.

He was motivated to come to a certain conclusion, and smart enough to think highly of his opinion.

Which makes sense. Smart people are naturally better at constructing convincing arguments that support things they believe--or sometimes even just want--to be true. Smart people are better at "gist reasoning," using intuition formed by experience to cut through a haze of complicated details to get to the heart of a matter. 

Smart people are more confident in their judgments--are more certain they're right--simply because they are smart. (Even though Jeff Bezos says a sign of high intelligence is a willingness to change your mind. A lot.)

So yeah: Because our CEO was motivated to land on a certain answer, he favored information that confirmed what he already believed and ignored data that did not. (Say hi to confirmation bias.)

In short, he was too smart to question his decision.

He's not alone. A 2018 study showed that people with high SAT scores tend to be much less likely to analyze and learn from their mistakes. And the stakes can be high: in hospitals, diagnostic errors have been found to contribute to approximately 10 percent of patient deaths and up to 17 percent of all harmful events.

Research detailed in David Robson's The Intelligence Trap shows that the smarter people are, the more likely they are to be unaware of their own flawed thinking.

In their defense, smart people come by all this naturally. In school, most tests have time limits, especially standardized tests.

Since speed clearly matters, the faster you are, the smarter you must be--even though few real-life situations require extremely rapid decisions. What matters is finding the right answer, not how long it takes to get there.

In business, people who can make quick decisions are assumed to be better leaders. Who make bold decisions. Who stick to their guns, even when--especially when--others doubt them.  

That's what makes them great.

But not always.

It's easy to assume people who think fast are smarter than those who think more deliberately, no matter how deeply the "slow" thinkers actually think.

Still, even though making quick decisions can sometimes be useful in the short term, what matters more is making thoughtful and wise decisions that have long-term impacts.

How to Be Smart, and Wise

Some people manage to overcome the triple threat of motivated reasoning, confirmation bias, and natural overconfidence. They avoid being too smart for their own good and harness their intelligence to make genuinely smart decisions.

For example, Jeff Bezos doesn't spend a lot of time weighing the pros and cons of easily reversible decisions. Oprah Winfrey swears by deciding which bridges to cross and which to burn. Steve Jobs said deciding when to trust yourself will make all the difference in your life.

Other approaches? One is to pretend another person is in the situation you face. Stanford researchers found that simply encouraging yourself to view a situation from a detached perspective improves outcomes. Just pretend you need to walk someone else through the reasoning behind a decision.

Not the decision itself. The reasoning.

Another is to take a moment to question your instincts. Say, "I think this is right...but why do I think it's right?" In a 2005 Journal of Internal Medicine study, doctors given challenging clinical scenarios arrived at a correct diagnosis only two-thirds of the time.

But as Robson's book shows, doctors who were asked to analyze their initial diagnosis and consider alternative possibilities found their diagnostic accuracy improved by as much as 40 percent.

And possibly the best approach of all: Listen before you speak, think before you speak, and seek input from people in a better position to know--inside or outside your organization.

Because holding a position of authority does not automatically confer wisdom.

But being willing to admit you may not have the right answer--at least not yet--is a great first step in that direction.