Have you ever had an argument with a friend, relative, or co-worker who remains certain of their position, even after you provide irrefutable proof that you're right and they're wrong? It's frustrating, but common. It turns out the human brain is wired that way.

An outraged friend once showed me a picture of Gordo, one of the first monkeys in space, along with a brief article explaining that after his flight into orbit in 1958, his capsule dropped into the ocean and sank, as NASA had designed it to do. I was immediately skeptical. Even if NASA didn't care about the welfare of its experimental primate astronauts (and there's plenty of evidence it didn't), both the capsule and Gordo's body would have been valuable resources.

After just a few minutes of internet research, I presented my friend with an article reporting what really happened--Gordo's loss was caused by a parachute malfunction, and NASA searched for his capsule for six hours before giving up. But despite these press accounts and what I thought was obvious logic, my friend remained unconvinced. She still thought NASA might have drowned Gordo on purpose.

It turns out there's a scientific explanation for this--or actually several, as a fascinating New Yorker article explains. Experiments over the years have shown again and again that once we form an opinion, it's difficult for us to change it, even after learning that the information we relied on was false.

Then there's confirmation bias, the human tendency to lend more weight to information that supports what we already believe, and less weight to information that contradicts it. Confirmation bias is so hard-wired into us that we may actually get a rush of dopamine (a neurotransmitter associated with pleasure) when we encounter information that confirms what we already believe.

Then there's something else--something that evolved as part of our very survival. Experiments have shown that we are eager to hold the same opinions as other members of our social group. This is almost certainly because throughout our history as hunter-gatherers (and still today), agreeing with our social group and being wrong has often been safer than disagreeing and being right.

Taken together, these facts make it obvious why humans are less logical than we like to believe, and more likely to make decisions and form opinions for irrational reasons. But is there anything we can do about it?

The answer is--maybe. Although we'll always remain profoundly irrational creatures, we can at least try to counteract our tendency to keep believing what we already believe, or to let our friends' opinions dictate our own thinking.

When evaluating new information or trying to form an opinion, ask yourself these questions. Or ask them of someone who disagrees with you to see if one of you can change the other's mind.

1. Does this agree with something I already believe?

If yes, watch out for confirmation bias and that sneaky dopamine rush. It won't be fun--in fact it'll be hell--but you should probably give more weight to data that contradicts what you think you know, and less to data that seems to support it.

2. Does it agree with the opinions in my social group (or someone I admire)?

If so, that's another good reason to be somewhat skeptical. I myself have mindlessly adopted all kinds of opinions on topics from gun control to abortion because they fit with what the people around me believed, or with the views of people I generally agreed with. If everyone around you believes something--anything--the pressure on you to believe it as well will be very strong. See if you can resist that pressure and form your own opinion.

3. How much do I really know about this topic?

Most of us think we know more than we do. Researchers at Yale demonstrated this point by asking graduate students to write detailed explanations of exactly how everyday objects such as zippers and toilets actually work--and most discovered they couldn't. It's worth making time for more study, not about toilets and zippers, but about the things we hold definite opinions on, such as Obamacare and the stock market.

Experts observe that the more people know about something, the less likely they are to have strong opinions about it. In one survey conducted after Russia annexed Crimea, which had previously been part of Ukraine, Americans were asked how the U.S. should respond. Those most in favor of military action were also the least likely to be able to locate Ukraine on an unmarked map.

4. Can I explain myself?

This is often a very good way to test the validity of your strongest opinions. In a 2012 study, people were asked questions about political proposals such as a single-payer health care system. Once they'd expressed their opinions, they were asked to explain in as much detail as they could what the effects would be if the proposal were implemented. Most were forced to realize that they didn't fully understand--and their opinions became less firm as a result.

Next time you find yourself locking horns with a friend or family member over political or other issues, try asking them for a detailed explanation, or else ask yourself for one. It might not be enough to change anyone's mind. But you never know.