To Jeff Bezos, there are two basic types of decisions. Type 1 decisions are almost impossible to reverse; he calls them one-way doors. Type 2 decisions are two-way doors. Make a bad Type 2 decision? Painful, sure, but also possible to overcome.  

Even so, there are plenty of ways to make bad decisions. (Trust me: I should be the patron saint of poor judgment.) Obviously that's a problem, since success or failure often comes down to the quality of the decisions you make.

So why do even the smartest people -- and the smartest companies -- sometimes make poor decisions? The authors of a 2020 paper published in Academy of Management Journal (h/t to professor Ethan Mollick) grouped decision-making errors into seven categories.

Including (foreshadowing alert!) our new term for the day: the iatrogenic cascade.

Here we go:

Type 1: Assuming a connection exists.

When Netflix infamously decided to split streaming and DVD rentals into two subscriptions, it assumed customers wouldn't mind since both services were clearly a bargain. According to one account, founder Reed Hastings felt some customers would complain, but that the number would be small, and the anger would quickly fade.

Nope. Netflix lost an estimated 800,000 subscribers, and its stock price fell 77 percent. 

The authors cite another example of a Type 1 error: Justifying the Iraq War on weapons of mass destruction without proof WMDs actually existed.

Think of a Type 1 error as an assumption error: Thinking you have more information than you actually do.

Type 2: Ignoring a connection that does exist.

This one is a little trickier. A Type 2 error occurs when a relationship between two things exists, but is not accounted for. 

One example the authors cite involves cycling, when antidoping authorities assumed performance-enhancing drugs did not influence the outcome of events like the Tour de France. 

Another is New Coke: When Coca-Cola ran extensive taste tests with people who didn't drink Coke, the majority chose Coke over Pepsi. But Coke didn't run extensive taste tests with Coke drinkers -- and it turned out they greatly preferred "old" Coke to New Coke.

The same could happen if Apple tested a new iPhone only with people who had never used a cell phone (if it could find any). Those people might prefer an iPhone to an Android device -- but current iPhone users might hate the new model.

Why do Type 2 errors occur? Cognitive bias. Faulty testing. Poor analysis.

Thinking you already know enough.

Type 3: Solving the wrong problem.

Or as the authors put it, "solving the wrong problem very precisely." Why the distinction? Because the better you make the solution, the more likely you are to believe in your decision to create that solution. 

I could (if I were smart enough) create a groundbreaking device capable of using a photograph to determine the thickness of an orchid petal to within 0.00001 inch. Yay, me -- but who cares?

What causes Type 3 errors? Solving the outcome of a problem, rather than the problem itself. Falling in love with a solution without knowing if there's actually a problem (or if people care about the problem).

Thinking you know more about what people want, or need, than you really do.

Type 4: Solving the right problem the wrong way.

You need great employees. So you hire only people with degrees from Ivy League schools.

Unfortunately, it's fairly easy to choose the wrong solution. That's especially true if you're trying to solve an old problem in a new way; often the only way to truly know whether you made the right decision is to try, and possibly fail.

In that case, the key is to realize it's time to stop doing something that clearly doesn't work.

Type 5: Mistaking action for progress.

"Look, we have to do something" often sounds good -- a bias toward action often separates those who succeed from those who merely dream.

But still: Sometimes doing nothing -- at least until you have enough data to make a reasonably informed decision -- is the better course. That's especially true if the choice you make, and the action you take, can spark a ripple effect. Deciding to pull the trigger on a price increase could potentially increase margins, but could also cause "purchased loyalty" customers to look elsewhere, which could decrease overall revenue, which could put greater pressure on inventory and service capacity...

You get the point. Doing something may feel good, but it doesn't mean you're doing the right thing.

Type 6: Mistaking inaction for patience or wisdom.

On the flip side, most of the time problems don't solve themselves. Nor do they simply go away. 

But they do tend to get worse.

How can you decide whether action or inaction is appropriate? One easy way is to consider what you know, or don't know. Inaction may be the right course when you don't know how to solve a problem. But if you do know how to solve a problem yet hesitate -- usually because of laziness, conflict aversion, or "that's not my job and I shouldn't have to" -- that's a sign that choosing to act is the right decision.

Otherwise you'll become what the authors term a "wishful non-action taker."

Nobody wants that.

And now for the big one.

Type 7: The cascade iatrogenic error.

Type 7 errors are compound errors. In researcher-speak, "cases of cascade iatrogenesis are those in which erroneous action allows forces to interact that were erroneously analyzed, creating more problems than it solves, creating an environment in which original issues morph into larger, and qualitatively different, often irreversible outcomes."

Or more simply put, one bad decision leads to other bad decisions, and a Bezos two-way door now only swings one way.

For example, say I had been in charge when Netflix split subscriptions into streaming and DVD. I could have done what Netflix did and eventually staggered back through the two-way door.

Or I could have solved a problem that didn't exist by creating a pricing scheme for streaming based on hours viewed per unit time. Or I could have decided the debacle was actually a marketing problem and sunk millions into trying to convince customers that what they really wanted -- if they would only (freaking) listen -- was a two-tiered pricing scheme.

Or I could have made a Type 6 error and just sat and waited until millions of customers came to their senses and realized I was right all along.

While I flailed around and compounded the problem, the result could have been catastrophic. More customers could have canceled their subscriptions. Lower revenue could have halted plans to create original content, postponed infrastructure investments to support higher streaming volume, and caused a talent drain through layoffs and resignations. (Because the best employees always have options.)

Would (my) Netflix have recovered from the resulting downward spiral? Maybe.

But maybe not.

How can you avoid a cascade iatrogenic error? Since they typically result from several internal decisions and external forces, a cascade is hard to predict.

Ask Yourself a Few Questions

But you can try to avoid making the first six types of errors. 

Before you make an important decision, ask yourself a few simple questions:

  • "What if I add certain data, information, or input to the mix? How would that change my decision?" Maybe you should ask a few of the employees who will actually implement a process change for their input.  
  • "What if I eliminate certain data, information, or input from consideration? How would that change my decision?" If you later realized the results of a customer survey were flawed, would you make the same decision?
  • "No matter how much I love the solution (or tools, or technology, or just my idea), will enough customers actually pay for it?" A solution people won't want, need, or pay for isn't a solution. It's an expensive hobby.
  • "Am I doing (this) because I feel I have to do something? Or because people expect me to do something?" A bias for action can be the quickest way to make a bad situation worse.
  • "Am I not doing (this) because it's smart, or because it's hard?" Often the right thing to do is also the hardest thing to do -- but doing nothing now only makes it even harder later.

And if those don't help, try this. Say you're about to announce a decision. But instead of falling back on your authority -- because after all, you're the boss and what you say goes -- imagine you have to justify your decision. You have to walk people through your data and analysis.

You can't just dictate; you have to convince, using facts and figures and logic and reasoning.

In that case, would you make the same decision?

If not, don't.

Because you might just be making a decision that leads to cascade iatrogenesis.

And nobody wants that.