Unlikely Optimism: The Conjunctive Events Bias

When certain events need to take place to achieve a desired outcome, we’re overly optimistic that those events will happen. Here’s why we should temper those expectations.

***

Why are we so optimistic in our estimation of the cost and schedule of a project? Why are we so surprised when something inevitably goes wrong? If we want to get better at executing our plans successfully, we need to be aware of how the conjunctive events bias can throw us way off track.

We often overestimate the likelihood of conjunctive events—occurrences that must happen in conjunction with one another. The probability of a series of conjunctive events happening is lower than the probability of any individual event. This is often very hard for us to wrap our heads around. But if we don’t try, we risk seriously underestimating the time, money, and effort required to achieve our goals.

The Most Famous Bank Teller

In Thinking, Fast and Slow, Daniel Kahneman gives a now-classic example of the conjunctive events bias. Students at several major universities received a description of a woman. They were told that Linda is 31, single, intelligent, a philosophy major, and concerned with social justice. Students were then asked to estimate which of the following statements is most likely true:

  • Linda is a bank teller.
  • Linda is a bank teller and is active in the feminist movement.

The majority of students (85% to 95%) chose the latter statement, seeing the conjunctive events (that she is both a bank teller and a feminist activist) as more probable. Two events together seemed more likely than one event alone. It’s perfectly possible that Linda is a feminist bank teller. It’s just not more probable for her to be a feminist bank teller than it is for her to be a bank teller. After all, the first statement does not exclude the possibility of her being a feminist; it just does not mention it.

The logic underlying the Linda example can be summed up as follows: The extension rule in probability theory states that if B is a subset of A, B cannot be more probable than A. Likewise, the probability of A and B occurring together cannot be higher than the probability of either A or B occurring on its own. A broader category is always at least as probable as any of its subsets. It’s more likely that a randomly selected person is a parent than that they are a father. It’s more likely someone has a pet than that they have a cat. It’s more likely someone likes coffee than that they like cappuccinos. And so on.
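The extension rule is easy to check by simulation. Here’s a minimal sketch using the pet/cat example, with made-up rates (60% of people own a pet; 40% of pet owners own a cat) chosen purely for illustration:

```python
import random

random.seed(42)

# Simulate a population. "has a pet" is the broad category;
# "has a cat" is a subset of it, so it can never be more frequent.
n = 100_000
pet_owners = 0
cat_owners = 0
for _ in range(n):
    has_pet = random.random() < 0.6               # assumed: 60% own some pet
    has_cat = has_pet and random.random() < 0.4   # assumed: 40% of pet owners own a cat
    pet_owners += has_pet
    cat_owners += has_cat

p_pet = pet_owners / n
p_cat = cat_owners / n
assert p_cat <= p_pet  # the subset is never more probable than the category
print(f"P(pet) = {p_pet:.3f}, P(cat) = {p_cat:.3f}")
```

However you vary the assumed rates, the subset’s frequency can never exceed the category’s, which is exactly what the students in the Linda problem got wrong.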

It’s not that we always think conjunctive events are more likely. If the second option in the Linda problem were ‘Linda is a bank teller and likes to ski’, maybe we’d all pick just the bank teller option, because we don’t have any information that makes skiing a good fit. The point here is that, given what we know about Linda, we think it’s likely she’s a feminist. Therefore, we are willing to add almost anything to the Linda package if it appears with ‘feminist’. This willingness to build a narrative out of pieces that clearly don’t fit is the real danger of the conjunctive events bias.

“Plans are useless, but planning is indispensable.” 

— Dwight D. Eisenhower

Why the Best-Laid Plans Often Fail

The conjunctive events bias makes us underestimate the effort required to accomplish complex plans. Most plans don’t work out. Things almost always take longer than expected. There are always delays due to dependencies. As Max Bazerman and Don Moore explain in Judgment in Managerial Decision Making, “The overestimation of conjunctive events offers a powerful explanation for the problems that typically occur with projects that require multistage planning. Individuals, businesses, and governments frequently fall victim to the conjunctive events bias in terms of timing and budgets. Home remodeling, new product ventures, and public works projects seldom finish on time.”

Plans don’t work out because completing a sequence of tasks requires a great deal of cooperation among multiple events. A plan can be thought of as a system, and as a system becomes increasingly complex, the chance of failure increases. A change in one component will very likely affect the functionality of other parts of the system. The more components you have, the more chances that something will go wrong in one of them, causing delays, setbacks, and failures in the rest of the system. Even if the chance of any individual component failing is slight, a large number of components increases the overall probability of failure.
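The arithmetic behind this is unforgiving. If a plan has n independent steps and each must succeed, the overall success probability is the product of the individual probabilities. A quick sketch, assuming every step is 95% reliable:

```python
# Overall success probability of a plan whose n independent steps
# must ALL succeed, each with probability p.
def plan_success(p: float, n: int) -> float:
    return p ** n

for n in (1, 5, 10, 20, 50):
    print(f"{n:>2} steps at 95% each -> {plan_success(0.95, n):.1%} overall")
```

Even with each step at 95%, a twenty-step plan succeeds barely a third of the time, and a fifty-step plan almost never does.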

Imagine you’re building a house. Things start off well. The existing structure comes down on schedule. Construction continues and the framing goes up, and you are excited to see the progress. The contractor reassures you that all trades and materials are lined up and ready to go. What is more likely:

  • The building permits get delayed
  • The building permits get delayed and the electrical goes in on schedule

You know a bit about the electrical schedule. You know nothing about the permits. But you optimistically bucket them together, erroneously linking one with the other. So you don’t worry about the building permits and never imagine that their delay will affect the electrical. When the permits do get delayed, you have to pay the electrician for the week he can’t work, and then wait for him to finish another job before he can resume yours.

Thus, the more steps involved in a plan, the greater the chance of failure, especially when we assign probabilities to events that aren’t actually related. That is all the more true as more people get involved, each bringing their own biases and misconceptions about chance.

In Seeking Wisdom: From Darwin to Munger, Peter Bevelin writes:

A project is composed of a series of steps where all must be achieved for success. Each individual step has some probability of failure. We often underestimate the large number of things that may happen in the future or all opportunities for failure that may cause a project to go wrong. Humans make mistakes, equipment fails, technologies don’t work as planned, unrealistic expectations, biases including sunk cost-syndrome, inexperience, wrong incentives, changing requirements, random events, ignoring early warning signals are reasons for delays, cost overruns, and mistakes. Often we focus too much on the specific base project case and ignore what normally happens in similar situations (base rate frequency of outcomes—personal and others). Why should some project be any different from the long-term record of similar ones? George Bernard Shaw said: “We learn from history that man can never learn anything from history.”

The more independent steps that are involved in achieving a scenario, the more opportunities for failure and the less likely it is that the scenario will happen. We often underestimate the number of steps, people, and decisions involved.
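That underestimation compounds. A rough Monte Carlo sketch, with entirely made-up step reliabilities, compares the plan we imagine (only the steps we thought of) against the plan as it actually unfolds (the same steps plus the ones we forgot to count):

```python
import random

random.seed(0)

def simulate(step_probs, trials=100_000):
    """Fraction of trials in which every step succeeds."""
    wins = 0
    for _ in range(trials):
        if all(random.random() < p for p in step_probs):
            wins += 1
    return wins / trials

# The plan we imagine: five steps we know about (assumed 95% reliable each).
imagined = [0.95] * 5
# The plan as it unfolds: the same steps plus ones we never counted.
actual = [0.95] * 5 + [0.90, 0.97, 0.85, 0.92]

print(f"imagined plan succeeds about {simulate(imagined):.0%} of the time")
print(f"actual plan succeeds about   {simulate(actual):.0%} of the time")
```

A handful of forgotten steps, each individually quite reliable, is enough to turn a plan that looks like a safe bet into something closer to a coin flip.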

We can’t pretend that knowing about the conjunctive events bias will automatically stop us from having it. But when we are planning something whose success matters to us, it’s useful to run through our assumptions with this bias in mind. Assigning frequencies instead of probabilities can also show us where our assumptions might be leading us astray. In the housing example above, asking how often building permits get delayed in every hundred houses, versus how often permits get delayed and the electrical still goes in on time for the same hundred houses, makes it easier to see that the first outcome must be at least as frequent as the second.
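The frequency framing can be made concrete with a small count over a hypothetical hundred houses. The rates below (permits delayed 30% of the time, electrical on schedule 80% of the time, treated as independent for simplicity) are invented for illustration:

```python
import random

random.seed(1)

# Hypothetical rates, for illustration only.
houses = 100
delayed = 0
delayed_and_electrical_on_time = 0
for _ in range(houses):
    permit_delayed = random.random() < 0.30
    electrical_on_time = random.random() < 0.80
    delayed += permit_delayed
    delayed_and_electrical_on_time += permit_delayed and electrical_on_time

print(f"out of {houses} houses, permits delayed: {delayed}")
print(f"permits delayed AND electrical on time:  {delayed_and_electrical_on_time}")
```

Whatever rates you plug in, the second count can never exceed the first, because every house in the second tally is also in the first.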

It is also extremely useful to keep a decision journal for our major decisions, so that we can be more realistic in our estimates of the time and resources we need for future plans. The more realistic we are, the higher our chances of accomplishing what we set out to do.

The conjunctive events bias teaches us to be more pessimistic about plans and to consider the worst-case scenario, not just the best. We may assume things will always run smoothly, but disruption is the rule rather than the exception.