Tag: Human misjudgment

Unlikely Optimism: The Conjunctive Events Bias

When certain events need to take place to achieve a desired outcome, we’re overly optimistic that those events will happen. Here’s why we should temper those expectations.

***

Why are we so optimistic in our estimation of the cost and schedule of a project? Why are we so surprised when something inevitably goes wrong? If we want to get better at executing our plans successfully, we need to be aware of how the conjunctive events bias can throw us way off track.

We often overestimate the likelihood of conjunctive events—occurrences that must happen in conjunction with one another. The probability of a series of conjunctive events happening is lower than the probability of any individual event. This is often very hard for us to wrap our heads around. But if we don’t try, we risk seriously underestimating the time, money, and effort required to achieve our goals.
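The shrinking of joint probability is easy to see with a few lines of code. Here is a minimal sketch (the step probabilities are assumed purely for illustration):

```python
# A minimal sketch: the probability that every step in a plan succeeds
# shrinks as steps are added, even when each step looks like a safe bet.
# Assumes the steps are independent; real projects are often worse,
# because failures cascade.
def p_all_succeed(step_probs):
    """Probability that all independent steps succeed."""
    result = 1.0
    for p in step_probs:
        result *= p
    return result

# Ten steps, each 90% likely to go well on its own:
print(p_all_succeed([0.9] * 10))  # ≈ 0.35 — the plan as a whole succeeds barely a third of the time
```

Each individual step is a near-certainty, yet the conjunction of all ten is closer to a coin flip that lands against you.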

The Most Famous Bank Teller

In Thinking, Fast and Slow, Daniel Kahneman gives a now-classic example of the conjunctive events bias. Students at several major universities received a description of a woman. They were told that Linda is 31, single, intelligent, a philosophy major, and concerned with social justice. Students were then asked to estimate which of the following statements was more likely to be true:

  • Linda is a bank teller.
  • Linda is a bank teller and is active in the feminist movement.

The majority of students (85% to 95%) chose the latter statement, seeing the conjunctive events (that she is both a bank teller and a feminist activist) as more probable. Two events together seemed more likely than one event. It’s perfectly possible that Linda is a feminist bank teller. It’s just not more probable for her to be a feminist bank teller than it is for her to be a bank teller. After all, the first statement does not exclude the possibility of her being a feminist; it just does not mention it.

The logic underlying the Linda example can be summed up as follows: The extension rule in probability theory states that if B is a subset of A, B cannot be more probable than A. Likewise, the probability of A and B together cannot be higher than the probability of A alone or of B alone. Broader categories are always more probable than their subsets. It’s more likely a randomly selected person is a parent than it is that they are a father. It’s more likely someone has a pet than that they have a cat. It’s more likely someone likes coffee than that they like cappuccinos. And so on.
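The extension rule can be verified by simple counting over a hypothetical population. This is an illustrative sketch (the numbers are assumed, not from the study):

```python
# Hedged sketch: counting over a made-up population shows why a subset
# ("bank teller AND feminist") can never be more probable than its
# superset ("bank teller"). All counts here are illustrative assumptions.
population = 1000
bank_tellers = 50          # everyone who is a bank teller
feminist_tellers = 20      # must be drawn from the 50 tellers — a subset

p_teller = bank_tellers / population
p_teller_and_feminist = feminist_tellers / population

# The conjunction is bounded by its broader category, by construction:
assert p_teller_and_feminist <= p_teller
print(p_teller, p_teller_and_feminist)  # 0.05 0.02
```

However you pick the counts, the feminist bank tellers are always counted from within the bank tellers, so the conjunction can never come out ahead.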

It’s not that we always think conjunctive events are more likely. If the second option in the Linda Problem were ‘Linda is a bank teller and likes to ski’, maybe we’d all pick just the bank teller option, because nothing in her description makes the skiing addition plausible. The point here is that given what we know about Linda, we think it’s likely she’s a feminist. Therefore, we are willing to add almost anything to the Linda package if it appears with ‘feminist’. This willingness to build a narrative out of pieces that don’t actually belong together is the real danger of the conjunctive events bias.

“Plans are useless, but planning is indispensable.” 

— Dwight D. Eisenhower

Why the Best-Laid Plans Often Fail

The conjunctive events bias makes us underestimate the effort required to accomplish complex plans. Most plans don’t work out. Things almost always take longer than expected. There are always delays due to dependencies. As Max Bazerman and Don Moore explain in Judgment in Managerial Decision Making, “The overestimation of conjunctive events offers a powerful explanation for the problems that typically occur with projects that require multistage planning. Individuals, businesses, and governments frequently fall victim to the conjunctive events bias in terms of timing and budgets. Home remodeling, new product ventures, and public works projects seldom finish on time.”

Plans don’t work out because completing a sequence of tasks requires a great deal of cooperation among multiple events. As a system becomes increasingly complex, the chance of failure increases. A plan can be thought of as a system, so a change in one component will very likely affect the functioning of other parts. The more components you have, the more chances that something will go wrong in one of them, causing delays, setbacks, and failures in the rest of the system. Even if the chance of any individual component failing is slight, a large number of components will increase the overall probability of failure.

Imagine you’re building a house. Things start off well. The existing structure comes down on schedule. Construction continues and the framing goes up, and you are excited to see the progress. The contractor reassures you that all trades and materials are lined up and ready to go. Which is more likely:

  • The building permits get delayed
  • The building permits get delayed and the electrical goes in on schedule

You know a bit about the electrical schedule. You know nothing about the permits. But you optimistically bundle them together, erroneously linking one to the other. So you don’t worry about the building permits and never imagine that their delay will impact the electrical. When the permits do get delayed, you have to pay the electrician for the week he can’t work, and then wait for him to finish another job before he can resume yours.

Thus, the more steps involved in a plan, the greater the chance of failure, as we associate probabilities with events that aren’t related at all. That is especially true as more people get involved, bringing their individual biases and misconceptions of chance.

In Seeking Wisdom: From Darwin to Munger, Peter Bevelin writes:

A project is composed of a series of steps where all must be achieved for success. Each individual step has some probability of failure. We often underestimate the large number of things that may happen in the future or all opportunities for failure that may cause a project to go wrong. Humans make mistakes, equipment fails, technologies don’t work as planned, unrealistic expectations, biases including sunk cost-syndrome, inexperience, wrong incentives, changing requirements, random events, ignoring early warning signals are reasons for delays, cost overruns, and mistakes. Often we focus too much on the specific base project case and ignore what normally happens in similar situations (base rate frequency of outcomes—personal and others). Why should some project be any different from the long-term record of similar ones? George Bernard Shaw said: “We learn from history that man can never learn anything from history.”

The more independent steps that are involved in achieving a scenario, the more opportunities for failure and the less likely it is that the scenario will happen. We often underestimate the number of steps, people, and decisions involved.

We can’t pretend that knowing about the conjunctive events bias will automatically stop us from having it. When we are planning something whose success matters to us, however, it’s useful to run through our assumptions with this bias in mind. Sometimes, assigning frequencies instead of probabilities can also show us where our assumptions might be leading us astray. In the housing example above, asking how often building permits are delayed in every hundred houses, versus how often permits are delayed and the electrical still goes in on time for the same hundred, makes it much easier to see that the first event is the more frequent one.
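Frequency framing can be made concrete with a back-of-the-envelope tally. The counts below are assumed for illustration, not drawn from any dataset:

```python
# Illustrative frequencies for the housing example (all counts assumed):
# out of 100 builds, how often are permits delayed, and how often are
# permits delayed AND the electrical still goes in on schedule?
houses = 100
permits_delayed = 30                  # assumed: 30 of 100 builds hit a permit delay
delayed_and_electrical_on_time = 10   # a subset of those 30 — necessarily smaller

print(f"{permits_delayed} in {houses} builds: permits delayed")
print(f"{delayed_and_electrical_on_time} in {houses} builds: permits delayed AND electrical on time")
# The conjunction can never describe more of the hundred houses
# than the single event does.
```

Whatever counts you plug in, the second tally is drawn from within the first, so thinking in houses-out-of-a-hundred makes the extension rule feel obvious in a way that abstract probabilities often don’t.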

It is also extremely useful to keep a decision journal for our major decisions, so that we can be more realistic in our estimates of the time and resources we need for future plans. The more realistic we are, the higher our chances of accomplishing what we set out to do.

The conjunctive events bias teaches us to be more pessimistic about plans and to consider the worst-case scenario, not just the best. We may assume things will always run smoothly but disruption is the rule rather than the exception.

Human Misjudgment and the American Revolution

We try to look at mental models in history through the lens of people who got it right, but once in a while, it’s beneficial to examine a model through the lens of those who got it wrong.

In this case, let’s take a look at the remarkable series of misjudgments that resulted in the British losing their American colonies.

Our list of mental models includes 24 models in the human nature + judgment category, and at least seven of those were a factor in the British being driven out of America. Sometimes it helps to understand how great the consequences of these very human tendencies can be. And, perhaps more significantly, how a large group of people can succumb to them at the same time.

Bias from Incentives

Money, the root of stupidity.

In the mid-18th century, the British had a parliament, but it was very different from what exists today. As Barbara W. Tuchman describes in The March of Folly, the House of Commons was made up mostly of second sons of the nobility – the landowning class. Urban centers such as London were poorly represented, and not surprisingly Parliament tended to pass laws that were primarily good for its members.

A lot of the issues which ultimately led to the revolution were about money.

The British wanted to tax the colonies, as Tuchman explains, so they would at least pay for their own defense, which was costly. The colonists felt that, with the exception of trade tariffs, the British had no right to tax those who were not represented in Parliament.

So part of the reason Parliament passed incendiary legislation, taxing, for example, stamps and tea, was so that the members of parliament, the landowners, could pay less tax. This was short-sighted — an incentive that could never be realized. As Tuchman describes, some more thoughtful dissenters pointed out that the cost of collecting the taxes from the hostile colonists was more than what the taxes would bring in.

Tendency to distort due to disliking/hating

We have written before that “Our inability to examine the situation from all sides and shake our beliefs, together with self-justifying behavior, can lead us to conclude that others are the problem. Such asymmetric views, amplified by strong perceived differences, often fuel hate.”

One of the things that Tuchman points out repeatedly is the complete ignorance of the British when it came to the sensibilities and interests of the Americans. And we can’t blame this on the distance or the comparatively slow speed of communication. What Tuchman highlights as most startling is that those in positions of power in Parliament had literally no desire to understand the colonists’ position. “That the British were invincibly uninformed – and stayed uninformed – about the people they insisted on ruling was a major problem of the imperial-colonial relationship.”

Parliament did not seek the advice or opinion of those Brits who had spent time in the colonies as Administrators, nor did it interview the well-educated and thoughtful Americans who were in London, such as Benjamin Franklin.

Believing they were the pinnacle of humanity, the British nobility allowed their sense of superiority and their dislike of the colonists to distort the policies they pursued. (Remember: history doesn’t repeat, but it rhymes.) As Tuchman writes, “Attitude was again the obstacle; the English could not visualize Americans in terms of equality.”

You certainly don’t declare war on people you admire and respect.

Denial Tendency

To stubbornly pursue a course of action in the face of evidence that it will eventually blow up in your face is denial. We all do it, but to do it as a political group can lose you a war.

The American Revolution did not start without warning. There were years of attempts by the British to assert control over the colonies. As Tuchman describes, they would institute taxes, then rescind them, only to reinstate them later. The colonists had the same response every time: they rejected the ability of the British to tax them. It was total denial that kept the British trying.

The British passed a series of acts, called the Coercive Acts, that seemed designed to piss off the Americans. But in reality, it was more about the total inability of the British to see the situation clearly. Tuchman says, “if Britain had really been pursuing a plan to goad the colonies to insurrection in order to subjugate them, then her conduct of policy becomes rational. Unhappily for reason, that version cannot be reconciled to the repeals, the backings and fillings, the haphazard or individual decisions.”

As we mentioned earlier, the cost of bringing in the tax was more than the tax itself. And if taxation was the issue that was driving the colonies to war, then why keep doing it? Denial is likely part of the answer.

Social Proof

“When we feel uncertain, we all tend to look to others for answers as to how we should behave, what we should think and what we should do.”

This is social proof.

The House of Commons was not a homogeneous unit; there were dissenters to the British approach in the American colonies, though these voices were always in the minority. Some people argued against the taxes and the war, offering Parliament alternatives that would keep the colonies part of the empire. But the majority followed their peers.

Added to this was the fact that, as Tuchman describes, the situation in America wasn’t a hot issue for most British. The nobility of the House of Commons was frequently more occupied with the various social scandals that occurred in their ranks.

What this helped to create was a situation of largely uninformed people responsible for voting on legislation that could have significant impacts. It is a human tendency to look to the majority for guidance on behavior when we are unsure about what to do. It is always easier to go with the majority than to oppose it. In the House of Commons, it was easier to vote with the majority than to take a stand against it, particularly if one wasn’t all that interested in the issue.

First-Conclusion Bias

We tend to stick with the first conclusion we reach. Because of our commitment to our own narrative, it becomes very hard for us to change our minds once we form a definite opinion. This involves us admitting we made a mistake — something we avoid, as it can challenge our very sense of self.

The core issue that started the conflict between Britain and the American colonies, which eventually led to the war, was, as Tuchman describes, the absolute conviction of the British that they had a right to directly tax the colonies, and the equal conviction of the Americans that no such right existed.

At the beginning, the Americans did attempt some compromise. The British, however, never did.

Despite the dissent, the cost, and the effects, the British never reexamined their first conclusion. It became layered with other issues but remained at the core of their position. Tuchman demonstrates that “they persisted in first pursuing, then fighting for an aim whose result would be harmful whether they won or lost.”

Their first conclusion, the right of the British state to tax the American colonies, was never abandoned or modified in light of what enforcing it would actually result in. Even if it were true, the absolute nature of their position prevented them from finding a compromise. This bias was a contributing factor in the result the British finally had to accept: the loss of the war.

Commitment and Consistency Bias

Partnered with the first conclusion bias, this one essentially reinforces the pain. This is what causes us to “stick with our original decision, even in the face of new information.”

Although consistency is generally perceived to be good, uncompromising consistency is more synonymous with ignorance and fear. If torpedoes are aimed at your boat, your crew might appreciate you turning it around, giving yourself time to regroup.

The British made attempts to solve their problems, but these were halfhearted at best. Tuchman actually depicts British policy as not being consistent at all. They levied taxes, then repealed them. They eventually sent a peace delegation but gave it no power to actually come to a compromise.

But they were fully committed to their overall attitude, which was, as Tuchman writes, “a sense of superiority so dense as to be impenetrable. A feeling of this kind leads to ignorance of the world and of others because it suppresses curiosity. [All] ministries went through a full decade of mounting conflict with the colonies without any of them sending a representative, much less a minister, across the Atlantic to make acquaintance, to discuss, to find out what was spoiling, even endangering, the relationship and how it might be better managed. They were not interested in Americans because they considered them rabble or at best children whom it was inconceivable to treat – or even fight – as equals.”

Given that this attitude of superiority was so entrenched, is it any wonder that the decisions made were those that reinforced this image?

Tendency to Want to Do Something

Busyness signals productivity. The faster you are walking, the more important you are. Having time on your hands means you aren’t doing enough, not seizing the day, not contributing anything of value. Slow walkers are assumed to be seniors, students, or those who have nothing going on.

We can see the same trends in governments. Strong governments defend their position at all costs, while those who value negotiating or finding common ground are perceived as weaker. Powerful governments go to war. Those with less power find a compromise.

Tuchman claims, “Confronted by menace, or what is perceived as menace, governments will usually attempt to smash it, rarely to examine it, understand it, define it.”

So many times during the decade of conflict between the British and the Americans, the British might have put themselves in a better position if they had been willing to pause, regroup, or even walk away. Given some space, they might have compensated for the load of biases they were operating under and focused on defining a win-win solution.

But all the misjudgments flying around, combined with the innate human tendency to do something, led to chasing bad decisions with even worse ones.

If there is a silver lining, it’s that we can learn from our mistakes so as to not be perpetual victims of our misjudgment tendencies.

Tuchman concludes that the British did learn from their experiences during the American Revolution.

“Fifty years later, after a period of troubled relations with Canada, Commonwealth status began to emerge from the Durham Report, which resulted from England’s recognition that any other course would lead to a repetition of the American rebellion.”