
A Wandering Mind: How Travel Can Change the Way You Think

Most people travel as observers and, as a result, “see” a lot. When you travel as an active participant, the experience can transform the way you think and how you see the world.

***

Here’s a situation familiar to many of us: We decide to take a vacation and go somewhere exotic. We plan the trip and mark our calendars, and as the date gets closer we get increasingly excited. Before we step on the plane, the possibilities seem endless. Anything could happen! Accidental encounters and adventures could change our lives!

We go. We have a good time. We see what we wanted to and enjoy the break from work. Upon returning home, we share the pictures and recount some of our experiences with friends. We give away the souvenirs. We step back into our lives. The glow fades and we settle into planning the next round of travel in our daydreams.

In the end, it’s a little sad. That incredible experience becomes like a mirage or a dream—similar to watching a movie, but a lot more expensive.

What if it doesn’t have to be like this?

Travel without participation and reflection is entertainment. Try to notice yourself in the journey, and capture the experience and insights as you interact with all the new things you encounter. You can get more out of your travel by using mental models to weave yourself into the experience, coming away enriched as well as entertained and rested.

First, inspiration from the past …

Just over 200 years ago, Mary Wollstonecraft, philosopher, feminist, and author of A Vindication of the Rights of Woman, was going through an emotionally difficult period. Her lover—the father of her child—wasn’t interested in being with her anymore. She was devastated and frustrated. As a philosopher, she believed it was important to live according to the ideals she espoused. The realities facing a middle-class woman in 18th-century England made that very hard. Women had essentially no rights. Having a child out of wedlock might have supported her ideas regarding how oppressive the institution of marriage was for women, but without the support of the child’s father, she knew she would struggle financially and socially. It was one of the lowest points of her life.

Wollstonecraft went to Scandinavia, mostly to recover some money for her lover and thus try to win him back. In this she failed. But she captured her journey in Letters Written During a Short Residence in Sweden, Norway, and Denmark. In doing so, she revolutionized travel writing and healed herself.

The Letters offer remarkable insight into Wollstonecraft’s lively mind. As she moves through the unfamiliar surroundings of three foreign countries, she asks herself questions and explores the ideas brought to mind. Observing the agricultural development of Norway, in many ways behind that of England at the time, she asks, “And, considering the question of human happiness, where, oh where does it reside? Has it taken up its abode with unconscious ignorance or with the high-wrought mind?”

She learns why the locals are nervous about serving coffee and how different their fashions are. She comments on the different gardening practices and the beauty of the trees. In contemplating how the Norwegians organize their social hierarchy, she makes comparisons to England and draws conclusions about her native country—namely that the way things are is not necessarily how they have to be.

Most importantly, she records what effect the traveling has on her. “When a warm heart has received strong impressions, they are not to be effaced. Emotions become sentiments, and the imagination renders even transient sensations permanent by fondly retracing them. I cannot, without a thrill of delight, recollect views I have seen, which are not to be forgotten, nor looks I have felt in every nerve, which I shall never more meet.”

Here are some goals we can construct from Wollstonecraft’s approach to travel:

  1. Try to actively know the place you are in. Observe the customs. Interact with the locals.
  2. Learn the whys behind the observation. Explore the history. Ask questions. Try to understand the answers in relation to what you are experiencing now, setting aside any previous assumptions.
  3. Notice how the journey is affecting you. What memories surface? What new insights do you have? Are your opinions and beliefs challenged?
  4. Don’t plan out every detail. Explore. The map is not the territory.

So how do we put those goals into practice?

Here is where mental models can amplify the travel experience.

We all have a tendency to generalize from small samples. Our own little world becomes, without the infusion of new experiences, our frame for understanding the entire world. Travel broadens your sample set. You start to really understand the universals of the human condition versus the particulars of the area you occupy.

Travel is a great way to counter confirmation bias. Chances are, people in a different country will think differently from you, and everyday interactions there won’t simply reinforce your usual feedback loop. You will be exposed to new ideas and ways of approaching life that can remind you of the options you have when you go back home.

You can apply the power of algebraic equivalence. In algebra, as we solve abstractions such as x + y = 8, we learn that values can be equal without looking exactly the same. When you explore other cultures and ways of living, you see that there are many definitions of a good life and many ways to be happy. You begin to understand that equality of experience is different from sameness of experience. Not everyone wants what you want. This diversity in how we manifest our goals and desires accounts for differences in everything from personal philosophy to product markets.
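The algebra analogy can be made concrete. As a minimal sketch (the numbers here are just illustration), every pair below satisfies the same equation, yet no two pairs look alike:

```python
# Many different-looking pairs of values satisfy the same constraint: x + y = 8.
# Equality under the constraint is not the same thing as sameness of the parts.
solutions = [(x, 8 - x) for x in range(9)]  # non-negative integer pairs

for x, y in solutions:
    assert x + y == 8  # every pair is "equal" in the sense that matters

# Yet the pairs themselves are all distinct:
assert len(set(solutions)) == 9
```

Nine distinct pairs, one shared sum: equality of outcome without sameness of form, which is the point the analogy makes about the many shapes of a good life.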

The distance from your regular life can give you perspective. In the terms of Galilean relativity, you get to be the scientist instead of the fish. The lens of travel can help you untangle problems back at home in many ways. The distance, both physical and psychological, also gives you the opportunity to observe yourself in your regular life without the day-to-day pressures clouding your judgment.

Try these specific tips to apply this mental models approach to travel:

  1. Keep a travel journal. It doesn’t have to be complicated. Travel is full of idle moments like waiting for transportation, or museum-feet recovery at the end of the day. Reflect and capture.
  2. Encourage serendipity in your experiences. Give yourself the chance to experience the unexpected. Over-planning reinforces your current biases. You can’t possibly know the best of a place before you get there.
  3. Be deliberate in setting your goal. Go somewhere with the intent of gaining something out of that experience. Don’t try to recreate your life at home, with the same restaurants and television shows.
  4. Be open to growth. Travel is an opportunity to choose to be different. Anticipate that you might add to the construct that is “you” when you travel. Embrace the additions to your identity so that you have new resources to draw on.

Through considering mental models and staying actively engaged, travel can jolt you awake, and show you the world in a different light.

The Narratives of History: Applying Lessons from the Past

“History is written by the winners” is the popular view. But your winner may not be my winner. A lot depends on the narrative you are trying to build.

History is rewritten all the time.

Sometimes it is rewritten because new information has come to light, perhaps from an archeological find or previously classified documents. When this happens, it is exciting. We joyfully anticipate that more information will deepen our understanding.

But rewriting frequently happens in the service of building a cultural or national narrative. We highlight bits of the past that support our perceived identities and willfully ignore the details that don’t fit. We like our history uncomplicated. It’s hard for us to understand our groups or our countries, and by extension ourselves, as both good and not-good at the same time.

Culture is collective memory. It’s the interwoven stories that we use to explain who we are as nations, organizations, or just loosely formed groups.

Many of us belong to multiple cultural groups, but only one national group. Margaret MacMillan, in The Uses and Abuses of History, explains that “Collective memory is more about the present than the past because it is integral to how the group sees itself.” And “while collective memory is usually grounded in fact, it need not be.”

We have seen how people justify all kinds of mistakes to preserve the personal narratives they are invested in, and groups also engage in this behavior. Countries rewrite their histories, from the textbook up, to support how they see themselves now. Instinctively we may recoil from this idea, believing that it’s better to turn over all the rocks and confront what is lurking underneath. However, as MacMillan writes, “It can be dangerous to question the stories people tell about themselves because so much of our identity is both shaped by and bound up with our history. That is why dealing with the past, in deciding on which version we want, or on what we want to remember and to forget, can become so politically charged.”

For example, when Canada’s new war museum opened, controversy immediately ensued because part of the World War II exhibit called attention “to the continuing debate over both the efficacy and the morality of the strategy of the Royal Air Force’s bomber command, which sought to destroy Germany’s capacity to fight on by massive bombing of German industrial and civilian targets.” RAF veterans were outraged that their actions were considered morally ambiguous. Critics of the exhibit charged that the veterans should have the final say because, after all, “they were there.”

We can see that this rationale makes no sense. Galilean relativity shows that the pilots who flew the bombing campaigns are actually the least likely to have an objective understanding of the events. And the ends don’t always justify the means. It is possible to do bad things in the pursuit of morally justified outcomes.

MacMillan warns that the danger of abusing history is that it “flattens out the complexity of human experience and leaves no room for different interpretations of the past.”

Which leaves us asking, What do we want from history? Do we want to learn from it, with the hopes that in doing so we will avoid mistakes by understanding the experiences of others? Or do we want to practice self-justification on a national level, reinforcing what we already believe about ourselves in order to justify what we did and what we are doing? After all, “you could almost always find a basis for your claims in the past if you looked hard enough.”

As with medicine, there is a certain fallibility to history. Our propensity to fool ourselves with self-justified narratives is hard to overcome. If we selectively use the past only to reinforce our claims in the present, then the situation becomes precarious when there is pressure to change. Instead of looking as objectively as possible at history, welcoming historians who challenge us, we succumb to confirmation bias, allowing only those interpretations that are consistent with the narrative we are invested in.

Consider what MacMillan writes about nationalism, which “is a very late development indeed in terms of human history.”

It all started so quietly in the nineteenth century. Scholars worked on languages, classifying them into different families and trying to determine how far back into history they went. They discovered rules to explain changes in language and were able to establish, at least to their own satisfaction, that texts centuries old were written in early forms of, for example, German or French. Ethnographers like the Grimm brothers collected German folk tales as a way of showing that there was something called the German nation in the Middle Ages. Historians worked assiduously to recover old stories and pieced together the history of what they chose to call their nation as though it had an unbroken existence since antiquity. Archaeologists claimed to have found evidence that showed where such nations had once lived, and where they had moved to during the great waves of migrations.

The cumulative result was to create an unreal yet influential version of how nations formed. While it could not be denied that different peoples, from Goths to Slavs, had moved into and across Europe, mingling as they did so with peoples already there, such a view assumed that at some point, generally in the Middle Ages, the music had stopped. The dancing pieces had fallen into their chairs, one for the French, another for the Germans and yet another for the Poles. And there history had fixed them as “nations.” German historians, for example, could depict an ancient German nation whose ancestors had lived happily in their forests from before the time of the Roman Empire and which at some time, probably in the first century A.D., had become recognisably “German.” So — and this was the dangerous question — what was properly the German nation’s land? Or the land of any other “nation”? Was it where the people now lived, where they had lived at the time of their emergence in history, or both?

Would the scholars have gone on with their speculations if they could have seen what they were preparing the way for? The bloody wars that created Italy and Germany? The passions and hatred that tore apart the old multinational Austria-Hungary? The claims, on historical grounds, by new and old nations after World War I for the same pieces of territory? The hideous regimes of Hitler and Mussolini with their elevation of the nation and the race to the supreme good and their breathtaking demands for the lands of others?

When we selectively reach back into the past to justify claims in the present, we reduce the complexity of history and of humanity. This puts us in an awkward position because the situations we are confronted with are inherently complex. If we cut ourselves off from the full scope of history because it makes us uncomfortable, or doesn’t fit with the cultural narrative in which we live, we reduce our ability to learn from the past and apply those lessons to the situations we are facing today.

MacMillan says, “There are also many lessons and much advice offered by history, and it is easy to pick and choose what you want. The past can be used for almost anything you want to do in the present. We abuse it when we create lies about the past or write histories that show only one perspective. We can draw our lessons carefully or badly. That does not mean we should not look to history for understanding, support and help; it does mean that we should do so with care.”

We need to accept that people can do great things while still having flaws. Our heroes don’t have to be perfect, and we can learn just as much from their imperfections as from their achievements.

We have to allow that there are at least two sides to every story, and we have to be willing to listen to both. There are no conflicts in which one side doesn’t feel morally justified in their actions; that’s why your terrorist can be my freedom fighter. History can be an important part of bridging this divide only if we are willing to lift up all the rocks and shine our lights on what is lurking underneath.

Confirmation Bias And the Power of Disconfirming Evidence

Confirmation bias is our tendency to cherry-pick information that confirms our existing beliefs or ideas. Confirmation bias explains why two people with opposing views on a topic can see the same evidence and come away feeling validated by it. This cognitive bias is most pronounced in the case of ingrained, ideological, or emotionally charged views.

Failing to interpret information in an unbiased way can lead to serious misjudgments. By understanding this, we can learn to identify it in ourselves and others. We can be cautious of data that seems to immediately support our views.

Confirmation Bias: Why You Should Seek Out Disconfirming Evidence

When we feel as if others “cannot see sense,” a grasp of how confirmation bias works can enable us to understand why. Willard V. Quine and J.S. Ullian described this bias in The Web of Belief as follows:

The desire to be right and the desire to have been right are two desires, and the sooner we separate them the better off we are. The desire to be right is the thirst for truth. On all counts, both practical and theoretical, there is nothing but good to be said for it. The desire to have been right, on the other hand, is the pride that goeth before a fall. It stands in the way of our seeing we were wrong, and thus blocks the progress of our knowledge.

Experimentation beginning in the 1960s revealed our tendency to confirm existing beliefs rather than question them or seek new ones. Later research has also revealed how single-mindedly we defend the ideas we already hold.

“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.”

— Warren Buffett

Like many mental models, confirmation bias was first identified by the ancient Greeks. In The History of the Peloponnesian War, Thucydides described the tendency as follows:

For it is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.

Our use of this cognitive shortcut is understandable. Evaluating evidence (especially when it is complicated or unclear) requires a great deal of mental energy, so our brains prefer shortcuts, which save time when we’re under pressure to decide. As many evolutionary scientists have pointed out, our minds are ill-equipped for the modern world. For most of human history, people encountered very little new information during their lifetimes, and decisions tended to be survival-based. Now we constantly receive new information and have to make numerous complex choices each day. To stave off feeling overwhelmed, we fall back on shortcuts.

In “The Case for Motivated Reasoning,” Ziva Kunda wrote, “we give special weight to information that allows us to come to the conclusion we want to reach.” Accepting information that confirms our beliefs is easy and requires little mental energy. Contradicting information causes us to shy away, grasping for a reason to discard it.
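Kunda’s point about weighting can be sketched as a toy simulation. The update rule and weights below are illustrative assumptions, not a model from her paper: two agents see the same balanced stream of evidence, but one gives extra weight to whatever confirms the side it already leans toward.

```python
import random

def update_belief(belief, evidence, bias=0.0):
    """Toy update rule: nudge a belief (0..1) toward each piece of evidence.

    A biased agent weights evidence that agrees with its current leaning
    more heavily, and evidence that disagrees more lightly.
    """
    confirming = (evidence >= 0.5) == (belief >= 0.5)
    rate = 0.1 * (1 + bias) if confirming else 0.1 * (1 - bias)
    return belief + rate * (evidence - belief)

random.seed(1)
stream = [random.random() for _ in range(2000)]  # balanced evidence, mean ~0.5

fair = biased = 0.7  # both agents start out mildly convinced
for e in stream:
    fair = update_belief(fair, e, bias=0.0)
    biased = update_belief(biased, e, bias=0.8)

# The fair agent drifts toward the evidence's true center (around 0.5);
# the biased agent stays well above it on exactly the same data.
assert abs(fair - 0.5) < 0.2
assert biased > fair
```

The only difference between the two agents is the weighting, and that alone is enough to keep the biased one convinced in the face of balanced evidence.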

In The Little Book of Stupidity, Sia Mohajer wrote:

The confirmation bias is so fundamental to your development and your reality that you might not even realize it is happening. We look for evidence that supports our beliefs and opinions about the world but excludes those that run contrary to our own… In an attempt to simplify the world and make it conform to our expectations, we have been blessed with the gift of cognitive biases.

“The human understanding when it has once adopted an opinion, draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects.”

— Francis Bacon

How Confirmation Bias Clouds Our Judgment

The complexity of confirmation bias arises partly from the fact that it is nearly impossible to overcome without an awareness of the concept. Even when shown evidence that contradicts a biased view, we may still interpret it in a manner that reinforces our current perspective.

In one Stanford study, half of the participants were in favor of capital punishment, and the other half were opposed to it. Both groups read details of the same two fictional studies. Half of the participants were told that one study supported the deterrent effect of capital punishment and the other opposed it; the remaining participants read the opposite information. Either way, the majority of participants stuck to their original views, pointing to the data that supported them and discarding the data that did not.

Confirmation bias clouds our judgment. It gives us a skewed view of information, even when it consists only of numerical figures. Understanding this can transform how we see the world, or rather, our perspective on it. Lewis Carroll stated, “we are what we believe we are,” but it seems that the world is also what we believe it to be.

A poem by Shannon L. Alder illustrates this concept:

Read it with sorrow and you will feel hate.
Read it with anger and you will feel vengeful.
Read it with paranoia and you will feel confusion.
Read it with empathy and you will feel compassion.
Read it with love and you will feel flattery.
Read it with hope and you will feel positive.
Read it with humor and you will feel joy.
Read it without bias and you will feel peace.
Do not read it at all and you will not feel a thing.

Confirmation bias is somewhat linked to our memories (similar to availability bias). We have a penchant for recalling evidence that backs up our beliefs. However neutral the original information was, we fall prey to selective recall. As Leo Tolstoy wrote:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

“Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the destruction of their original evidential bases.”

— Lee Ross and Craig Anderson

Why We Ignore Contradicting Evidence

Why is it that we struggle to even acknowledge information that contradicts our views?

When first learning about the existence of confirmation bias, many people deny that they are affected. After all, most of us see ourselves as intelligent, rational people. So, how can our beliefs persevere even in the face of clear empirical evidence? Even when something is proven untrue, many entirely sane people continue to find ways to mitigate the subsequent cognitive dissonance.

Much of this is the result of our need for cognitive consistency. We are bombarded by information. It comes from other people, the media, our experience, and various other sources. Our minds must find means of encoding, storing, and retrieving the data we are exposed to. One way we do this is by developing cognitive shortcuts and models. These can be either useful or unhelpful.

Confirmation bias is one of the less helpful heuristics that results. How we interpret new information is shaped by the beliefs we already hold, and belief-consistent information is more likely to be recalled. As a consequence, we tend to see more and more evidence that reinforces our worldview: confirmatory data is taken seriously, while disconfirming data is treated with skepticism. Our assimilation of information is subject to deep bias.

Constantly re-evaluating our worldview is exhausting, so we prefer to strengthen it instead. Besides, holding conflicting ideas in our heads is hard work; it’s much easier to focus on just one.

We ignore contradictory evidence because it is so unpalatable for our brains. According to research by Jennifer Lerner and Philip Tetlock, we are motivated to think critically only when held accountable by others. If we are expected to justify our beliefs, feelings, and behaviors to others, we are less likely to be biased towards confirmatory evidence. This is less out of a desire to be accurate, and more the result of wanting to avoid negative consequences or derision for being illogical. Ignoring evidence can be beneficial, such as when we side with the beliefs of others to avoid social alienation.

Examples of Confirmation Bias in Action

Creationists vs. Evolutionary Biologists
A prime example of confirmation bias can be seen in the clashes between creationists and evolutionary biologists. The latter use scientific evidence and experimentation to reveal the process of biological evolution over millions of years. The former see the Bible as being true in the literal sense and think the world is only a few thousand years old. Creationists are skilled at mitigating the cognitive dissonance caused by factual evidence that disproves their ideas. Many consider the non-empirical “evidence” for their beliefs (such as spiritual experiences and the existence of scripture) to be of greater value than the empirical evidence for evolution.

Evolutionary biologists have used fossil records to show that evolution has occurred over millions of years. Meanwhile, some creationists view the same fossils as planted by a god to test our beliefs, while others claim that fossils are proof of the global flood described in the Bible. They ignore evidence that contradicts these ideas and instead reinterpret it to confirm what they already think.

Doomsayers
Take a walk through London on a busy day, and you are pretty much guaranteed to see a doomsayer on a street corner, ranting about the upcoming apocalypse. Return a while later and you will find them still there, announcing that the end has been postponed.

In When Prophecy Fails, Leon Festinger explained the phenomenon this way:

Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting people to his view.

Music

Confirmation bias in music is interesting because it is actually part of why we enjoy it so much. According to Daniel Levitin, author of This Is Your Brain on Music:

As music unfolds, the brain constantly updates its estimates of when new beats will occur, and takes satisfaction in matching a mental beat with a real-in-the-world one.

Witness the way a group of teenagers will act when someone puts on “Wonderwall” by Oasis or “Creep” by Radiohead. Or how their parents react to “Starman” by Bowie or “Alone” by Heart. Or even their grandparents to “The Way You Look Tonight” by Sinatra or “Non, Je ne Regrette Rien” by Edith Piaf. The ability to predict each successive beat or syllable is intrinsically pleasurable. This is a case of confirmation bias serving us well. We learn to understand musical patterns and conventions, enjoying seeing them play out.

Homeopathy
The multibillion-dollar homeopathy industry is an example of mass confirmation bias.

Homeopathy itself dates back to Samuel Hahnemann in the late eighteenth century, but its most famous modern defense came from Jacques Benveniste, a French researcher studying histamines. Benveniste became convinced that as a solution of histamines was diluted, its effectiveness increased, due to what he termed “water memory.” His tests were performed without blinding, leaving the results wide open to the placebo effect.

Benveniste was so certain of his hypothesis that he found data to confirm it and ignored data that did not. Other researchers repeated his experiments with appropriate blinding and could not reproduce his results. Many of the people who worked with him withdrew from science as a result.

Yet homeopathy supporters have only grown in numbers. Supporters cling to any evidence to support homeopathy while ignoring that which does not.

“One of the biggest problems with the world today is that we have large groups of people who will accept whatever they hear on the grapevine, just because it suits their worldview—not because it is actually true or because they have evidence to support it. The striking thing is that it would not take much effort to establish validity in most of these cases… but people prefer reassurance to research.”

— Neil deGrasse Tyson

Scientific Experiments
In good scientific experiments, researchers should seek to falsify their hypotheses, not to confirm them. Unfortunately, this is not always the case (as shown by homeopathy). There are many cases of scientists interpreting data in a biased manner, or repeating experiments until they achieve the desired result. Confirmation bias also comes into play when scientists peer-review studies. They tend to give positive reviews of studies that confirm their views and of studies accepted by the scientific community.

This is problematic. Inadequate research programs can continue past the point where the evidence points to a false hypothesis. Confirmation bias wastes a huge amount of time and funding. We must not take science at face value and must be aware of the role of biased reporting.

“The eye sees only what the mind is prepared to comprehend.”

— Robertson Davies

Conclusion

This article can provide an opportunity for you to assess how confirmation bias affects you. Consider looking back over the previous paragraphs and asking:

  • Which parts did I automatically agree with?
  • Which parts did I ignore or skim over without realizing?
  • How did I react to the points which I agreed or disagreed with?
  • Did this post confirm any ideas I already had? Why?
  • What if I thought the opposite of those ideas?

Being cognizant of confirmation bias is not easy, but with practice, it is possible to recognize the role it plays in the way we interpret information. You need to actively search out disconfirming evidence.

As Rebecca Goldstein wrote in Incompleteness: The Proof and Paradox of Kurt Gödel:

All truths — even those that had seemed so certain as to be immune to the very possibility of revision — are essentially manufactured. Indeed, the very notion of the objectively true is a socially constructed myth. Our knowing minds are not embedded in truth. Rather, the entire notion of truth is embedded in our minds, which are themselves the unwitting lackeys of organizational forms of influence.

To learn more about confirmation bias, read The Little Book of Stupidity or The Black Swan.

Nassim Taleb: How to Not be a Sucker From the Past

“History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative.”

— Nassim Taleb

The fact that new information about the past keeps coming to light means that our roadmap of history is always incomplete. There is a necessary fallibility to it, if you will.

In The Black Swan, Nassim Taleb writes:

History is useful for the thrill of knowing the past, and for the narrative (indeed), provided it remains a harmless narrative. One should learn under severe caution. History is certainly not a place to theorize or derive general knowledge, nor is it meant to help in the future, without some caution. We can get negative confirmation from history, which is invaluable, but we get plenty of illusions of knowledge along with it.

While I don’t entirely hold Taleb’s view, I think it’s worth reflecting on. As a friend put it to me recently, “when people are looking into the rearview mirror of the past, they can take facts and like a string of pearls draw lines of causal relationships that facilitate their argument while ignoring disconfirming facts that detract from their central argument or point of view.”

Taleb advises us to adopt the empirical skeptic approach of Menodotus, which was to “know history without theorizing from it,” and not to draw any grand theoretical or scientific claims from it.

We can learn from history, but our desire for causality can easily lead us down a dangerous rabbit hole when new facts come to light that discredit what we held to be true. In trying to reduce the cognitive dissonance, our confirmation bias leads us to reinterpret past events in a way that fits our current beliefs.

History is not static — we only know what we know currently, and what we do know is subject to change. The accepted beliefs about how events played out may change in light of new information, and then the newly accepted beliefs may change over time as well.

Falsification: How to Destroy Incorrect Ideas

Sir Karl Popper wrote that the nature of scientific thought is that we can never be sure of anything. The only way to test the validity of a theory is to try to prove it wrong, a process he labeled falsification. And it turns out we’re quite bad at falsification.

When it comes to testing a theory, we don’t instinctively try to find evidence that we’re wrong. It’s much easier and more mentally satisfying to find information that confirms our intuition. This is known as the confirmation bias.

“The human mind is a lot like the human egg, and the human egg has a shut-off device. When one sperm gets in, it shuts down so the next one can’t get in.”

— Charlie Munger

In Paul Tough’s book, How Children Succeed: Grit, Curiosity, and the Hidden Power of Character, he tells the story of English psychologist Peter Cathcart Wason, who came up with an “ingenious experiment to demonstrate our natural tendency to confirm rather than disprove our own ideas.”

Subjects were told that they would be given a series of three numbers that followed a certain rule known only to the experimenter. Their assignment was to figure out what the rule was, which they could do by offering the experimenter other strings of three numbers and asking him whether or not these new strings met the rule.

The string of numbers the subjects were given was quite simple:

2-4-6

Try it: What’s your first instinct about the rule governing these numbers? And what’s another string you might test with the experimenter in order to find out if your guess is right? If you’re like most people, your first instinct is that the rule is “ascending even numbers” or “numbers increasing by two.” And so you guess something like:

8-10-12

And the experimenter says, “Yes! That string of numbers also meets the rule.” And your confidence rises. To confirm your brilliance, you test one more possibility, just as due diligence, something like:

20-22-24

“Yes!” says the experimenter. Another surge of dopamine. And you proudly make your guess: “The rule is: even numbers, ascending in twos.” “No!” says the experimenter. It turns out that the rule is “any ascending numbers.” So 8-10-12 does fit the rule, it’s true, but so does 1-2-3. Or 4-23-512. The only way to win the game is to guess strings of numbers that would prove your beloved hypothesis wrong—and that is something each of us is constitutionally driven to avoid.

In the study, only one in five people were able to guess the correct rule.

And the reason we’re all so bad at games like this is the tendency toward confirmation bias: It feels much better to find evidence that confirms what you believe to be true than to find evidence that falsifies what you believe to be true. Why go out in search of disappointment?
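The 2-4-6 task is concrete enough to sketch in a few lines of code. This is a minimal illustration, not part of Wason’s study — the function names and test strings are my own. It shows why confirming tests teach us nothing here: every string that fits the narrow hypothesis (“even numbers ascending in twos”) also fits the real rule (“any ascending numbers”), so only a string that violates the hypothesis can expose it.

```python
def fits_rule(triple):
    """The experimenter's hidden rule: any strictly ascending numbers."""
    a, b, c = triple
    return a < b < c

def fits_hypothesis(triple):
    """The typical first guess: even numbers, ascending in twos."""
    a, b, c = triple
    return a % 2 == 0 and b - a == 2 and c - b == 2

# Confirming tests: chosen to fit our hypothesis, so a "yes" teaches us little.
for triple in [(8, 10, 12), (20, 22, 24)]:
    assert fits_rule(triple)  # experimenter says "yes" every time

# A falsifying test: deliberately violates our hypothesis.
probe = (1, 2, 3)
assert not fits_hypothesis(probe)  # our guess predicts this should fail...
assert fits_rule(probe)            # ...yet the experimenter says "yes"
# The mismatch is the only thing that reveals the hypothesis is too narrow.
```

The confirming strings never distinguish the two rules; the falsifying probe does so immediately — which is exactly the move most subjects in the experiment avoided.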

There is also a video explaining Wason’s work.

The Four Villains of Decision Making

You’re probably not as effective at making decisions as you could be.

This article explores Chip and Dan Heath’s book, Decisive. It’s going to help us make better decisions, both as individuals and in groups.

But before we get to that, think about a tough decision you’re grappling with right now. Having a decision working in your mind as you read will help make the advice here tangible.

Ok, let’s dig in.

“A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”

— Daniel Kahneman

We’re quick to jump to conclusions because we give too much weight to the information in front of us and we fail to search for new information, which might disprove our thoughts.

Nobel Prize–winning psychologist Daniel Kahneman called this tendency “what you see is all there is.” But that’s not the only reason we don’t make good decisions — there are many others.

We’re overconfident. We look for information that fits our beliefs and ignore information that doesn’t. We are overly influenced by authority. We choose the short term over the long term. Once we’ve made a decision, we find it hard to change our mind. In short, our brains are flawed. I could go on.

Knowing about these and other biases isn’t enough; it doesn’t help us fix the problem. We need a framework for making decisions.

In Decisive, the Heaths introduce a four-step process designed to counteract many biases.

In keeping with Kahneman’s visual metaphor, the Heaths refer to the tendency to see only what’s in front of us as a “spotlight” effect.

And that, in essence, is the core difficulty of decision making.

What’s in the spotlight will rarely be everything we need to make good decisions, but we won’t always remember to shift the light.

Most of us rarely use a process for thinking about things. If we do use one, it’s likely to be the pros-and-cons list. While better than nothing, this approach is still deeply flawed because it doesn’t really account for biases.

The Four Villains of Decision Making

  1. Narrow Framing: “… the tendency to define our choices too narrowly, to see them in binary terms. We ask, ‘Should I break up with my partner or not?’ instead of ‘What are the ways I could make this relationship better?’”
  2. Confirmation Bias: “When people have the opportunity to collect information from the world, they are more likely to select information that supports their preexisting attitudes, beliefs, and actions.” We pretend we want the truth, yet all we really want is reassurance.
  3. Short-term Emotion: “When we’ve got a difficult decision to make, our feelings churn. We replay the same arguments in our head. We agonize about our circumstances. We change our minds from day to day. If our decision was represented on a spreadsheet, none of the numbers would be changing—there’s no new information being added—but it doesn’t feel that way in our heads.”
  4. Overconfidence: “People think they know more than they do about how the future will unfold.”

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

They call this WRAP. “At its core, the WRAP model urges you to switch from ‘auto spotlight’ to manual spotlight.”

All in all, this is a great book. We tend to focus our efforts on analysis: if a decision turns out badly, we assume the analysis must have been the problem. Not only does this ignore the fact that good decisions can still produce bad outcomes, but it also places the spotlight on the analysis at the cost of the process by which the decision was made.

Read this next: What Matters More in Decisions: Analysis or Process?