Overconfidence Bias

The Four Villains of Decision Making

You’re probably not as effective at making decisions as you could be.

This article explores Chip and Dan Heath’s book, Decisive, which will help us make better decisions both as individuals and in groups.

But before we get to that, think about a tough decision you’re grappling with right now. Keeping a real decision in mind as you read will make the advice here more tangible.

Ok, let’s dig in.

“A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”

— Daniel Kahneman

We’re quick to jump to conclusions because we give too much weight to the information in front of us and fail to search for new information that might disprove our thoughts.

Nobel Prize-winning psychologist Daniel Kahneman called this tendency “what you see is all there is.” But that’s not the only reason we don’t make good decisions — there are many others.

We’re overconfident. We look for information that fits our thoughts and ignore information that doesn’t. We are overly influenced by authority. We choose the short-term over the long-term. Once we’ve made a decision we find it hard to change our mind. In short, our brains are flawed. I could go on.

Knowing about these and other biases isn’t enough; it doesn’t help us fix the problem. We need a framework for making decisions.

In Decisive, the Heaths introduce a four-step process designed to counteract many biases.

In keeping with Kahneman’s visual metaphor, the Heaths refer to the tendency to see only what’s in front of us as a “spotlight” effect.

And that, in essence, is the core difficulty of decision making.

What’s in the spotlight will rarely be everything we need to make good decisions, but we won’t always remember to shift the light.

Most of us rarely use a process for thinking about things. If we do use one, it’s likely to be the pros-and-cons list. While better than nothing, this approach is still deeply flawed because it doesn’t really account for our biases.

The Four Villains of Decision Making

  1. Narrow Framing: “… the tendency to define our choices too narrowly, to see them in binary terms. We ask, ‘Should I break up with my partner or not?’ instead of ‘What are the ways I could make this relationship better?’”
  2. Confirmation Bias: “When people have the opportunity to collect information from the world, they are more likely to select information that supports their preexisting attitudes, beliefs, and actions.” We pretend we want the truth, yet all we really want is reassurance.
  3. Short-term Emotion: “When we’ve got a difficult decision to make, our feelings churn. We replay the same arguments in our head. We agonize about our circumstances. We change our minds from day to day. If our decision was represented on a spreadsheet, none of the numbers would be changing—there’s no new information being added—but it doesn’t feel that way in our heads.”
  4. Overconfidence: “People think they know more than they do about how the future will unfold.”

The Heaths came up with a process to help us overcome these villains and make better choices. “We can’t deactivate our biases, but … we can counteract them with the right discipline.” The nature of each of the four decision-making villains suggests a strategy for how to defeat it.

1. You encounter a choice. But narrow framing makes you miss options. So … Widen Your Options. How can you expand your set of choices? …

2. You analyze your options. But the confirmation bias leads you to gather self-serving information. So … Reality-Test Your Assumptions. How can you get outside your head and collect information you can trust? …

3. You make a choice. But short-term emotion will often tempt you to make the wrong one. So … Attain Distance Before Deciding. How can you overcome short-term emotion and conflicted feelings to make better choices? …

4. Then you live with it. But you’ll often be overconfident about how the future will unfold. So … Prepare to Be Wrong. How can we plan for an uncertain future so that we give our decisions the best chance to succeed? …

They call this WRAP. “At its core, the WRAP model urges you to switch from ‘auto spotlight’ to manual spotlight.”


All in all, this was a great book. One closing thought: we tend to focus our efforts on analysis, assuming that if a decision turns out badly, the analysis must have been the problem. Not only does this ignore the fact that good decisions can produce bad outcomes, it also keeps your spotlight on the analysis at the expense of the process by which the decision was made.

Read this next: What Matters More in Decisions: Analysis or Process?

Why Bad Things Happen to Good Decisions

Good decisions don’t always lead to good outcomes, just as bad decisions don’t always lead to bad ones. Sometimes bad things happen to good decisions, and sometimes good things happen to bad decisions. Learning to distinguish between when you were brilliant and when you were merely lucky is the key to rapid improvement.

When other people make decisions with bad outcomes, we tend to focus on the people behind the decision. We can’t seem to shake the belief that good people make good decisions and bad people make bad decisions. It’s easy to think that we would have made a better decision.

When our own decisions have bad outcomes, we know the outcome is not all there is to see. Our thoughts and intentions come into play. We can’t see the thoughts and intentions of others — we only see their actions through a biased lens. We can, however, see our own thoughts and intentions, so we tell ourselves our bad outcomes happened because we were unlucky.

The Decision Outcome Matrix

Consider this simple two-by-two decision outcome matrix.

[Figure: a two-by-two matrix crossing good and bad decision processes with good and bad outcomes]

We want to deserve success. Everyone wants to be in the upper left box —  a good process that results in a good outcome. The problem is the world doesn’t always comply with our wishes. Following a good process can lead to a bad outcome because of uncertainty.

A good process with a bad outcome requires that we remember nothing is for sure.

Bad outcomes from a bad process require a choice. If we are delusional and let our ego dominate, we mistakenly write this off as bad luck. We are aware that we had a negative outcome but unaware that it resulted from a bad process, so we learn nothing and are doomed to repeat our mistakes. More self-reflective people see bad outcomes resulting from a bad process as an opportunity to learn as much as they can to avoid repeating the failure.

Perhaps the worst quadrant to be in is the lower left, where a bad process leads to a good outcome. Ignoring the fact that this isn’t repeatable, we convince ourselves that we deserved the success.

If you can’t recognize when you’ve had ‘dumb luck,’ you’ll never be in a position to correct the way you’re making decisions. Eventually, your luck runs out.

Of course it’s easier to place others into the matrix than ourselves. When it comes to others, we see more of reality because we’re a disconnected observer. When it comes to ourselves, however, we default to ego protection.

One of the ways to calibrate your decisions is to use a decision journal. The quality of a decision can be judged before the outcome is known: it rests on a mental representation of the facts known at the time and the judgment applied to them. Good decisions are valuable, but they are more valuable as part of a good decision process, because a good process gives you feedback about where you can improve. That feedback, in turn, lets you steadily get better at making decisions.
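As a rough illustration only (the fields below are my own assumptions, not a format the article prescribes), a journal entry might record the situation, the choice and reasoning, the expected outcomes with rough probabilities, and a date to review the result:

```python
# A minimal, hypothetical sketch of a decision-journal entry. The fields are
# illustrative assumptions; the point is to record, before the outcome is
# known, what you knew, what you chose, and what you expected to happen.
from dataclasses import dataclass
from datetime import date

@dataclass
class DecisionJournalEntry:
    decided_on: date            # when the decision was made
    situation: str              # the facts known at the time
    decision: str               # what you chose, and the reasoning behind it
    expected_outcomes: dict     # outcome -> rough probability
    review_on: date             # when to compare expectations with reality
    actual_outcome: str = ""    # filled in later, at review time

entry = DecisionJournalEntry(
    decided_on=date(2013, 6, 1),
    situation="Two suppliers bid; the cheaper one missed deadlines last year.",
    decision="Pay more for the reliable supplier to reduce delivery risk.",
    expected_outcomes={"delivered on time": 0.7, "late anyway": 0.3},
    review_on=date(2013, 12, 1),
)
```

Reviewing entries like this later separates the quality of the call from the luck of the outcome, which is exactly the feedback loop the matrix above describes.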

Daniel Kahneman — What I Know

Nobel prize-winning psychologist Daniel Kahneman talks with the Guardian about his pessimistic mother, the delusion of investment bankers and the need for irony.

Human beings cannot comprehend very large or very small numbers. It would be useful for us to acknowledge that fact.

My main work has concerned judgment and decision-making. But I never felt I was studying the stupidity of mankind in the third person. I always felt I was studying my own mistakes.

Happiness is complicated. There are two components. One is strongly genetic; the second is a question of how you feel at any moment. I am pretty content, but I had a very pessimistic mother, and I’ve always been known as a pessimist.

It was always assumed I would be a professor. I grew up thinking it.

There is a powerful idea that we should want to be richer. I went to a financial advisor in the States and said: “I don’t really want to get richer, but I would like to continue to live like I do.” She said: “I can’t work with you.”

Collaboration is not only more creative, it is more fun. Amos Tversky, my research partner, and I were better together than on our own. We sort of knew that. Mostly it was extremely pleasant not trying to work everything out yourself.

A sense of irony is essential. When we wrote our first paper, “The Law of Small Numbers”, we were laughing all the time we wrote it. A colleague we showed it to said: “This is going to change things.” I didn’t take him seriously.

Many people now say they knew a financial crisis was coming, but they didn’t really. After a crisis we tell ourselves we understand why it happened and maintain the illusion that the world is understandable. In fact, we should accept the world is incomprehensible much of the time.

Motives are rarely straightforward. When the war started my father was chief of research for a company that was part of L’Oréal in Paris. The owner of L’Oréal was also a main funder of the fascist party in France, and antisemitic. But he protected my [Jewish] father during the war when he was taken by the Nazis.

People who wouldn’t even come to your funeral seem to take simple pleasure in the fact that you have won the Nobel prize [for economic sciences], and it makes them feel good about themselves to feel that way. For a while you are spreading joy.

Investment bankers believe in what they do. They don’t want to hear that their decisions are no better than chance. The rest of us pay for their delusions.

Despite 45 years of work in the field, I am still inclined to make over-confident predictions.

Economists have a mystique among social scientists because they know mathematics. They are quite good at explaining what has happened after it has happened, but rarely before. I don’t think of myself as an economist at all.

I enjoy being active but I look forward to the day when I can retire to the internet.

Still curious? Read Thinking, Fast and Slow.

source

Economists Are Overconfident. So Are You.

The point of regression analysis is to make predictions based on past relationships. “My concern,” one of the authors of the paper said, “is that when reading economics journal articles you get the impression that the world is much more predictable than it is.”

What Soyer and Hogarth did was get 257 economists to read about a regression analysis that related independent variable X to dependent variable Y, then answer questions about the probabilities of various outcomes (example: if X is 1, what’s the probability of Y being greater than 0.936?).

When the results were presented in the way empirical results usually are presented in economics journals — as the average outcomes of the regression followed by a few error terms — the economists did a really bad job of answering the questions. They paid too much attention to the averages, and too little to the uncertainties inherent in them, thereby displaying too much confidence.
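To make the error-term point concrete, here is a minimal sketch of the kind of question Soyer and Hogarth asked. The intercept, slope, and residual standard deviation below are made-up placeholder values, not the numbers from the actual study; the point is that the answer depends as much on the scatter around the regression line as on the point estimate:

```python
# Hypothetical version of the Soyer-Hogarth question: given a fitted model
# Y = a + b*X + e, with e ~ Normal(0, sigma), what is P(Y > 0.936 | X = 1)?
# The values of a, b, and sigma are invented for illustration.
from scipy.stats import norm

a, b = 0.3, 1.0      # assumed intercept and slope from the regression table
sigma = 1.25         # assumed standard deviation of the residuals (the "error term")

point_estimate = a + b * 1.0                      # expected Y when X = 1 -> 1.3
p = 1 - norm.cdf(0.936, loc=point_estimate, scale=sigma)
print(f"P(Y > 0.936 | X = 1) ~ {p:.2f}")          # ~ 0.61, far from a sure thing

# Reading only the point estimate (1.3 > 0.936) makes the outcome look all but
# certain; accounting for sigma shows it is closer to a coin flip.
```

The gap between those two readings is the overconfidence the study measured.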

When the economists were shown the numerical results plus scatter graphs of the same data, they did slightly better. The economists who were shown only the graphs and none of the numerical results, meanwhile, actually got most of the answers right, or close to right.

The bigger point here, which Soyer and Hogarth have elaborated in other research, is that we tend to understand probabilistic information much better when it’s presented in visual form than if we’re just shown the numbers. (This was also a key argument of Sam Savage’s edifying and entertaining 2009 book The Flaw of Averages.) What’s interesting is that statistically literate experts are just as likely to glom onto the point estimate and discount the uncertainty as, say, innumerate journalists reporting the results of political polls.

Still curious? Read the paper and The Flaw of Averages.
source.

The Decision-Making Flaw in Powerful People

The paper below finds a link between having a sense of power and ignoring the advice of others.

The authors argue that power increases confidence, which can lead to an excessive belief in one’s own judgment.

In a sense, powerful people think they are right because of their place in the organization, not because of their knowledge.

This, of course, leads to flawed decisions.

From Strategy+Business:

Previous research has shown that the quality of decision making declines when people hew too much to their own beliefs and discount too readily the advice of others; outside information helps “average out” the distortions that can result when people give a great deal of weight to their own opinions and first impressions. This paper is among the first to examine whether power — defined as an individual’s “capacity to influence others, stemming in part from his or her control over resources, rewards, or punishments” — reduces or increases a person’s willingness to heed advice.

… In addition to confirming the previous experiments’ finding that more powerful people were less likely to take advice and were more likely to have high confidence in their answers, this final experiment showed that high-power participants were less accurate in their answers than low-power participants. By calculating the mean deviation between respondents’ initial estimates and the true answers, the researchers showed that low-power participants came significantly closer in their final estimates to the real tuition numbers because they “averaged” their initial guesses with the input from the advisors.

The researchers propose that their findings have troubling implications for organizations — and that power could negatively affect not just advice taking, but also an individual’s approach to seeking help or accepting performance feedback. But because power and confidence are so interrelated, there are ways to mitigate the problem. By “directly addressing the inflated confidence levels of powerful individuals,” the researchers write, “organizations may be able to help people with power take (and/or seek) advice when it is valuable to do so.”

For one thing, organizations could formally include advice gathering at the earliest stages of the decision-making process, before powerful individuals have a chance to form their own opinions. Encouraging leaders to refrain from commenting on decisions publicly could also keep them from feeling wedded to a particular point of view.

Bottom Line:
Powerful people are less likely to take advice from others, in large part because they have high confidence in their own judgment and don’t feel the need to incorporate outside views. By not factoring in others’ advice, however, people in power risk making flawed decisions.

Abstract:

Incorporating input from others can enhance decision quality, yet often people do not effectively utilize advice. We propose that greater power increases the propensity to discount advice, and that a key mechanism explaining this effect is elevated confidence in one’s judgment. We investigate the relationships across four studies: a field survey where working professionals rated their own power and confidence and were rated by coworkers on their level of advice taking; an advice taking task where power and confidence were self-reported; and two advice taking experiments where power was manipulated. Results consistently showed a negative relationship between power and advice taking, and evidence of mediation through confidence. The fourth study also revealed that higher power participants were less accurate in their final judgments. Power can thus exacerbate the tendency for people to overweight their own initial judgment, such that the most powerful decision makers can also be the least accurate.

Source: Kelly E. See, Elizabeth W. Morrison, Naomi B. Rothman, Jack B. Soll, The detrimental effects of power on confidence, advice taking, and accuracy, Organizational Behavior and Human Decision Processes

Future Babble: Why expert predictions fail and why we believe them anyway

Future Babble has come out to mixed reviews. I think the book would interest anyone seeking wisdom.

Here are some of my notes:

First, a little background: predictions fail because the world is too complicated to be predicted with accuracy, yet we’re wired to avoid uncertainty, so we keep turning to experts anyway. We shouldn’t believe them blindly. Experts fall into two camps: foxes and hedgehogs. The fox knows many things, whereas the hedgehog knows one big thing. Foxes beat hedgehogs when it comes to making predictions.

  • What we should ask is: in a non-linear world, why would we think oil prices can be predicted? Practically since the dawn of the oil industry in the nineteenth century, experts have been forecasting the price of oil, and they’ve been wrong ever since. Yet this dismal record hasn’t caused us to give up on the enterprise of forecasting oil prices.
  • One of psychology’s fundamental insights, wrote psychologist Daniel Gilbert, is that judgements are generally the products of non-conscious systems that operate quickly, on the basis of scant evidence, and in a routine manner, and then pass their hurried approximations to consciousness, which slowly and deliberately adjusts them. … (One consequence of this is that) appearance equals reality. In the ancient environment in which our brains evolved, that was a good rule, which is why it became hard-wired into the brain and remains there to this day. (As an example of this,) psychologists have shown that people often stereotype “baby-faced” adults as innocent, helpless, and needy.
  • We have a hard time with randomness. If we try, we can understand it intellectually, but as countless experiments have shown, we don’t get it intuitively. This is why someone who plunks one coin after another into a slot machine without winning will have a strong and growing sense—the gambler’s fallacy—that a jackpot is “due,” even though every result is random and therefore unconnected to what came before. … Similarly, people believe that a sequence of random coin tosses that goes “THTHHT” is far more likely than the sequence “THTHTH,” even though they are equally likely (every specific six-toss sequence has probability (1/2)^6 = 1/64).
  • People are particularly disinclined to see randomness as the explanation for an outcome when their own actions are involved. Gamblers rolling dice tend to concentrate and throw harder for higher numbers, softer for lower. Psychologists call this the “illusion of control.” … they also found the illusion is stronger when it involves prediction. In a sense, the “illusion of control” should be renamed the “illusion of prediction.”
  • Overconfidence is a universal human trait closely related to an equally widespread phenomenon known as “optimism bias.” Ask smokers about the risk of getting lung cancer from smoking and they’ll say it’s high. But their risk? Not so high. … The evolutionary advantage of this bias is obvious: It encourages people to take action and makes them more resilient in the face of setbacks.
  • … How could so many experts have been so wrong? … A crucial component of the answer lies in psychology. For all the statistics and reasoning involved, the experts derived their judgements, to one degree or another, from what they felt to be true. And in doing so they were fooled by a common bias. … This tendency to take current trends and project them into the future is the starting point of most attempts to predict. Very often, it’s also the end point. That’s not necessarily a bad thing. After all, tomorrow typically is like today. Current trends do tend to continue. But not always. Change happens. And the further we look into the future, the more opportunity there is for current trends to be modified, bent, or reversed. Predicting the future by projecting the present is like driving with no hands. It works while you are on a long stretch of straight road, but even a gentle curve is trouble, and a sharp turn always ends in a flaming wreck.
  • When people attempt to judge how common something is—or how likely it is to happen in the future—they attempt to think of an example of that thing. If an example is recalled easily, it must be common. If it’s harder to recall, it must be less common. … Again, this is not a conscious calculation. The “availability heuristic” is a tool of the unconscious mind.
  • “Deviating too far from consensus leaves one feeling potentially ostracized from the group, with the risk that one may be terminated.” (Robert Shiller) … It’s tempting to think that only ordinary people are vulnerable to conformity, that esteemed experts could not be so easily swayed. Tempting, but wrong. As Shiller demonstrated, “groupthink” is very much a disease that can strike experts. In fact, psychologist Irving Janis coined the term “groupthink” to describe expert behavior. In his 1972 classic, Victims of Groupthink, Janis investigated four high-level disasters: the defence of Pearl Harbor, the Bay of Pigs invasion, and the escalation of the wars in Korea and Vietnam. He demonstrated that conformity among highly educated, skilled, and successful people working in their fields of expertise was a root cause in each case.
  • (On corporate use of scenario planning) … Scenarios are not predictions, emphasizes Peter Schwartz, the guru of scenario planning. “They are tools for testing choices.” The idea is to have a clever person dream up a number of very different futures, usually three to four. … Managers then consider the implications of each, forcing them out of the rut of the status quo and getting them to think about what they would do if confronted with real change. The ultimate goal is to make decisions that would stand up well in a wide variety of contexts. No one denies there may be some value in such exercises. But how much value? The consultants who offer scenario planning services are understandably bullish, but ask them for evidence and they typically point to examples of scenarios that accurately foreshadowed the future. That is silly, frankly. For one thing, it contradicts their claim that scenarios are not predictions; for another, all the misses would have to be considered, and the misses vastly outnumber the hits. … Consultants also cite the enormous popularity of scenario planning as proof of its enormous value. … Lack of evidence aside, there are more disturbing reasons to be wary of scenarios. Remember that what drives the availability heuristic is not how many examples the mind can recall but how easily they are recalled. … And what are scenarios? Vivid, colourful, dramatic stories. Nothing could be easier to remember or recall. And so being exposed to a dramatic scenario about (whatever) … will make the depicted events feel much more likely to happen.
  • (On not having control) At its core, torture is a process of psychological destruction, and that process almost always begins with the torturer explicitly telling the victim he is powerless. “I decide when you can eat and sleep. I decide when you suffer, how you suffer, if it will end. I decide if you live or die.” … Knowing what will happen in the future is a form of control, even if we cannot change what will happen. … Uncertainty is potent … people who experienced the mild-but-unpredictable shocks experienced much more fear than those who got the strong-but-predictable shocks.
  • Our profound aversion to uncertainty helps explain what would otherwise be a riddle: Why do people pay so much attention to dark and scary predictions? Why do gloomy forecasts so often outnumber optimistic predictions, take up more media space, and sell more books? Part of this predilection for gloom is simply an outgrowth of what is sometimes called negativity bias: our attention is drawn more swiftly by bad news or images, and we are more likely to remember them than cheery information. … People whose brains gave priority to bad news were much less likely to be eaten by lions or die some other unpleasant death. … (Negative) predictions are supported by our intuitive pessimism, so they feel right to us. And that conclusion is bolstered by our attraction to certainty. As strange as it sounds, believing the expert who predicts a dark future is less tormenting than merely suspecting it. Certainty is always preferable to uncertainty, even when what’s certain is disaster.
  • Researchers have also shown that financial advisors who express considerable confidence in their stock forecasts are more trusted than those who are less confident, even when their objective records are the same. … This “confidence heuristic,” like the availability heuristic, isn’t necessarily a conscious decision path. We may not actually say to ourselves, “She’s so sure of herself, she must be right” …
  • (on our love for stories) Confirmation bias also plays a critical role for the very simple reason that none of us is a blank slate. Every human brain is a vast warehouse of beliefs and assumptions about the world and how it works. Psychologists call these “schemas.” We love stories that fit our schemas; they’re the cognitive equivalent of beautiful music. But a story that doesn’t fit – a story that contradicts basic beliefs – is dissonant.
  • … What makes this mass delusion possible is the different emphasis we put on predictions that hit and those that miss. We ignore misses, even when they lie scattered by the dozen at our feet; we celebrate hits, even when we have to hunt for them and pretend there was more to them than luck.
  • By giving us the sense that we should have predicted what is now the present, or even that we actually did predict it when we did not, it strongly suggests that we can predict the future. This is an illusion, and yet it seems only logical—which makes it a particularly persuasive illusion.

If you like the notes, you should buy Future Babble. Like the book summaries? Check out my notes from Adapt: Why Success Always Starts With Failure, The Ambiguities of Experience, and On Leadership.
