Tag: Daniel Kahneman

Daniel Kahneman: Some Thoughts on Thinking

While Daniel Kahneman’s book Thinking, Fast and Slow gets all the attention, he’s also written a few articles on thinking better that might catch your interest.

Optimistic Bias

In terms of its consequences for decisions, the optimistic bias may well be the most significant cognitive bias. Because optimistic bias is both a blessing and a risk, you should be both happy and wary if you are temperamentally optimistic.

Competition Neglect

Colin Camerer, who coined the concept of competition neglect, illustrated it with a quote from a chairman of Disney Studios. Asked why so many big-budget movies are released on the same holidays, he said, “Hubris. Hubris. If you only think about your own business, you think, ‘I’ve got a good story department, I’ve got a good marketing department’ … and you don’t think that everybody else is thinking the same way.” The competition isn’t part of the decision. In other words, a difficult question has been replaced by an easier one.

This is a kind of dodge we all make, without even noticing. We use fast, intuitive thinking — System 1 thinking — whenever possible, and switch over to more deliberate and effortful System 2 thinking only when we truly recognize that the problem at hand isn’t an easy one.

The question that studio executives needed to answer is this: Considering what others will do, how many people will see our film? The question they did consider is simpler and refers to knowledge that is most easily available to them: Do we have a good film and a good organization to market it?

Appreciation of Uncertainty

As Nassim Taleb, the author of “The Black Swan,” has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued; people and companies reward the providers of misleading information more than they reward truth tellers. An unbiased appreciation of uncertainty is a cornerstone of rationality — but it isn’t what organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred approach.

Theory-Induced Blindness

Bernoulli invented psychophysics to explain this aversion to risk. His idea was straightforward: People’s choices are based not on dollar values but on the psychological values of outcomes, their utilities. The psychological value of a gamble is therefore not the weighted average of its possible dollar outcomes; it is the average of the utilities of these outcomes, each weighted by its probability.
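In modern notation (a restatement of mine, not Bernoulli’s or Kahneman’s), the contrast is between a gamble’s expected value and Bernoulli’s expected utility:

```latex
\text{Expected value: } \mathbb{E}[X] = \sum_i p_i\, x_i
\qquad\qquad
\text{Bernoulli's value: } V = \sum_i p_i\, u(x_i)
```

Because Bernoulli took u to be concave (he used the logarithm), the theory predicts risk aversion. With u(x) = log₁₀(x), a 50–50 gamble between $100 and $1,000 has psychological value ½·log(100) + ½·log(1000) = 2.5 = log(316), so it feels equivalent to about $316 for sure, well below the gamble’s $550 expected value.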

…That Bernoulli’s theory prevailed for so long is even more remarkable when you see that, in fact, it is seriously flawed. The errors are found not in what it asserts explicitly, but in what it tacitly assumes.

The mystery is how a conception that is vulnerable to such obvious counterexamples survived for so long. I can explain it only by a weakness of the scholarly mind that I have often observed in myself. I call it theory-induced blindness: Once you have accepted a theory, it is extraordinarily difficult to notice its flaws. As the psychologist Daniel Gilbert has observed, disbelieving is hard work.

Reference Points and Loss Aversion

The economists Devin Pope and Maurice Schweitzer, at the University of Pennsylvania, suggest that golf provides the perfect example of a reference point: par. For a professional golfer, a birdie (one stroke under par) is a gain, and a bogey (one stroke over par) is a loss. Failing to make par is a loss, but missing a birdie putt is a forgone gain, not a loss. Pope and Schweitzer analyzed more than 2.5 million putts to test their prediction that players would try harder when putting for par than when putting for a birdie.

They were right. Whether the putt was easy or hard, at every distance from the hole, players were more successful when putting for par than for a birdie.

…If you are set to look for it, the asymmetric intensity of the motives to avoid losses and to achieve gains shows up almost everywhere. It is an ever-present feature of negotiations, especially of renegotiations of an existing contract, the typical situation in labor negotiations, and in international discussions of trade or arms limitations. Loss aversion creates an asymmetry that makes agreements difficult to reach.

Negotiations over a shrinking pie are especially difficult because they require an allocation of losses. People tend to be much more easygoing when they bargain over an expanding pie.

In the world of territorial animals, the principle of loss aversion explains the success of defenders. A biologist observed that “when a territory holder is challenged by a rival, the owner almost always wins the contest — usually within a matter of seconds.”

The Planning Fallacy

David Brooks, author of The Social Animal, wrote an excellent column on the planning fallacy:

In his forthcoming book (now released), Thinking, Fast and Slow, Kahneman calls this the planning fallacy. Most people overrate their own abilities and exaggerate their capacity to shape the future. That’s fine. Optimistic people rise in this world. The problem comes when these optimists don’t look at themselves objectively from the outside.

The planning fallacy is failing to think realistically about where you fit in the distribution of people like you. As Kahneman puts it, “People who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.”

Over the past three years, the United States has been committing the planning fallacy on stilts. The world economy has been slammed by a financial crisis. Countries that are afflicted with these crises typically experience several years of high unemployment. They go deep into debt to end the stagnation, but the turnaround takes a while.

This historical pattern has been universally acknowledged and universally ignored. Instead, leaders in both parties have clung to the analogy that the economy is like a sick patient who can be healed by the right treatment.

3 Things Everyone Should Know About the Availability Heuristic

There are 3 things you should know about the availability heuristic:

  1. We often misjudge the frequency and magnitude of events that have happened recently.
  2. This happens, in part, because of the limitations on memory.
  3. We remember things better when they come in a vivid narrative.

* * *

There are two biases emanating from the availability heuristic (a.k.a. the availability bias): ease of recall and retrievability.

Because of the availability bias, our perceptions of risk may be in error and we might worry about the wrong risks. This can have disastrous impacts.

Ease of recall suggests that if something is more easily recalled from memory, we assume it occurs with a higher probability.

The availability heuristic distorts our understanding of real risks.

“The attention which we lend to an experience is proportional to its vivid or interesting character; and it is a notorious fact that what interests us most vividly at the time is, other things equal, what we remember best.”

— William James

When we make decisions we tend to be swayed by what we remember. What we remember is influenced by many things, including beliefs, expectations, emotions, and feelings, as well as things like frequency of exposure. Media coverage (e.g., Internet, radio, television) makes a big difference. When rare events occur they become very visible to us because they receive heavy coverage in the media. This means we are more likely to recall them, especially in the immediate aftermath of the event. However, recalling an event and estimating its real probability are two different things. If you’re in a car accident, for example, you are likely to rate the odds of getting into another car accident much higher than base rates would indicate.

Retrievability suggests that we are biased in assessments of frequency in part because of our memory structure limitations and our search mechanisms. It’s the way we remember that matters.

The retrievability and ease of recall biases indicate that the availability bias can substantially and unconsciously influence our judgment. We too easily assume that our recollections are representative and true and discount events that are outside of our immediate memory.

* * *

In Thinking, Fast and Slow, Kahneman writes:

People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media.

* * *

Nobel Prize-winning social scientist and artificial intelligence pioneer Herbert Simon wrote in Models of My Life:

I soon learned that one wins awards mainly for winning awards: an example of what Bob Merton calls the Matthew Effect. It is akin also to the phenomenon known in politics as “availability,” or name recognition. Once one becomes sufficiently well known, one’s name surfaces automatically as soon as an award committee assembles.

* * *

According to Harvard professor Max Bazerman:

Many life decisions are affected by the vividness of information. Although most people recognize that AIDS is a devastating disease, many individuals ignore clear data about how to avoid contracting AIDS. In the fall of 1991, however, sexual behavior in Dallas was dramatically affected by one vivid piece of data that may or may not have been true. In a chilling interview, a Dallas woman calling herself C.J. claimed she had AIDS and was trying to spread the disease out of revenge against the man who had infected her. After this vivid interview made the local news, attendance at Dallas AIDS seminars increased dramatically. Although C.J.’s possible actions were a legitimate cause for concern, it is clear that most of the health risks related to AIDS are not a result of one woman’s actions. There are many more important reasons to be concerned about AIDS. However, C.J.’s vivid report had a more substantial effect on many people’s behavior than the mountains of data available. The availability heuristic describes the inferences we make about event commonness based on the ease with which we can remember instances of that event.

While this example of vividness may seem fairly benign, it is not difficult to see how the availability bias could lead managers to make potentially destructive workplace decisions. The following came from the experience of one of our MBA students: As a purchasing agent, he had to select one of several possible suppliers. He chose the firm whose name was most familiar to him. He later found out that the salience of the name resulted from recent adverse publicity concerning the firm’s extortion of funds from client companies!

Managers conducting performance appraisals often fall victim to the availability heuristic. Working from memory, vivid instances of an employee’s behavior (either positive or negative) will be most easily recalled from memory, will appear more numerous than commonplace incidents, and will therefore be weighted more heavily in the performance appraisals. The recency of events is also a factor: Managers give more weight to performance during the three months prior to the evaluation than to the previous nine months of the evaluation period because it is more available in memory.

* * *

The availability bias has numerous implications for investors.

A study by Karlsson, Loewenstein, and Ariely (2008) showed that people are more likely to purchase insurance to protect themselves after a natural disaster they have just experienced than to purchase insurance against the same type of disaster before it happens.

Bazerman adds:

This pattern may be sensible for some types of risks. After all, the experience of surviving a hurricane may offer solid evidence that your property is more vulnerable to hurricanes than you had thought or that climate change is increasing your vulnerability to hurricanes.

Robyn M. Dawes, in his book Everyday Irrationality, says:

What is a little less obvious is that people can make judgments of the ease with which instances can come to mind without actually recalling specific instances. We know, for example, whether we can recall the presidents of the United States–or rather how well we can recall their names; moreover, we know at which periods of history we are better at recalling them than at which other periods. We can make judgments without actually listing in our minds the names of the specific presidents.

This recall of ease of creating instances is not limited to actual experience, but extends to hypothetical experience as well. For example, subjects are asked to consider how many subcommittees of two people can be formed from a committee of eight, and either the same or other subjects are asked to estimate how many subcommittees of six can be formed from a committee of eight people. It is much easier to think about pairs of people than to think about sets of six people, with the result that the estimate of pairs tends to be much higher than the estimate of subsets of six. In point of logic, however, the number of subsets of two is identical to that of six; the formation of a particular subset of two people automatically involves the formation of a particular subset consisting of the remaining six. Because these unique subsets are paired together, there are the same number of each.
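The arithmetic, which Dawes leaves implicit, confirms this (my addition):

```latex
\binom{8}{2} \;=\; \frac{8!}{2!\,6!} \;=\; 28 \;=\; \frac{8!}{6!\,2!} \;=\; \binom{8}{6}
```

Choosing which two people sit on the subcommittee is the same act as choosing which six are left off, so the two counts must be equal.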

This availability to the imagination also creates a particularly striking irrationality, which can be termed the conjunction fallacy or compound probability fallacy. Often combinations of events or entities are easier to think about than their components, because the combination might make sense whereas the individual component does not. A classic example is that of a hypothetical woman named Linda who is said to have been a social activist majoring in philosophy as a college undergraduate. What is the probability that at age thirty she is a bank teller? Subjects judge the probability as very unlikely. But when asked whether she might be a bank teller active in a feminist movement, subjects judge this combination to be more likely than for her to be a bank teller.
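Stated in probability terms (my gloss, not Dawes’s wording), the fallacy violates a basic law: a conjunction can never be more probable than either of its parts. For any events A and B,

```latex
P(A \cap B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A)
```

since P(B | A) is at most 1. “Linda is a bank teller and active in the feminist movement” can therefore never be more probable than “Linda is a bank teller,” no matter how well the combination fits her description.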

* * *

Retrievability (based on memory structures)

We are better at retrieving words from memory by their initial letter than by a letter in an interior position, such as the third (Tversky & Kahneman, 1973).

In 1984 Tversky and Kahneman demonstrated the retrievability bias again when they asked participants in their study to estimate the frequency of seven-letter words that had the letter “n” in the sixth position. Their participants estimated such words to be less common than seven-letter words ending in the more memorable “ing”. This response is incorrect. All seven-letter words ending with “ing” also have an “n” in the sixth position. However, it’s easy to recall seven-letter words ending with “ing”. As we demonstrated with Dawes above, this is another example of the conjunction fallacy.
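The subset relationship is easy to check mechanically. Here is a minimal Python sketch (my own illustration; the ten-word list is made up, not data from the study):

```python
# Tversky and Kahneman's word question: seven-letter words ending in
# "ing" are a subset of seven-letter words with "n" in the sixth
# position, so the second count can never be smaller than the first.
words = [
    "running", "jumping", "dancing", "singing", "walking",
    "nesting", "content", "moments", "opinion", "dawning",
]

ends_in_ing = {w for w in words if len(w) == 7 and w.endswith("ing")}
n_in_sixth = {w for w in words if len(w) == 7 and w[5] == "n"}

assert ends_in_ing <= n_in_sixth  # the subset relation always holds
print(len(ends_in_ing), len(n_in_sixth))  # 7 8 for this sample
```

Intuition runs the comparison the other way only because “-ing” words are easier to generate from memory.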

Retail locations are chosen based on search as well, which explains why gas stations and retail stores often cluster together. Consumers learn the location of a product and organize their minds accordingly. While you may not remember the names of all three gas stations on the same corner, your mind tells you that corner is where to go to find gas. All else equal, each station then has a one-in-three shot at your business, which is far better than the odds for stations whose locations don’t resonate with your mind’s search. To maximize traffic, stores must find locations that consumers associate with a product.

* * *

Exposure Effect

People tend to develop a preference for things because they are familiar with them. This is called the exposure effect. According to Titchener (1910), the exposure effect leads people to experience a “glow or warmth, a sense of ownership, a feeling of intimacy.”

The exposure effect applies only to things that are perceived as neutral to positive. Repeated exposure to something perceived as a negative stimulus may in fact amplify negative feelings. For example, when someone is playing loud music, you tend to have a lot of patience at first. As time goes on, however, you get increasingly aggravated as your exposure to the stimulus continues.

The more we are exposed to something the easier it is to recall in our minds. The exposure effect influences us in many ways. Think about brands, stocks, songs, companies, and even the old saying “the devil you know.”

* * *

The Von Restorff Effect

“One of these things doesn’t belong,” can accurately summarize the Von Restorff Effect (also known as the isolation effect and novelty effect). In our minds, things that stand out are more likely to be remembered and recalled because we give increased attention to distinctive items in a set.

For example, if I asked you to remember the following sequence of characters “RTASDT9RTGS” I suspect the most common character remembered would be the “9” because it stands out and thus your mind gives it more attention.

The Von Restorff Effect leads us to vivid evidence.

* * *

Vivid Evidence

According to William James in The Principles of Psychology:

An impression may be so exciting emotionally as almost to leave a scar upon the cerebral tissues; and thus originates a pathological delusion. For example: “A woman attacked by robbers takes all the men whom she sees, even her own son, for brigands bent on killing her. Another woman sees her child run over by a horse; no amount of reasoning, not even the sight of the living child, will persuade her that he is not killed.”

M. Taine wrote:

If we compare different sensations, images, or ideas, we find that their aptitudes for revival are not equal. A large number of them are obliterated, and never reappear throughout life; for instance, I drove through Paris a day or two ago, and though I saw plainly some sixty or eighty new faces, I cannot now recall any one of them; some extraordinary circumstance, a fit of delirium, or the excitement of hashish would be necessary to give me a chance at revival. On the other hand, there are sensations with a force of revival which nothing destroys or decreases. Though, as a rule, time weakens and impairs our strongest sensations, these reappear entire and intense, without having lost a particle of their detail, or any degree of their force. M. Brierre de Boismont, having suffered when a child from a disease of the scalp, asserts that ‘after fifty-five years have elapsed he can still feel his hair pulled out under the treatment of the ‘skull-cap.’–For my own part, after thirty years, I remember feature for feature the appearance of the theater to which I was taken for the first time. From the third row of boxes, the body of the theater appeared to me an immense well, red and flaming, swarming with heads; below, on the right, on a narrow floor, two men and a woman entered, went out, and re-entered, made gestures, and seemed to me like lively dwarfs: to my great surprise one of these dwarfs fell on his knees, kissed the lady’s hand, then hid behind a screen: the other, who was coming in, seemed angry, and raised his arm. I was then seven, I could understand nothing of what was going on; but the well of crimson velvet was so crowded, and bright, that after a quarter of an hour I was, as it were, intoxicated, and fell asleep.

Every one of us may find similar recollections in his memory, and may distinguish them in a common character. The primitive impression has been accompanied by an extraordinary degree of attention, either as being horrible or delightful, or as being new, surprising, and out of proportion to the ordinary run of life; this it is we express by saying that we have been strongly impressed; that we were absorbed, that we could not think of anything else; that our other sensations were effaced; that we were pursued all the next day by the resulting image; that it beset us, that we could not drive it away; that all distractions were feeble beside it. It is by force of this disproportion that impressions of childhood are so persistent; the mind being quite fresh, ordinary objects and events are surprising…

Whatever may be the kind of attention, voluntary or involuntary, it always acts alike; the image of an object or event is capable of revival, and of complete revival, in proportion to the degree of attention with which we have considered the object or event. We put this rule into practice at every moment in ordinary life.

An example from Freeman Dyson:

A striking example of availability bias is the fact that sharks save the lives of swimmers. Careful analysis of deaths in the ocean near San Diego shows that on average, the death of each swimmer killed by a shark saves the lives of ten others. Every time a swimmer is killed, the number of deaths by drowning goes down for a few years and then returns to the normal level. The effect occurs because reports of death by shark attack are remembered more vividly than reports of drownings.


A Simple Checklist to Improve Decisions

We owe thanks to the publishing industry. Its shotgun approach of taking a single concept and filling an entire category of books with it is the reason more people are talking about biases.

Unfortunately, talk alone will not eliminate biases, but it is possible to take steps to counteract them. Reducing biases can make a huge difference in the quality of any decision, and it is easier than you think.

In a recent article for Harvard Business Review, Daniel Kahneman and his co-authors describe a simple way to detect bias and minimize its effects in the most common type of decision people make: determining whether to accept, reject, or pass on a recommendation.

The Munger two-step process for making decisions is a more complete framework, but Kahneman’s approach is a good way to help reduce biases in our decision-making.

If you’re short on time, here is a simple checklist that will get you started on the path toward improving your decisions:

Preliminary Questions: Ask yourself

1. Check for Self-interested Biases

  • Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?
  • Review the proposal with extra care, especially for overoptimism.

2. Check for the Affect Heuristic

  • Has the team fallen in love with its proposal?
  • Rigorously apply all the quality controls on the checklist.

3. Check for Groupthink

  • Were there dissenting opinions within the team?
  • Were they explored adequately?
  • Solicit dissenting views, discreetly if necessary.

Challenge Questions: Ask the recommenders

4. Check for Saliency Bias

  • Could the diagnosis be overly influenced by an analogy to a memorable success?
  • Ask for more analogies, and rigorously analyze their similarity to the current situation.

5. Check for Confirmation Bias

  • Are credible alternatives included along with the recommendation?
  • Request additional options.

6. Check for Availability Bias

  • If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now?
  • Use checklists of the data needed for each kind of decision.

7. Check for Anchoring Bias

  • Do you know where the numbers came from? Can there be unsubstantiated numbers, extrapolation from history, or a motivation to use a certain anchor?
  • Reanchor with figures generated by other models or benchmarks, and request new analysis.

8. Check for Halo Effect

  • Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?
  • Eliminate false inferences, and ask the team to seek additional comparable examples.

9. Check for Sunk-Cost Fallacy, Endowment Effect

  • Are the recommenders overly attached to a history of past decisions?
  • Consider the issue as if you were a new CEO.

Evaluation Questions: Ask about the proposal

10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect

  • Is the base case overly optimistic?
  • Have the team build a case taking an outside view; use war games.

11. Check for Disaster Neglect

  • Is the worst case bad enough?
  • Have the team conduct a premortem: Imagine that the worst has happened, and develop a story about the causes.

12. Check for Loss Aversion

  • Is the recommending team overly cautious?
  • Realign incentives to share responsibility for the risk or to remove risk.
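
To make the checklist easy to reuse, here is a minimal Python sketch (my own construction; the HBR article prescribes no code) that encodes the twelve checks as data and flags the ones a reviewer marks as concerns:

```python
# The twelve checks, condensed to (bias, key question) pairs.
CHECKLIST = [
    ("Self-interested biases", "Any reason to suspect motivated errors?"),
    ("Affect heuristic", "Has the team fallen in love with its proposal?"),
    ("Groupthink", "Were dissenting opinions explored adequately?"),
    ("Saliency bias", "Is the diagnosis driven by a memorable analogy?"),
    ("Confirmation bias", "Are credible alternatives included?"),
    ("Availability bias", "What information would you want in a year?"),
    ("Anchoring bias", "Do you know where the numbers came from?"),
    ("Halo effect", "Is success in one area assumed to transfer?"),
    ("Sunk-cost fallacy", "Is the team attached to past decisions?"),
    ("Overconfidence / planning fallacy", "Is the base case overly optimistic?"),
    ("Disaster neglect", "Is the worst case bad enough?"),
    ("Loss aversion", "Is the recommending team overly cautious?"),
]

def review(flags):
    """Print every check the reviewer flagged as a concern.

    `flags` maps a bias name to True when its question raised a concern.
    """
    for bias, question in CHECKLIST:
        if flags.get(bias):
            print(f"FLAG {bias}: {question}")

# Example: a hypothetical review that raised two concerns.
review({"Groupthink": True, "Anchoring bias": True})
```

Nothing here is clever; the point is that a checklist only works if it is actually consulted, and keeping it as data makes it harder to skip a step.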

If you’re looking to dramatically improve your decision-making, here is a great list of books to get started:

Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein

Think Twice: Harnessing the Power of Counterintuition by Michael J. Mauboussin

Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You by Sydney Finkelstein, Jo Whitehead, and Andrew Campbell

Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely

Thinking, Fast and Slow by Daniel Kahneman

Judgment and Managerial Decision Making by Max Bazerman