Choice Under Uncertainty

Some of the general heuristics—rules of thumb—that people use in making judgments produce biases toward classifying situations according to their representativeness, or toward judging frequencies according to the availability of examples in memory, or toward interpretations warped by the way in which a problem has been framed. These heuristics have important implications for individuals and society.

Insensitivity to Base Rates
When people are given information about the probabilities of certain events (e.g., how many lawyers and how many engineers are in a population that is being sampled), and then are given some additional information as to which of the events has occurred (which person has been sampled from the population), they tend to ignore the prior probabilities in favor of incomplete or even quite irrelevant information about the individual event. Thus, if they are told that 70 percent of the population are lawyers, and if they are then given a noncommittal description of a person (one that could equally well fit a lawyer or an engineer), half the time they will predict that the person is a lawyer and half the time that he is an engineer–even though the laws of probability dictate that the best forecast is always to predict that the person is a lawyer.
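One way to see why the half-and-half answer is wrong is to push the numbers through Bayes' rule. Here is a minimal Python sketch; the equal likelihood values (0.5 and 0.5) are my own way of encoding a "noncommittal" description, not figures from the source:

```python
# Lawyer/engineer base-rate problem via Bayes' rule.
# A "noncommittal" description fits either profession equally well,
# so the two likelihoods are equal and the posterior equals the prior.

def posterior_lawyer(prior_lawyer, p_desc_given_lawyer, p_desc_given_engineer):
    """P(lawyer | description) by Bayes' rule."""
    prior_engineer = 1.0 - prior_lawyer
    numerator = p_desc_given_lawyer * prior_lawyer
    return numerator / (numerator + p_desc_given_engineer * prior_engineer)

# Equal likelihoods leave the 70% base rate untouched:
print(posterior_lawyer(0.70, 0.5, 0.5))  # -> 0.7, so always predict "lawyer"
```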

Insensitivity to Sample Size
People commonly misjudge probabilities in many other ways. Asked to estimate the probability that 60 percent or more of the babies born in a hospital during a given week are male, they ignore information about the total number of births, although it is evident that the probability of a departure of this magnitude from the expected value of 50 percent is smaller if the total number of births is larger (the standard error of a percentage varies inversely with the square root of the sample size).
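To make the sample-size effect concrete, here is a short Python sketch. The weekly birth counts (15 versus 150) are assumed for illustration, and births are modeled as independent events with a 50 percent chance of being male:

```python
# Exact binomial tail: P(at least `threshold` of n births are male).
from math import ceil, comb

def p_share_male_at_least(n, threshold=0.6, p=0.5):
    """P(fraction of male births >= threshold) for n independent births."""
    k_min = ceil(threshold * n)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

print(p_share_male_at_least(15))   # ~0.30: small hospital, big swings are routine
print(p_share_male_at_least(150))  # ~0.009: large hospital, 60%+ weeks are rare
```

The small hospital is roughly thirty times more likely to record such an extreme week, which is exactly the information subjects ignore.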

Availability
There are situations in which people assess the frequency of a class by the ease with which instances can be brought to mind. In one experiment, subjects heard a list of names of persons of both sexes and were later asked to judge whether there were more names of men or women on the list. In lists presented to some subjects, the men were more famous than the women; in other lists, the women were more famous than the men. For all lists, subjects judged that the sex that had the more famous personalities was the more numerous.

Framing and Loss Aversion
The way in which an uncertain possibility is presented may have a substantial effect on how people respond to it. When asked whether they would choose surgery in a hypothetical medical emergency, many more people said that they would when the chance of survival was given as 80 percent than when the chance of death was given as 20 percent.

Source: Decision Making and Problem Solving, Herbert A. Simon

What can you do in the first 60 seconds of a presentation to aid your ability to persuade?

We know that first impressions are valuable. When our brain immediately likes someone, we subconsciously tend to filter all subsequent information in a favorable light (a.k.a. the halo effect).

In The Elements of Persuasion, authors Richard Maxwell and Robert Dickman suggest we:

…share something personal, and show the audience that you are talking to them, not simply giving a canned speech or sales pitch. If you think about it, this is exactly what the classic comedy act opening does. The comic strides onstage at the club and says, “Hi, I just got back from LA, and I’ve got to tell you those freeways are something else. Now, I’m from Brooklyn (something personal)… anyone else from Brooklyn? … Yeah, where? … Really? I know that neighborhood. Wild place. So, like I was saying, I’m not at all used to freeways…” By being personal and open to the audience, the comedian makes us think, “Hey, I like this guy.” And it works even if we are aware of how carefully planned that interaction is.

Lessons of the Past

The tendency to relate contemporary events to earlier events as a guide to understanding is a powerful one. The difficulty, of course, is in being certain that two situations are truly comparable. The fact that they are similar in some respects does not assure us that they are similar in all respects.

We all know the way in which we set policy is flawed. Ernest May, someone you’ve likely never heard of, argues that in attempting to avoid the mistakes of previous generations, we pursue policies that would have been most appropriate in the historical situation rather than the current one. May argues that lawmakers mentally resort to analogies and tend to seize upon the first analogy that comes to mind without seeking evidence that would disprove their assumptions (like, say, highlighting differences between this event and the previous one before drawing conclusions?). After publicizing policy ambitions, our politicians, like anyone, are likely to reject evidence that does not confirm their conclusions.

Ernest May, in his book Lessons of the Past, traced the impact of historical analogy on US foreign policy.

He found that because of reasoning by analogy, US policymakers tend to be one generation behind, determined to avoid the mistakes of the previous generation. They pursue the policies that would have been most appropriate in the historical situation but are not necessarily well adapted to the current one.

Policymakers in the 1930s, for instance, viewed the international situation as analogous to that before World War I. Consequently, they followed a policy of isolation that would have been appropriate for preventing American involvement in the first World War but failed to prevent the second. Communist aggression after World War II was seen as analogous to Nazi aggression, leading to a policy of containment that could have prevented World War II. 

The Vietnam analogy had been used repeatedly over many years to argue against an activist US foreign policy. For example, some used the Vietnam analogy to argue against US participation in the Gulf War–a flawed analogy, because the terrain over which battles were fought was completely different in Kuwait/Iraq and far more favorable to US forces than it was in Vietnam.

May argues that policymakers often perceive problems in terms of analogies with the past, but that they ordinarily use history badly: When resorting to an analogy, they tend to seize upon the first that comes to mind. They do not research more widely. Nor do they pause to analyze the case, test its fitness, or even ask in what ways it might be misleading.

Hindsight Bias: Why You’re Not As Smart As You Think You Are

Hindsight bias occurs when we look backward in time and see events as more predictable than they actually were at the time a decision was made. This bias, also known as the “knew-it-all-along effect,” typically involves those annoying “I told you so” people who never really told you anything.

For instance, consider driving in the car with your partner and coming to a T in the road. Your partner decides to turn right, and four miles down the road, when you realize you are lost, you think, “I knew we should have taken that left.”

Hindsight bias can offer a number of benefits in the short run. For instance, it can be flattering to believe that our judgment is better than it actually is. And, of course, hindsight bias allows us to participate in one of our favorite pastimes — criticizing the decisions of others for their lack of foresight.

“Judgments about what is good and what is bad, what is worthwhile and what is a waste of talent, what is useful and what is less so, are judgments that seldom can be made in the present. They can safely be made only by posterity.”

— Endel Tulving

Aside from getting in the way of objective reflection on our decisions, hindsight bias has several practical implications. For example, consider a reviewer asked to evaluate a paper who already knows the results of a previous review, or a physician asked for a second opinion who knows the conclusion of the first. The results of these reviews will likely be biased to some degree. Once we know an outcome, it becomes easy to find some plausible explanation for it.

Hindsight bias makes us less accountable for our decisions, less critical of ourselves, and overconfident in our ability to make decisions.

One of the most interesting things I discovered when researching hindsight bias was the impact on our legal system and the perceptions of jurors.

* * *

Harvard Professor Max Bazerman offers:

The processes that give rise to anchoring and overconfidence are also at play with the hindsight bias. According to this explanation, knowledge of an event’s outcome works as an anchor by which individuals interpret their prior judgments of the event’s likelihood. Due to the selective accessibility of confirmatory information during information retrieval, adjustments to anchors are inadequate. Consequently, hindsight knowledge biases our perceptions of what we remember knowing in foresight. Furthermore, to the extent that various pieces of data about the event vary in support of the actual outcome, evidence that is consistent with the known outcome may become cognitively more salient and thus more available in memory. This tendency will lead an individual to justify a claimed foresight in view of “the facts provided.” Finally, the relevance of a particular piece of data may later be judged important to the extent to which it is representative of the final observed outcome.

In Cognitive Illusions, Rüdiger Pohl offered the following explanations of hindsight bias:

Most prominent among the proposed explanations are cognitive accounts which assume that hindsight bias results from an inability to ignore the solution. Among the early approaches are the following three: (1) Fischhoff (1975) assumed an immediate and irreversible assimilation of the solution into one’s knowledge base. As a consequence, the reconstructed estimate will be biased towards the solution. (2) Tversky and Kahneman (1974) proposed a cognitive heuristic for the anchoring effect, named anchoring and insufficient adjustment. The same mechanism may apply here, if the solution is assumed to serve as an “anchor” in the reconstruction process. The reconstruction starts from this anchor and is then adjusted in the direction of one’s knowledge base. However, this adjustment process may stop too early, for example at the point where the first plausible value is reached, thus leading to a biased reconstruction. (3) Hell (1988) argued that the relative trace strengths of the original estimate and of the solution might predict the amount of hindsight bias. The stronger the trace strength of the solution relative to that of the original estimate, the larger the hindsight bias should be.

Pohl also offers an evolutionary explanation of hindsight bias:

Finally, some authors argued that hindsight bias is not necessarily a bothersome consequence of a “faulty” information-processing system, but that it may rather represent an unavoidable by-product of an evolutionarily evolved function, namely adaptive learning. According to this view, hindsight bias is seen as the consequence of our most valuable ability to update previously held knowledge. This may be seen as a necessary process in order to prevent memory overload and thus to maintain normal cognitive functioning. Besides, updating allows us to keep our knowledge more coherent and to draw better inferences.

Ziva Kunda, in Social Cognition, offers the following explanation of why hindsight bias occurs:

Preceding events take on new meaning and importance as they are made to cohere with the known outcome. Now that we know that our friends have filed for divorce, any ambiguous behavior we have seen is reinterpreted as indicative of tension, any disagreement gains significance, and any signs of affection seem irrelevant. It now seems obvious that their marriage was doomed from the start… Moreover, having adjusted our interpretations in light of current knowledge, it is difficult to imagine how things could have happened differently.

When making likelihood judgments, we often rely on the availability heuristic: The more difficult it is for us to imagine an outcome, the more unlikely it seems. Therefore, the difficulty we experience imagining how things might have turned out differently makes us all the more convinced that the outcomes that did occur were bound to have occurred.

Hindsight bias has large implications for criminal trials. In Jury Selection, Hale Starr and Mark McCormick offer the following:

The effects of hindsight bias – which result in defendants being held to a higher standard – are most critical for both criminal and civil defendants. The defense is more susceptible to the hindsight bias since their actions are generally the ones being evaluated for reasonableness in foresight (foreseeability). When jurors perceive that the results of particular actions were “reasonably” more likely after the outcome is known, defendants are judged as having been capable of knowing more than they knew at the time the action was taken, and therefore as capable of preventing the “bad” outcome.

In post-verdict surveys jurors unknowingly demonstrate some of the effects of hindsight bias:

“I can’t understand why the managers didn’t try to get more information or use the information they had available. They should have known there would be safety problems at the plant.”

“The defendants should have known people would remove the safety shield around the tire. There should have been warnings so people wouldn’t do that.”

“Even though he was a kid, he should have known that once he showed the others who had been drinking that he had a gun, things would get out of hand. He should have known guns invited violence.”

Jurors influenced by the hindsight bias look at the evidence presented and determine that the defendants knew or should have known their actions were unsafe, unwise, or created a dangerous situation. Hindsight bias often results in the judgment that the event was “an accident or tragedy waiting to happen.”

* * *

Protection Against Hindsight Bias

In Principles of Forecasting, Jon Scott Armstrong offers the following advice on how to protect yourself:

The surest protection against (hindsight bias) is disciplining ourselves to make explicit predictions, showing what we did in fact know (sounds like a decision journal). That record can also provide us with some protection against those individuals who are wont to second-guess us, producing exaggerated claims of what we should have known (and perhaps should have told them). If these observers look to this record, it may show them that we are generally less proficient as forecasters than they would like, while protecting us against charges of having blown a particular assignment. Having an explicit record can also protect us against overconfidence in our own forecasting ability: If we feel that we “knew all along” what was going to happen, then it is natural enough to think that we will have similar success in the future. Unfortunately, an exaggerated perception of a surprise-free past may portend a surprise-full future.

Documenting the reasons we made a forecast makes it possible for us to know not only how well the forecast did, but also where it went astray. For example, subsequent experiences may show that we used wrong (or misunderstood) inputs. In that case, we can, in principle, rerun the forecasting process with better inputs and assess the accuracy of our (retrospectively) revised forecasts. Perhaps we did have the right theory and procedures, but were applying them to a mistaken picture of then-current conditions… Of course, inputs are also subject to hindsight bias; hence we need to record them explicitly as well. The essence of making sense out of outcome knowledge is reinterpreting the processes and conditions that produced the reported event.
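This advice lends itself to a simple tool. Below is a minimal sketch of such an explicit prediction record, a decision journal, in Python. The field names and the use of the Brier score to grade forecasts are my own illustration, not something prescribed by the source:

```python
# A sketch of an explicit prediction record (decision journal).
# Field names and Brier-score grading are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Forecast:
    question: str               # what we predicted
    probability: float          # stated P(event), recorded *before* the outcome
    inputs: str                 # inputs/assumptions, recorded to resist hindsight bias
    outcome: Optional[bool] = None  # filled in once the event resolves

    def brier(self) -> float:
        """Squared error of the forecast: 0 is perfect, 0.25 is chance level."""
        if self.outcome is None:
            raise ValueError("outcome not yet known")
        return (self.probability - float(self.outcome)) ** 2

journal = [Forecast("Plant passes next safety audit", 0.8, "last 3 audits passed")]
journal[0].outcome = False   # the audit failed
print(journal[0].brier())    # ~0.64: a recorded miss we can't "knew-it-all-along" away
```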

Hindsight Bias is part of the Farnam Street latticework of mental models.