Illusion of Transparency: Your Poker Face is Better Than You Think

We tend to think that people can easily tell what we’re thinking and feeling. They can’t. Understanding the illusion of transparency bias can improve relationships, job performance, and more.

***

“A wonderful fact to reflect upon, that every human creature is constituted to be that profound secret and mystery to every other.” ― Charles Dickens, A Tale of Two Cities

When we experience strong emotions, we tend to think it’s obvious to other people, especially those who know us well. When we’re angry or tired or nervous or miserable, we may assume that anyone who looks at our face can spot it straight away.

That’s not true. Most of the time, other people can’t correctly guess what we’re thinking or feeling. Our emotions are not written all over our face all the time. The gap between our subjective experience and what other people pick up on is known as the illusion of transparency. It’s a fallacy that leads us to overestimate how easily we convey our emotions and thoughts.

For example, you arrive at the office exhausted after a night with too little sleep. You drift around all day, chugging espressos, feeling sluggish and unfocused. Everything you do seems to go wrong. At the end of the day, you sheepishly apologize to a coworker for being “useless all day.”

They look at you, slightly confused. “Oh,” they say. “You seemed fine to me.” Clearly, they’re just being polite. There’s no way your many minor mistakes during the day could have escaped their notice. It must be extra apparent considering your coworkers all show up looking fresh as a daisy every single day.

Or imagine that you have to give a talk in front of a big crowd and you’re terrified. As you step on stage, your hands shake, your voice keeps catching in your throat, you’re sweating and flushed. Afterward, you chat to someone from the audience and remark: “So that’s what a slow-motion panic attack looks like.”

“Well, you seemed like a confident speaker,” they say. “You didn’t look nervous at all. I wish I could be as good at public speaking.” Evidently, they were sitting at the back or they have bad eyesight. Your shaking hands and nervous pauses were far too apparent. Especially compared to the two wonderful speakers who came after you.

No one cares

“Words are the source of misunderstandings.” ― Antoine de Saint-Exupéry, The Little Prince

The reality is that other people pay much less attention to you than you think. They’re often far too absorbed in their own subjective experiences to pick up on subtle cues related to the feelings of others. If you’re annoyed at your partner, they’re probably too busy thinking about what they need to do at work tomorrow or what they’re planning to cook for dinner to scrutinize your facial expressions. They’re not deliberately ignoring you; they’re just thinking about other things. While you’re having a bad day at work, your coworkers are probably distracted by their own deadlines and personal problems. You could fall asleep sitting up and many of them wouldn’t even notice. And when you give a talk in front of people, most of them are worrying about the next time they have to do any public speaking or when they can get a coffee.

In your own subjective experience, you’re in the eye of the storm. But what other people have to go on are things like your tone of voice, facial expressions, and body language. The clues these provide can be hard to read. Unless someone is trying their best to figure out what you’re thinking or feeling, they’re not going to be particularly focused on your body language. If you make even the slightest effort to conceal your inner state, you’re quite able to hide it altogether from everyone.

Our tendency to overestimate how much attention people are paying to us is a result of seeing our own perspective as the only perspective. If we’re feeling a strong emotion, we assume other people care about how we feel as much as we do. This egocentric bias leads to the spotlight effect—in social situations, we feel like there’s a spotlight shining on us. It’s not self-obsession; it’s natural. But overall, this internal self-focus is what makes you think other people can tell what you’re thinking.

Take the case of lying. Even if we try to err on the side of honesty, we all face situations where we feel we have no option except to tell a lie. Setting aside the ethics of the matter, most of us probably don’t feel good about lying. It makes us uncomfortable. It’s normal to worry that whoever you’re lying to will easily be able to tell. Again, unless you’re being very obvious, the chances of someone else picking up on it are smaller than you think. In one study, participants asked to lie to other participants estimated they’d be caught about half the time. In fact, people only guessed they were lying about a quarter of the time—a rate low enough for random chance to account for it.
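
To see how weak a quarter is as evidence of mind-reading, consider blind guessing. Here is a minimal sketch in Python (the numbers are illustrative, not taken from the study): an observer who flags “lying” on a quarter of statements at random will “catch” a quarter of all lies without reading anyone at all.

```python
import random

random.seed(42)

N = 100_000         # simulated statements, every one of them a lie
ACCUSE_RATE = 0.25  # the observer calls "lie" on a quarter of statements, at random

# An observer with no ability to read the speaker, guessing blindly:
hits = sum(random.random() < ACCUSE_RATE for _ in range(N))

print(f"Lies 'caught' by blind guessing: {hits / N:.1%}")  # ~25.0%
```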

Tactics

“Even if one is neither vain nor self-obsessed, it is so extraordinary to be oneself—exactly oneself and no one else—and so unique, that it seems natural that one should also be unique for someone else.” ― Simone de Beauvoir

Understanding how the illusion of transparency works can help you navigate otherwise challenging situations with ease.

Start with accepting that other people don’t usually know what you’re thinking and feeling. If you want someone to know your mental state, you need to tell them in the clearest terms possible. You can’t make assumptions. Being subtle about your feelings is not the best idea, especially in high-stakes situations. Err on the side of caution whenever possible by communicating plainly in words about your feelings or views.

Likewise, if you think you know how someone else feels, you should ask them to confirm. You shouldn’t assume you’ve got it right—you probably haven’t. If it’s important, you need to double check. The person who seems calm on the surface might be frenzied underneath. Some of us just appear unhappy to others all the time, no matter how we’re feeling. If you can’t pick up on someone’s mental state, they might not be vocalizing it because they think it’s obvious. So ask.

As Dylan Evans writes in Risk Intelligence: How To Live With Uncertainty,

The first and most basic remedy is simply to treat all your hunches about the thoughts and feelings of other people with a pinch of salt and to be similarly skeptical about their ability to read your mind. It can be hard to resist the feeling that someone is lying to you, or that your own honesty will shine through, but with practice it can be done.

The illusion of transparency doesn’t go away just because you know someone well. Even partners, family members and close friends have difficulty reading each other’s mental states. The problem compounds when we think they should be able to do this. We can easily become annoyed when they can’t. If you’re upset or angry and someone close to you doesn’t make any attempt to make you feel better, they are not necessarily ignoring you. They just haven’t noticed anything is wrong, or they may not know how you want them to respond. As Hanlon’s razor teaches us, it’s best not to assume malicious intent. Understanding this can help avoid arguments that spring up based on thinking we’re communicating clearly when we’re not.

“Much unhappiness has come into the world because of bewilderment and things left unsaid.” ― Fyodor Dostoevsky

Set yourself free

Knowing about the illusion of transparency can be liberating. Guess what? No one really cares. Or almost no one. If you’ve got food stuck between your teeth or you stutter during a speech or you’re exhausted at work, you might as well assume no one has noticed. Most of the time, they haven’t.

Back to public speaking: We get it all wrong when we think people can tell we’re nervous about giving a talk. In a study entitled “The illusion of transparency and the alleviation of speech anxiety,” Kenneth Savitsky and Thomas Gilovich tested how knowing about the effect could help people feel less scared about public speaking.1 When participants were asked to give a speech, their self-reported levels of nervousness were well above what audience members guessed they were experiencing. Inside, they felt like a nervous wreck. On the outside, they looked calm and collected.

But when speakers learned about the illusion of transparency beforehand, they were less concerned about audience perceptions and therefore less nervous. They ended up giving better speeches, according to both their own and audience assessments. It’s a lot easier to focus on what you’re saying if you’re not so worried about what everyone else is thinking.

The sun revolves around me, doesn’t it?

In psychology, anchoring refers to our tendency to make an estimated guess by selecting whatever information is easily available as our “anchor,” then adjusting from that point. Often, the adjustments are insufficient. This is exactly what happens when you try to guess the mental state of others. If we try to estimate how a friend feels, we take how we feel as our starting point, then adjust our guess from there.
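
One way to picture insufficient adjustment is as stopping partway between the anchor and the truth. A toy model in Python (the adjustment fraction is invented for illustration, not an empirical value):

```python
def anchored_estimate(anchor: float, target: float, adjustment: float = 0.6) -> float:
    """Toy anchoring-and-adjustment model.

    Start at the anchor and move a fraction of the way toward the target.
    With adjustment < 1.0 the move stops short, so the final estimate
    stays biased toward the anchor.
    """
    return anchor + adjustment * (target - anchor)

# Guessing how a friend feels (scale 0-100), starting from our own mood:
my_mood = 20        # we're miserable...
friends_mood = 70   # ...but our friend is actually fine

print(anchored_estimate(anchor=my_mood, target=friends_mood))  # 50.0: far too low
```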

According to the authors of a paper entitled “The Illusion of Transparency: Biased Assessments of Others’ Ability to Read One’s Emotional States,”

People are typically quite aware of their own internal states and tend to focus on them rather intently when they are strong. To be sure, people recognize that others are not privy to the same information as they are, and they attempt to adjust for this fact when trying to anticipate another’s perspective. Nevertheless, it can be hard to get beyond one’s own perspective even when one knows that.

This is similar to hindsight bias, where things seem obvious in retrospect, even if they weren’t beforehand. When you look back on an event, it’s hard to disentangle what you knew then from what you know now. You can only use your current position as an anchor, a perspective which is inevitably skewed.

If you’re trying to hide your mental state, you’re probably doing better than you think. Unless you’re talking to, say, a trained police interrogator or professional poker player, other people are easy to fool. They’re not looking that hard, so a mild effort to hide your emotions is likely to work well. People can’t read your mind, whether you’re trying to pretend you don’t hate the taste of a trendy new beer, or trying to conceal your true standing in a negotiation to gain more leverage.

The illusion of transparency explains why, even once you’re no longer a teenager, it still seems like few people understand you. It’s not that other people are indifferent or confused. Your feelings just aren’t as clear as you think. Often you can’t see beyond the confines of your own head and neither can anyone else. It’s best to make allowances for that.

Footnotes

1. https://rsrc.psychologytoday.com/files/u47/sdarticle.pdf

A Simple Checklist to Improve Decisions

We owe the publishing industry some thanks: its shotgun approach of filling an entire category with books on a single concept is the reason more people are talking about biases.

Unfortunately, talk alone will not eliminate them, but it is possible to take steps to counteract them. Reducing biases can make a huge difference in the quality of any decision, and it is easier than you think.

In a recent article for Harvard Business Review, Daniel Kahneman (and others) describe a simple way to detect bias and minimize its effects in the most common type of decisions people make: determining whether to accept, reject, or pass on a recommendation.

The Munger two-step process for making decisions is a more complete framework, but Kahneman’s approach is a good way to help reduce biases in our decision-making.

If you’re short on time, here is a simple checklist that will get you started on the path toward improving your decisions. (A short code sketch of the whole list follows it.)

Preliminary Questions: Ask yourself

1. Check for Self-interested Biases

  • Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?
  • Review the proposal with extra care, especially for overoptimism.

2. Check for the Affect Heuristic

  • Has the team fallen in love with its proposal?
  • Rigorously apply all the quality controls on the checklist.

3. Check for Groupthink

  • Were there dissenting opinions within the team?
  • Were they explored adequately?
  • Solicit dissenting views, discreetly if necessary.

Challenge Questions: Ask the recommenders

4. Check for Saliency Bias

  • Could the diagnosis be overly influenced by an analogy to a memorable success?
  • Ask for more analogies, and rigorously analyze their similarity to the current situation.

5. Check for Confirmation Bias

  • Are credible alternatives included along with the recommendation?
  • Request additional options.

6. Check for Availability Bias

  • If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now?
  • Use checklists of the data needed for each kind of decision.

7. Check for Anchoring Bias

  • Do you know where the numbers came from? Can there be…
      • …unsubstantiated numbers?
      • …extrapolation from history?
      • …a motivation to use a certain anchor?
  • Reanchor with figures generated by other models or benchmarks, and request new analysis.

8. Check for Halo Effect

  • Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?
  • Eliminate false inferences, and ask the team to seek additional comparable examples.

9. Check for Sunk-Cost Fallacy, Endowment Effect

  • Are the recommenders overly attached to a history of past decisions?
  • Consider the issue as if you were a new CEO.

Evaluation Questions: Ask about the proposal

10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect

  • Is the base case overly optimistic?
  • Have the team build a case taking an outside view; use war games.

11. Check for Disaster Neglect

  • Is the worst case bad enough?
  • Have the team conduct a premortem: Imagine that the worst has happened, and develop a story about the causes.

12. Check for Loss Aversion

  • Is the recommending team overly cautious?
  • Realign incentives to share responsibility for the risk or to remove risk.
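
If it helps to have the checklist in a reusable form, here is a minimal sketch in Python. The condensed prompts and the structure are my own summary of the twelve checks above, not Kahneman’s wording:

```python
# The twelve checks, condensed into (bias, question) pairs.
CHECKLIST = [
    ("Self-interested biases", "Any reason to suspect motivated errors?"),
    ("Affect heuristic",       "Has the team fallen in love with its proposal?"),
    ("Groupthink",             "Were dissenting views explored adequately?"),
    ("Saliency bias",          "Is the diagnosis driven by a memorable analogy?"),
    ("Confirmation bias",      "Are credible alternatives included?"),
    ("Availability bias",      "What information would you want in a year, and can you get it now?"),
    ("Anchoring bias",         "Do you know where the numbers came from?"),
    ("Halo effect",            "Is success in one area assumed to transfer to another?"),
    ("Sunk-cost fallacy",      "Would a brand-new CEO make the same call?"),
    ("Overconfidence",         "Is the base case overly optimistic?"),
    ("Disaster neglect",       "Is the worst case bad enough?"),
    ("Loss aversion",          "Is the recommending team overly cautious?"),
]

def review(flagged: dict[str, bool]) -> list[str]:
    """Return the biases marked as a concern, in checklist order."""
    return [bias for bias, _ in CHECKLIST if flagged.get(bias, False)]

# Example: after reading a proposal, you worry about two of the checks.
print(review({"Anchoring bias": True, "Overconfidence": True}))
```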

If you’re looking to dramatically improve your decision making, here is a great list of books to get started:

Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein

Think Twice: Harnessing the Power of Counterintuition by Michael J. Mauboussin

Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You by Sydney Finkelstein, Jo Whitehead, and Andrew Campbell

Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely

Thinking, Fast and Slow by Daniel Kahneman

Judgment and Managerial Decision Making by Max Bazerman

How Williams Sonoma Inadvertently Sold More Bread Machines

Paying attention to what your customers and clients see can be a very effective way to increase your influence and, subsequently, your business.

Steve Martin, co-author of Yes! 50 Secrets from the Science of Persuasion, tells the story:

A few years ago a well-known US kitchen retailer released its latest bread-making machine. Like any company bringing a new and improved product to market, it was excited about the extra sales revenues the product might deliver. And, like most companies, it was a little nervous about whether it had done everything to get its product launch right.

It needn’t have worried. Within a few weeks, sales had almost doubled. Surprisingly, though, it wasn’t the new product that generated the huge sales growth but an older model.

Yet there was no doubt about the role that its brand new product had played in persuading customers to buy its older and cheaper version.

Persuasion researchers suggest that when people consider a particular set of choices, they often favour alternatives that are ‘compromise choices’. That is, choices that compromise between what is needed at a minimum and what they could possibly spend at a maximum.

A key factor that often drives compromise choices is price. In the case of the bread-making machine, when customers saw the newer, more expensive product, the original, cheaper product immediately seemed a wiser, more economical and attractive choice in comparison.

Paying attention to what your customers and clients see first can be a very effective way to increase your influence and, subsequently, your business. It is useful to remember that high-end and high-priced products provide two crucial benefits. Firstly, they often serve to meet the needs of customers who are attracted to high-price offerings. A second, and perhaps less recognised benefit is that the next-highest options are often seen as more attractively priced.

Bars and hotels often present wine lists in the order of their cheapest (most often the house wine) first. But doing so might mean that customers may never consider some of the more expensive and potentially more profitable wines towards the end of the list. The ‘compromise’ approach suggests that reversing the order and placing more expensive wines at the top of the list would immediately make the next most expensive wines a more attractive choice — potentially increasing sales.
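
The pattern is easy to sketch. Here is a toy Python model of a customer who avoids the extremes of a price range; the prices are invented for illustration and are not the retailer’s actual figures:

```python
def compromise_choice(prices: list[float]) -> float:
    """Toy 'compromise' chooser: drop the cheapest and the most expensive
    options and pick the middle of what remains; with no middle option,
    default to the cheapest."""
    interior = sorted(prices)[1:-1]
    return interior[len(interior) // 2] if interior else min(prices)

# Before the launch: a basic model and the then top-of-the-line machine.
print(compromise_choice([150, 275]))       # 150: no middle ground, buy cheap

# After a pricier machine appears, the older model becomes the compromise.
print(compromise_choice([150, 275, 430]))  # 275: the old model now sells
```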

Original source: http://www.babusinesslife.com/Tools/Persuasion/How-compromise-choices-can-make-you-money.html

Mental Model: Anchoring

We often pay attention to irrelevant information. We develop estimates by starting from an initial anchor, based on whatever information is provided, and adjusting from it; often our adjustments are insufficient. This is called anchoring.

More problematic, perhaps, is that the existence of an anchor leads people to think of information consistent with that anchor (commitment and consistency) rather than accessing information that is inconsistent with it.

Anchoring is commonly observed in real estate and the stock market. Many BUYERS tend to negotiate based on the listed price of a house — and many SELLERS tend to determine the list price by adjusting from their own purchase price.

Some interesting points on anchoring: (1) Experts and non-experts are affected similarly by an anchor; (2) Anchoring-adjustment may occur in any task requiring a numerical response, provided an initial estimate is available; and (3) One study of particular importance for investors, by Joyce and Biddle (1981), found support for the presence of the anchoring effect among practicing auditors of major accounting firms.

* * *

Anchoring and adjustment was first theorized by Tversky and Kahneman. The pair demonstrated that when asked to guess the percentage of African nations that are members of the UN, people who were first asked “was it more or less than 35%?” guessed lower values than those who had been asked if it was more or less than 65%. Subjects were anchored by the number 35 or 65, and this had a meaningful influence on their judgment. Over time this bias has been shown in numerous experiments. Interestingly, paying participants based on their accuracy did not reduce the magnitude of the anchoring effect.
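
A quick simulation of that design makes the effect concrete. This sketch reuses the insufficient-adjustment idea from earlier; the belief distribution and adjustment fraction are made up for illustration:

```python
import random

random.seed(0)

BELIEF_MEAN = 25  # where an unanchored guess would land (illustrative)
ADJUSTMENT = 0.5  # subjects move only halfway from the anchor toward their belief

def guess(anchor: float) -> float:
    belief = random.gauss(BELIEF_MEAN, 8)
    return anchor + ADJUSTMENT * (belief - anchor)

low = [guess(35) for _ in range(10_000)]
high = [guess(65) for _ in range(10_000)]

print(f"mean guess after the '35%?' prompt: {sum(low) / len(low):.1f}")
print(f"mean guess after the '65%?' prompt: {sum(high) / len(high):.1f}")
# Same question, different anchors, systematically different answers.
```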

The power of anchoring can be explained by the confirmation heuristic and by the limitations of our own mind. We selectively access hypothesis-consistent information without realizing it. Availability may also play a role in anchoring.

There are numerous examples of anchoring in everyday life:

  • Children are tracked by schools that categorize them by ability at an early age, and based on this initial “anchor” teachers derive expectations. Teachers tend to expect children assigned to the lower group to achieve little and have much higher expectations of children in the top group (for more, see Darley and Gross, 1983). Malcolm Gladwell talks more about anchoring in his book Outliers.
  • First impressions are a form of anchoring.
  • Minimum payments on credit card bills.
  • Posted interest rates at banks.
  • Prices on a menu in restaurants.
  • Race can also be an anchor with respect to our expectations (Duncan, 1976).

* * *

Heuristics and Biases: The Psychology of Intuitive Judgment offers:

“To examine this heuristic, Tversky and Kahneman (1974) developed a paradigm in which participants are given an irrelevant number and asked if the answer to the question is greater or less than that value. After this comparative assessment, participants provide an absolute answer. Countless experiments have shown that people’s absolute answers are influenced by initial comparison with the irrelevant anchor. People estimate that Gandhi lived to be roughly 67 years old, for example, if they first decided whether he died before or after the age of 140, but only 50 years old if they first decided whether he died before or after the age of 9.

Anchoring effects have traditionally been interpreted as a result of insufficient adjustment from an irrelevant value, but recent evidence casts doubt on this account. Instead, anchoring effects observed in the standard paradigm appear to be produced by the increased accessibility of anchor-consistent information.”

* * *

In Judgment and Decision Making, David Hardman says:

Anchoring effects have been observed in a variety of domains including pricing, negotiation, legal judgment, lotteries and gambles, probability estimates, and general knowledge. In one of these studies, Northcraft and Neale (1987) demonstrated anchoring effects in the pricing estimates of estate agents…

Despite the robustness of the anchoring effect, there has been little agreement as to the true nature of the underlying processes. One theory that has been proposed is that of selective anchoring (Mussweiler and Strack, 1997). According to this account, the comparative question task activates information in memory that is subsequently more accessible when making an absolute judgment….

Epley (2004) listed four findings that are consistent with the selective memory account: (1) People attend to shared features between the anchor and target more than to unique features; (2) Completion of a standard anchoring task speeds identification of words consistent with implications of an anchor value rather than words inconsistent with it; (3) The size of anchoring effects can be influenced by altering the hypothesis tested in the comparative assessment (for example, asking whether the anchor is less than a target value has a different effect to asking whether it is more than a target value); (4) People with greater domain knowledge are less susceptible to the effects of irrelevant anchors.

* * *

In Fooled by Randomness, Nassim Nicholas Taleb writes:

Anchoring to a number is the reason people do not react to their total wealth, but rather to differences of wealth from whatever number they are currently anchored to. This is in major conflict with economic theory, as according to economists, someone with $1 million in the bank would be more satisfied than if he had $500 thousand, but this is not necessarily the case.

* * *

Tversky and Kahneman (1974)

In many situations, people make estimates by starting from an initial value that is adjusted to yield the final answer. The initial value, or starting point, may be suggested by the formulation of the problem, or it may be the result of a partial computation. In either case, adjustments are typically insufficient (Slovic & Lichtenstein, 1971). That is, different starting points yield different estimates, which are biased toward the initial values. We call this phenomenon anchoring.

* * *

Russell Fuller writes:

Psychologists have documented that when people make quantitative estimates, their estimates may be heavily influenced by previous values of the item. For example, it is not an accident that a used car salesman always starts negotiating with a high price and then works down. The salesman is trying to get the consumer anchored on the high price so that when he offers a lower price, the consumer will estimate that the lower price represents a good value. Anchoring can cause investors to under-react to new information.

Anchoring is a Farnam Street Mental Model.

Hindsight Bias: Why You’re Not As Smart As You Think You Are

Hindsight bias occurs when we look backward in time and see events as more predictable than they were at the time a decision was made. This bias, also known as the “knew-it-all-along effect,” typically involves those annoying “I told you so” people who never really told you anything.

For instance, imagine driving with your partner and coming to a T in the road. Your partner decides to turn right, and four miles later, when you realize you’re lost, you think, “I knew we should have taken that left.”

Hindsight bias can offer a number of benefits in the short run. For instance, it can be flattering to believe that our judgment is better than it actually is. And, of course, hindsight bias allows us to participate in one of our favorite pastimes — criticizing the decisions of others for their lack of foresight.

“Judgments about what is good and what is bad, what is worthwhile and what is a waste of talent, what is useful and what is less so, are judgments that seldom can be made in the present. They can safely be made only by posterity.”

— Tulving

Beyond clouding our reflection on past decisions, hindsight bias has several practical implications. Consider a reviewer asked to evaluate a paper who already knows the verdict of a previous review, or a physician asked for a second opinion who knows the result of the first. Their judgments will likely be biased to some degree. Once we know an outcome, it becomes easy to find some plausible explanation for it.

Hindsight bias makes us less accountable for our decisions, less critical of ourselves, and overconfident in our ability to make decisions.

One of the most interesting things I discovered when researching hindsight bias was the impact on our legal system and the perceptions of jurors.

* * *

Harvard Professor Max Bazerman offers:

The processes that give rise to anchoring and overconfidence are also at play with the hindsight bias. According to this explanation, knowledge of an event’s outcome works as an anchor by which individuals interpret their prior judgments of the event’s likelihood. Due to the selective accessibility of the confirmatory information during information retrieval, adjustments to anchors are inadequate. Consequently, hindsight knowledge biases our perceptions of what we remember knowing in foresight. Furthermore, to the extent that various pieces of data about the event vary in support of actual outcome, evidence that is consistent with the known outcome may become cognitively more salient and thus more available in memory. This tendency will lead an individual to justify a claimed foresight in view of “the facts provided.” Finally, the relevance of a particular piece of data may later be judged important to the extent to which it is representative of the final observed outcome.

In Cognitive Illusions, Rudiger Pohl offered the following explanations of hindsight bias:

Most prominent among the proposed explanations are cognitive accounts which assume that hindsight bias results from an inability to ignore the solution. Among the early approaches are the following three: (1) Fischhoff (1975) assumed an immediate and irreversible assimilation of the solution into one’s knowledge base. As a consequence, the reconstructed estimate will be biased towards the solution. (2) Tversky and Kahneman (1974) proposed a cognitive heuristic for the anchoring effect, named anchoring and insufficient adjustment. The same mechanism may apply here, if the solution is assumed to serve as an “anchor” in the reconstruction process. The reconstruction starts from this anchor and is then adjusted in the direction of one’s knowledge base. However, this adjustment process may stop too early, for example at the point where the first plausible value is reached, thus leading to a biased reconstruction. (3) Hell (1988) argued that the relative trace strengths of the original estimate and of the solution might predict the amount of hindsight bias. The stronger the trace strength of the solution relative to that of the original estimate, the larger hindsight bias should be.

Pohl also offers an evolutionary explanation of hindsight bias:

Finally, some authors argued that hindsight bias is not necessarily a bothersome consequence of a “faulty” information-processing system, but that it may rather represent an unavoidable by-product of an evolutionarily evolved function, namely adaptive learning. According to this view, hindsight bias is seen as the consequence of our most valuable ability to update previously held knowledge. This may be seen as a necessary process in order to prevent memory overload and thus to maintain normal cognitive functioning. Besides, updating allows us to keep our knowledge more coherent and to draw better inferences.

Ziva Kunda, in Social Cognition, offers the following explanation of why hindsight bias occurs:

Preceding events take on new meaning and importance as they are made to cohere with the known outcome. Now that we know that our friends have filed for divorce, any ambiguous behavior we have seen is reinterpreted as indicative of tension, any disagreement gains significance, and any signs of affection seem irrelevant. It now seems obvious that their marriage was doomed from the start…Moreover, having adjusted our interpretations in light of current knowledge, it is difficult to imagine how things could have happened differently.

When making likelihood judgments, we often rely on the availability heuristic: The more difficult it is for us to imagine an outcome, the more unlikely it seems. Therefore, the difficulty we experience imagining how things might have turned out differently makes us all the more convinced that the outcomes that did occur were bound to have occurred.

Hindsight bias has large implications for criminal trials. In Jury Selection, Hale Starr and Mark McCormick offer the following:

The effects of hindsight bias – which result in being held to a higher standard – are most critical for both criminal and civil defendants. The defense is more susceptible to the hindsight bias since their actions are generally the ones being evaluated for reasonableness in foresight (foreseeability). When jurors perceive that the results of particular actions were “reasonably” more likely after the outcome is known, defendants are judged as having been capable of knowing more than they knew at the time the action was taken and therefore as capable of preventing the “bad” outcome.

In post-verdict surveys jurors unknowingly demonstrate some of the effects of hindsight bias:

“I can’t understand why the managers didn’t try to get more information or use the information they had available. They should have known there would be safety problems at the plant.”

“The defendants should have known people would remove the safety shield around the tire. There should have been warnings so people wouldn’t do that.”

“Even though he was a kid, he should have known that once he showed the others who had been drinking that he had a gun, things would get out of hand. He should have known guns invited violence.”

Jurors influenced by the hindsight bias look at the evidence presented and determine that the defendants knew or should have known their actions were unsafe, unwise, or created a dangerous situation. Hindsight bias often results in the judgment that the event was “an accident or tragedy waiting to happen.”

* * *

Protection Against Hindsight Bias

In Principles of Forecasting, J. Scott Armstrong offers the following advice on how to protect yourself:

The surest protection against (hindsight bias) is disciplining ourselves to make explicit predictions, showing what we did in fact know (sounds like a decision journal). That record can also provide us with some protection against those individuals who are wont to second-guess us, producing exaggerated claims of what we should have known (and perhaps should have told them). If these observers look to this record, it may show them that we are generally less proficient as forecasters than they would like, while protecting us against charges of having blown a particular assignment. Having an explicit record can also protect us against overconfidence in our own forecasting ability: If we feel that we “knew all along” what was going to happen, then it is natural enough to think that we will have similar success in the future. Unfortunately, an exaggerated perception of a surprise-free past may portend a surprise-full future.

Documenting the reasons we made a forecast makes it possible for us to know not only how well the forecast did, but also where it went astray. For example, subsequent experiences may show that we used wrong (or misunderstood) inputs. In that case, we can, in principle, rerun the forecasting process with better inputs and assess the accuracy of our (retrospectively) revised forecasts. Perhaps we did have the right theory and procedures, but were applying them to a mistaken picture of then-current conditions…Of course inputs are also subject to hindsight bias, hence we need to record them explicitly as well. The essence of making sense out of outcome knowledge is reinterpreting the processes and conditions that produced the reported event.
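
Armstrong’s advice boils down to recording explicit predictions along with the reasons behind them. Here is a minimal sketch in Python of one such decision-journal entry, scored after the fact with a Brier score (the record format is illustrative, not Armstrong’s):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Forecast:
    """One decision-journal entry: an explicit prediction plus the reasons."""
    question: str
    probability: float            # stated chance the event happens, 0 to 1
    reasons: list[str]
    made_on: date = field(default_factory=date.today)
    outcome: bool | None = None   # fill in once the result is known

    def brier_score(self) -> float:
        """Squared error of the stated probability; lower is better."""
        if self.outcome is None:
            raise ValueError("Outcome not recorded yet.")
        return (self.probability - float(self.outcome)) ** 2

entry = Forecast(
    question="Will the plant expansion finish by Q3?",
    probability=0.7,
    reasons=["Vendor hit the last three milestones", "Permits already granted"],
)
entry.outcome = False  # it slipped; the journal shows what we actually knew
print(f"Brier score: {entry.brier_score():.2f}")  # 0.49
```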

Hindsight Bias is part of the Farnam Street latticework of mental models.