Survivorship Bias: The Tale of Forgotten Failures

Survivorship bias is a common logical error that distorts our understanding of the world. It happens when we assume that success tells the whole story and when we don’t adequately consider past failures.

There are thousands, even tens of thousands of failures for every big success in the world. But stories of failure are not as sexy as stories of triumph, so they rarely get covered and shared. As we consume one story of success after another, we forget the base rates and overestimate the odds of real success.

“See,” says he, “you who deny a providence, how many have been saved by their prayers to the Gods.”

“Ay,” says Diagoras, “I see those who were saved, but where are those painted who were shipwrecked?”

— Cicero

The Basics

A college dropout becomes a billionaire. Batuli Lamichhane, a chain-smoker, lives to the age of 118. Four young men are rejected by record labels and told “guitar groups are on the way out,” then go on to become the most successful band in history.

Bill Gates, Batuli Lamichhane, and the Beatles are oft-cited examples of people who broke the rules without the expected consequences. We like to focus on people like them—the result of a cognitive shortcut known as survivorship bias.

When we only pay attention to those who survive, we fail to account for base rates and end up misunderstanding how selection processes actually work. The base rate is the probability of a given outcome that we can expect across an entire sample, expressed as a percentage. If you bet on a single number in roulette, for example, you can expect to win about one game in 38, or 2.63 percent of the time; that is the base rate. The problem arises when we mistake the winners for the rule rather than the exception. People like Gates, Lamichhane, and the Beatles are anomalies at one end of a distribution curve. While there is much to learn from them, it would be a mistake to expect the same results from doing the same things.
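To make the arithmetic concrete, here is a minimal sketch in Python. The roulette figure comes straight from the one-in-38 odds above; the venture and story counts are invented numbers, used only to show how sampling from the survivors inflates the apparent success rate.

```python
# The base rate is simply successes divided by all attempts in the reference class.
print(f"Single-number roulette win rate: {1 / 38:.2%}")  # 2.63%

# Hypothetical reference class of ventures (numbers assumed for illustration).
attempts = 10_000
successes = 500
print(f"Base rate of success: {successes / attempts:.1%}")  # 5.0%

# But the stories that reach us are drawn almost entirely from the winners,
# so the success rate implied by what we read is wildly inflated.
stories = [1] * 480 + [0] * 20  # 480 success stories for every 20 failure stories
print(f"Success rate implied by the stories: {sum(stories) / len(stories):.0%}")  # 96%
```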

A stupid decision that works out well becomes a brilliant decision in hindsight.

— Daniel Kahneman

Cause and Effect

Can we achieve anything if we try hard enough? Not necessarily. Survivorship bias leads to an erroneous understanding of cause and effect. People see correlation in mere coincidence. We all love to hear stories of those who beat the odds and became successful, holding them up as proof that the impossible is possible. We ignore failures in pursuit of a coherent narrative about success.

Few would think to write the biography of a business person who goes bankrupt and spends their entire life in debt. Or a musician who tries again and again to get signed and is ignored by record labels. Or someone who dreams of becoming an actor, moves to LA, and ends up returning a year later, defeated and broke. After all, who wants to hear that? We want the encouragement survivorship bias provides, and the subsequent belief in our own capabilities. The result is an inflated idea of how many people become successful.

The discouraging fact is that success is never guaranteed. Most businesses fail. Most people do not become rich or famous. Most leaps of faith go wrong. That does not mean we should not try, just that we should be realistic about the odds.

Beware of advice from the successful.

— Barnaby James

Survivorship Bias in Business

Survivorship bias is particularly common in the world of business. Companies which fail early on are ignored, while the rare successes are lauded for decades. Studies of market performance often exclude companies which collapse. This can distort statistics and make success seem more probable than it truly is. Just as history is written by the winners, so is much of our knowledge about business. Those who end up broke and chastened lack a real voice. They may be blamed for their failures by those who ignore the role coincidence plays in the upward trajectories of the successful.

Nassim Taleb writes of our tendency to ignore the failures: “We favor the visible, the embedded, the personal, the narrated, and the tangible; we scorn the abstract.” Business books laud the rule-breakers who ignore conventional advice and still create profitable enterprises. For most entrepreneurs, taking excessive risks and eschewing all norms is an ill-advised gamble. Many of the misfit billionaires who are widely celebrated succeeded in spite of their unusual choices, not because of them. We also ignore the role of timing, luck, connections and socio-economic background. A person from a prosperous family, with valuable connections, who founds a business at a lucrative time has a greater chance of survival, even if they drop out of college or do something unconventional. Someone with a different background, acting at an inopportune time, will have less of a chance.

In No Startup Hipsters: Build Scalable Technology Companies, Samir Rath and Teodora Georgieva write:

Almost every single generic presentation for startups starts with “Ninety Five percent of all startups fail”, but very rarely do we pause for a moment and think “what does this really mean?” We nod our heads in somber acknowledgement and with great enthusiasm turn to the heroes who “made it” — Zuckerberg, Gates, etc. to absorb pearls of wisdom and find the Holy Grail of building successful companies. Learning from the successful is a much deeper problem and can reduce the probability of success more than we might imagine.

Examining the lives of successful entrepreneurs teaches us very little. We would do far better to analyze the causes of failure, then act accordingly. Even better would be learning from both failures and successes.

Focusing on successful outliers does not account for base rates. As Rath and Georgieva go on to write:

After any process that picks winners, the non-survivors are often destroyed or hidden or removed from public view. The huge failure rate for start-ups is a classic example; if failures become invisible, not only do we fail to recognise that missing instances hold important information, but we may also fail to acknowledge that there is any missing information at all.

They describe how this leads us to base our choices on inaccurate assumptions:

Often, as we revel in stories of start-up founders who struggled their way through on cups of ramen before the tide finally turned on viral product launches, high team performance or strategic partnerships, we forget how many other founders did the same thing, in the same industry and perished…The problem we mention is compounded by biographical or autobiographical narratives. The human brain is obsessed with building a cause and effect narrative. The problem arises when this cognitive machinery misfires and finds patterns where there are none.

These success narratives are created both by those within successful companies and those outside. Looking back on their ramen days, founders may believe they had a plan all along. They always knew everything would work out. In truth, they may have little grasp of the cause-and-effect relationships underlying their progress. When external observers hear their stories, they may, in a quasi-superstitious manner, spot “signs” of the success to come. As Daniel Kahneman has written, the only true similarity is luck.

Consider What You Don’t See

When we read about survivorship bias, we usually come across the archetypal story of Abraham Wald, a statistician who studied World War II airplanes. His research group at Columbia University was asked to figure out how to better protect airplanes from damage. The initial approach to the problem was to look at the planes coming back, see where they were hit the worst, then reinforce those areas.

However, Wald realized there was a missing, yet valuable, source of evidence: the planes that were hit and did not make it back. The damage on the returning planes marked the places where a bomber could be hit and still survive; the planes that went down carried the real information about which areas were most important to reinforce. Wald’s approach is an example of how to overcome survivorship bias. Don’t just look at what you can see. Consider all the things that started on the same path but didn’t make it. Try to figure out their story, as there is as much, if not more, to be learned from failure.
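A toy simulation can make this selection effect visible. The aircraft sections and per-hit lethality numbers below are invented for illustration; this is a sketch of the statistical point, not a reconstruction of Wald’s actual analysis.

```python
# Hits land uniformly across sections, but hits to some sections are far more
# likely to bring a plane down. Tabulating damage only on the planes that
# return makes those sections look like the least damaged ones.
import random
from collections import Counter

random.seed(42)

# Assumed probability that a single hit to this section downs the plane.
lethality = {"engine": 0.6, "fuel tank": 0.5, "cockpit": 0.4,
             "fuselage": 0.05, "wings": 0.03, "tail": 0.05}
sections = list(lethality)

hits_on_survivors = Counter()

for _ in range(100_000):
    hits = [random.choice(sections) for _ in range(random.randint(1, 5))]
    survived = all(random.random() > lethality[s] for s in hits)
    if survived:
        hits_on_survivors.update(hits)

total = sum(hits_on_survivors.values())
for section in sections:
    share = hits_on_survivors[section] / total
    print(f"{section:9s} share of hits on returning planes: {share:.1%}")

# The most lethal sections show the fewest hits among the survivors, precisely
# because planes hit there rarely made it back. Reinforcing the most-damaged
# areas of the returners would protect exactly the wrong places.
```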

Considering survivorship bias when presented with examples of success is difficult. It is not instinctive to pause, reflect, and think through what the base rate odds of success are and whether you’re looking at an outlier or the expected outcome. And yet if you don’t know the real odds, if you don’t know if what you’re looking at is an example of survivorship bias, then you’ve got a blind spot.

Whenever you read about a success story in the media, think of all the people who tried to do what that person did and failed. Of course, understanding survivorship bias isn’t an excuse for not taking action, but rather an essential tool to help you cut through the noise and understand the world. If you’re going to do something, do it fully informed.

To learn more, consider reading Fooled By Randomness, or The Art of Thinking Clearly.

Countering the Inside View and Making Better Decisions

When we don’t think about the process we use to make decisions, they tend to get worse over time as we fail to learn from experience. Often, we make decisions based on the information that is easiest to access. Let’s learn how to take the outside view and make better decisions.

You can reduce the number of mistakes you make by thinking about problems more clearly.

In his book Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin discusses how we can “fall victim to simplified mental routines that prevent us from coping with the complex realities inherent in important judgment calls.” One of those routines is the inside view, which we’re going to talk about in this article. But first, let’s get a bit of context.

No one wakes up thinking, “I am going to make bad decisions today.” Yet we all make them. What is particularly surprising is some of the biggest mistakes are made by people who are, by objective standards, very intelligent. Smart people make big, dumb, and consequential mistakes.

[…]

Mental flexibility, introspection, and the ability to properly calibrate evidence are at the core of rational thinking and are largely absent on IQ tests. Smart people make poor decisions because they have the same factory settings on their mental software as the rest of us, and that software isn’t designed to cope with many of today’s problems.

We don’t spend enough time thinking about the process or learning from experience. Generally, we’re pretty indifferent to the process by which we make decisions.

… typical decision makers allocate only 25 percent of their time to thinking about the problem properly and learning from experience. Most spend their time gathering information, which feels like progress and appears diligent to superiors. But information without context is falsely empowering.

That reminds me of what Daniel Kahneman wrote in Thinking, Fast and Slow:

A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.

Context comes from broad understanding: looking at the problem from the outside in, not the inside out. When we make a decision, we’re not really gathering and contextualizing information so much as trying to satisfy our existing intuition, the very thing a good decision process should help root out. Think about it this way: every time you make a decision, you’re saying you understand something. Most of us stop there. But understanding is not enough; you need to test that your understanding is correct, which comes through feedback and reflection. Then you need to update your understanding. This is the learning loop.

So why are we so quick to assume we understand?

Ego Induced Blindness

We tend to favor the inside view over the outside view.

An inside view considers a problem by focusing on the specific task and by using information that is close at hand, and makes predictions based on that narrow and unique set of inputs. These inputs may include anecdotal evidence and fallacious perceptions. This is the approach that most people use in building models of the future and is indeed common for all forms of planning.

[…]

The outside view asks if there are similar situations that can provide a statistical basis for making a decision. Rather than seeing a problem as unique, the outside view wants to know if others have faced comparable problems and, if so, what happened. The outside view is an unnatural way to think, precisely because it forces people to set aside all the cherished information they have gathered.

When your inside view is more positive than the outside view, you are effectively arguing against the base rate. You’re saying (knowingly or, more likely, unknowingly) that this time is different. Our brains are all too happy to help us construct this argument.

Mauboussin argues that we embrace the inside view for a few primary reasons. First, we’re optimistic by nature. Second is the “illusion of optimism” (we see our future as brighter than that of others). Finally, there is the illusion of control (we think that chance events are subject to our control).

One interesting point is that while we’re bad at looking at the outside view when it comes to ourselves, we’re better at it when it comes to other people.

In fact, the planning fallacy embodies a broader principle. When people are forced to look at similar situations and see the frequency of success, they tend to predict more accurately. If you want to know how something is going to turn out for you, look at how it turned out for others in the same situation. Daniel Gilbert, a psychologist at Harvard University, ponders why people don’t rely more on the outside view, “Given the impressive power of this simple technique, we should expect people to go out of their way to use it. But they don’t.” The reason is most people think of themselves as different, and better, than those around them.

So it’s mostly ego. I’m better than the people who tackled this problem before me. We see the differences between situations and use them as rationalizations for why things are different this time.

Consider this:

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

Incorporating the Outside View

In Think Twice, Mauboussin distills the work of Kahneman and Tversky into four steps and adds some commentary.

1. Select a Reference Class

Find a group of situations, or a reference class, that is broad enough to be statistically significant but narrow enough to be useful in analyzing the decision that you face. The task is generally as much art as science, and is certainly trickier for problems that few people have dealt with before. But for decisions that are common—even if they are not common for you— identifying a reference class is straightforward. Mind the details. Take the example of mergers and acquisitions. We know that the shareholders of acquiring companies lose money in most mergers and acquisitions. But a closer look at the data reveals that the market responds more favorably to cash deals and those done at small premiums than to deals financed with stock at large premiums. So companies can improve their chances of making money from an acquisition by knowing what deals tend to succeed.

2. Assess the distribution of outcomes.

Once you have a reference class, take a close look at the rate of success and failure. … Study the distribution and note the average outcome, the most common outcome, and extreme successes or failures.

[…]

Two other issues are worth mentioning. The statistical rate of success and failure must be reasonably stable over time for a reference class to be valid. If the properties of the system change, drawing inference from past data can be misleading. This is an important issue in personal finance, where advisers make asset allocation recommendations for their clients based on historical statistics. Because the statistical properties of markets shift over time, an investor can end up with the wrong mix of assets.

Also keep an eye out for systems where small perturbations can lead to large-scale change. Since cause and effect are difficult to pin down in these systems, drawing on past experiences is more difficult. Businesses driven by hit products, like movies or books, are good examples. Producers and publishers have a notoriously difficult time anticipating results, because success and failure is based largely on social influence, an inherently unpredictable phenomenon.

3. Make a prediction.

With the data from your reference class in hand, including an awareness of the distribution of outcomes, you are in a position to make a forecast. The idea is to estimate your chances of success and failure. For all the reasons that I’ve discussed, the chances are good that your prediction will be too optimistic.

Sometimes when you find the right reference class, you see the success rate is not very high. So to improve your chance of success, you have to do something different than everyone else.

4. Assess the reliability of your prediction and fine-tune.

How good we are at making decisions depends a great deal on what we are trying to predict. Weather forecasters, for instance, do a pretty good job of predicting what the temperature will be tomorrow. Book publishers, on the other hand, are poor at picking winners, with the exception of those books from a handful of best-selling authors. The worse the record of successful prediction is, the more you should adjust your prediction toward the mean (or other relevant statistical measure). When cause and effect is clear, you can have more confidence in your forecast.
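One way to make step 4 concrete is a simple shrinkage calculation. The function below is not a formula from Think Twice; it is just an illustrative weighted average between an inside-view estimate and the reference-class base rate, where the weight reflects how reliable predictions in that domain have historically been.

```python
def outside_view_forecast(inside_estimate: float,
                          reference_outcomes: list,
                          predictive_skill: float) -> float:
    """Shrink an inside-view estimate toward the reference-class mean.

    predictive_skill ranges from 0 (forecasts in this domain have a poor
    track record, e.g. picking hit books) to 1 (cause and effect are clear,
    e.g. tomorrow's temperature).
    """
    base_rate = sum(reference_outcomes) / len(reference_outcomes)
    return predictive_skill * inside_estimate + (1 - predictive_skill) * base_rate

# Hypothetical example: you judge your startup's five-year survival odds at 60%,
# but in a reference class of comparable startups only about 20% survived, and
# founders' forecasts in this domain have historically been unreliable.
forecast = outside_view_forecast(0.60, [1] * 20 + [0] * 80, predictive_skill=0.2)
print(f"Adjusted forecast: {forecast:.0%}")  # 28%, far closer to the base rate than to the gut feel
```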

***

The main lesson we can take from this is that we tend to focus on what’s different, whereas the best decisions often focus on just the opposite: what’s the same. However different a situation seems, it’s almost always the same underneath.

As Charlie Munger has said: “if you notice, the plots are very similar. The same plot comes back time after time.”

Particulars may vary but, unless those particulars are the variables that govern the outcome of the situation, the pattern remains. If you’re going to focus on what’s different rather than what’s the same, you’d best be sure the variables you’re clinging to actually matter.

Fooled By Randomness: My Notes

I loved Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Taleb. It is the first popular book he wrote, the one that helped propel him to intellectual celebrity. Interestingly, Fooled by Randomness contains semi-explored gems of the ideas that would later become the best-selling books The Black Swan and Antifragile.

Here are some of my notes from the book.

Hindsight Bias

Part of the argument that Fooled by Randomness presents is that when we look back at things that have happened we see them as less random than they actually were.

It is as if there were two planets: the one in which we actually live and the one, considerably more deterministic, on which people are convinced we live. It is as simple as that: Past events will always look less random than they were (it is called the hindsight bias). I would listen to someone’s discussion of his own past realizing that much of what he was saying was just backfit explanations concocted ex post by his deluded mind.

The Courage of Montaigne

Writing on Montaigne as the role model for the modern thinker, Taleb also addresses his courage:

It certainly takes bravery to remain skeptical; it takes inordinate courage to introspect, to confront oneself, to accept one’s limitations— scientists are seeing more and more evidence that we are specifically designed by mother nature to fool ourselves.

Probability

Fooled by Randomness is about probability, not in a mathematical way but as skepticism.

In this book probability is principally a branch of applied skepticism, not an engineering discipline. …

Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).

“Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem,” which is fascinating given how we tend to solve problems. In decisions under uncertainty, I discussed how risk and uncertainty are different things, which creates two types of ignorance.

Most decisions are not risk-based, they are uncertainty-based and you either know you are ignorant or you have no idea you are ignorant. There is a big distinction between the two. Trust me, you’d rather know you are ignorant.

Randomness Disguised as Non-Randomness

The core of the book is about luck that we understand as skill or “randomness disguised as non-randomness (that is determinism).”

This problem manifests itself most frequently in the lucky fool, “defined as a person who benefited from a disproportionate share of luck but attributes his success to some other, generally very precise, reason.”

Such confusion crops up in the most unexpected areas, even science, though not in such an accentuated and obvious manner as it does in the world of business. It is endemic in politics, as it can be encountered in the shape of a country’s president discoursing on the jobs that “he” created, “his” recovery, and “his predecessor’s” inflation.

These lucky fools are often fragilistas — they have no idea they are lucky fools. For example:

[W]e often have the mistaken impression that a strategy is an excellent strategy, or an entrepreneur a person endowed with “vision,” or a trader a talented trader, only to realize that 99.9% of their past performance is attributable to chance, and chance alone. Ask a profitable investor to explain the reasons for his success; he will offer some deep and convincing interpretation of the results. Frequently, these delusions are intentional and deserve to bear the name “charlatanism.”

This does not mean that all success is luck or randomness. There is a difference between “it is more random than we think” and “it is all random.”

Let me make it clear here: Of course chance favors the prepared! Hard work, showing up on time, wearing a clean (preferably white) shirt, using deodorant, and some such conventional things contribute to success— they are certainly necessary but may be insufficient as they do not cause success. The same applies to the conventional values of persistence, doggedness and perseverance: necessary, very necessary. One needs to go out and buy a lottery ticket in order to win. Does it mean that the work involved in the trip to the store caused the winning? Of course skills count, but they do count less in highly random environments than they do in dentistry.

No, I am not saying that what your grandmother told you about the value of work ethics is wrong! Furthermore, as most successes are caused by very few “windows of opportunity,” failing to grab one can be deadly for one’s career. Take your luck!

That last paragraph connects to something Charlie Munger once said: “Really good investment opportunities aren’t going to come along too often and won’t last too long, so you’ve got to be ready to act. Have a prepared mind.”

Taleb thinks of success in terms of degrees, so mild success might be explained by skill and labor, but outrageous success “is attributable to variance.”

Luck Makes You Fragile

One thing Taleb hits on that really stuck with me is that “that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.” How Antifragile.

Taleb argues this is the problem of induction, “it does not matter how frequently something succeeds if failure is too costly to bear.”

Noise and Signal

We are confused between noise and signal.

…the literary mind can be intentionally prone to the confusion between noise and meaning, that is, between a randomly constructed arrangement and a precisely intended message. However, this causes little harm; few claim that art is a tool of investigation of the Truth— rather than an attempt to escape it or make it more palatable. Symbolism is the child of our inability and unwillingness to accept randomness; we give meaning to all manner of shapes; we detect human figures in inkblots.

All my life I have suffered the conflict between my love of literature and poetry and my profound allergy to most teachers of literature and “critics.” The French thinker and poet Paul Valery was surprised to listen to a commentary of his poems that found meanings that had until then escaped him (of course, it was pointed out to him that these were intended by his subconscious).

If we’re concerned about situations where randomness is confused with non-randomness, should we also be concerned about situations where non-randomness is mistaken for randomness, which would result in the signal being ignored?

First, I am not overly worried about the existence of undetected patterns. We have been reading lengthy and complex messages in just about any manifestation of nature that presents jaggedness (such as the palm of a hand, the residues at the bottom of Turkish coffee cups, etc.). Armed with home supercomputers and chained processors, and helped by complexity and “chaos” theories, the scientists, semiscientists, and pseudoscientists will be able to find portents. Second, we need to take into account the costs of mistakes; in my opinion, mistaking the right column for the left one is not as costly as an error in the opposite direction. Even popular opinion warns that bad information is worse than no information at all.

If you haven’t yet, pick up a copy of Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. Don’t make the same mistake I did and wait to read this important book.

Just-in-Time Information Gathering

We’re becoming more like factories.

Just-in-time is a production strategy aimed at, among other things, reducing the need for excess inventory. Parts are supplied only when needed in the amount needed. While it makes a business more capital efficient, it also makes it more fragile.

We’ve adopted a similar strategy for information gathering. We’re so consumed by noise and busy work that the only time we really seek out signal is when we need it the most: right before we make a decision.

This creates a host of problems.

The worst time to look for information is when we need it to make a decision. When we do that, we’re more likely to see what’s unique and miss the historical context. We’re also more likely to be biased by what is available. And searching for information at the moment of need is a sign that we have no idea what we’re doing.

“Pattern recognition,” says Alice Schroeder, “creates an impulse always to connect new knowledge to old and to primarily be interested in new knowledge that genuinely builds on the old.” It helps knowledge snowball.

If we can’t connect the current situation to something we already understand, we might reason that it is not in our circle of competence and thus we shouldn’t be drawing conclusions. If we can, however, connect it to something we previously understood then we’re less likely to draw conclusions on the basis of “this time is different.”

There is merit to thinking about information gathering as a logistics problem.

This Time is Different: The Four Most Costly Words Ever Spoken

When we look at situations we’re always looking for what’s unique. We should, however, give more thought to similarities.

“This time is different” could be the four most costly words ever spoken. It’s not the words that are costly so much as the conclusions they encourage us to draw.

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

Imagine sitting in a meeting where people are about to make the same mistake they made last year on the same decision. Let’s say, for example, Jack has a history of messing up the annual tax returns. He’s a good guy. He’s nice to everyone. In fact, he buys everyone awesome Christmas presents. But the last three years—the period he’s been in charge of tax returns—have been nothing short of a disaster, causing more work for you and your department.

The assignment for the tax return comes up and Jack is once again nominated.

Before you have a chance to voice your concerns, one of your co-workers speaks up: “I know Jack has dropped the ball on this assignment in the past but I think this time is different. He’s been working hard to make sure he’s better organized.”

That’s all it takes. Conversation over — everyone is focused on what’s unique about this time and it’s unlikely, despite ample evidence, that you’ll be able to convince them otherwise.

In part, people want to believe in Jack because he’s a nice guy. In part, we’re all focused on why this time is different and we’ll ignore evidence to the contrary.

Focusing on what’s different makes it easy to forget historical context. We lose touch with the base rate. We only see the evidence that supports our view (confirmation bias). We become optimistic and overconfident.

History provides context. And what history shows us is that no matter how unique things are today there are a lot of similarities with the past.

Consider investors and the dotcom bubble. Collectively, people saw it as unprecedented and unique, a massive transformation unlike anything that had come before.

That reasoning, combined with a blindness to what was the same about this situation and previous ones, encouraged us to draw conclusions that proved costly. We reasoned that everything would change and everyone who owned internet companies would prosper and the old non-internet companies would quickly go into bankruptcy.

All of a sudden profits didn’t matter. Nor did revenue. They would come in time, we hoped. Market share mattered no matter how costly it was to acquire.

More than that, if you didn’t buy now you’d miss out. These companies would take over the world and you’d be left out.

We got so caught up in what was different that we forgot to see what was the same.

And there were historical parallels: automobiles, radio, television, and airplanes, to name a few. At the time, these innovations completely transformed the world as well. You can consider them the dotcoms of yesteryear.

And how did these massively transformational industries end up for investors?

At one time there were allegedly over 70 different auto manufacturing operations in the United States. Only three of them survived (and a few of those even required government funds).

If you catch yourself reasoning based on “this time is different” remember that you are probably speculating. While you may be right, odds are, this time is not different. You just haven’t looked for the similarities.

Avoiding Ignorance

This is a continuation of the discussion of the two types of ignorance.

You can’t deal with ignorance if you can’t recognize its presence. If you’re suffering from primary ignorance, it means you probably failed to consider the possibility of being ignorant or you found ways not to see that you were ignorant.

You’re ignorant and unaware, which is worse than being ignorant and aware.

The best way to avoid this, suggest Joy and Zeckhauser, is to raise self-awareness.

Ask yourself regularly: “Might I be in a state of consequential ignorance here?”

They continue:

If the answer is yes, the next step should be to estimate base rates. That should also be the next step if the starting point is recognized ignorance.

Of all situations such as this, how often has a particular outcome happened? Of course, this is often totally subjective.

and its underpinnings are elusive. It is hard to know what the sample of relevant past experiences has been, how to draw inferences from the experience of others, etc. Nevertheless, it is far better to proceed to an answer, however tenuous, than to simply miss (primary ignorance) or slight (recognized ignorance) the issue. Unfortunately, the assessment of base rates is challenging and substantial biases are likely to enter.

When we don’t recognize ignorance, the base rate is extremely underestimated. When we do recognize ignorance, we face “duelling biases; some will lead to underestimates of base rates and others to overestimates.”

Three biases come into play while estimating base rates: overconfidence, salience, and selection biases.

So we are overconfident in our estimates. We estimate things that are salient – that is, “states with which (we) have some experience or that are otherwise easily brought to mind.” And “there is a strong selection bias to recall or retell events that were surprising or of great consequence.”

Our key lesson is that as individuals proceed through life, they should always be on the lookout for ignorance. When they do recognize it, they should try to assess how likely they are to be surprised—in other words, attempt to compute the base rate. In discussing this assessment, we might also employ the term “catchall” from statistics, to cover the outcomes not specifically addressed.

It’s incredibly interesting to view literature through the lens of human decision making.

Crime and Punishment is particularly interesting as a study of primary ignorance. Raskolnikov deploys his impressive intelligence to plan the murder, believing, in his ignorance, that he has left nothing to chance. In a series of descriptions not for the squeamish or the faint-hearted, the murderer’s thoughts are laid bare as he plans the deed. We read about his skills in strategic inference and his powers of prediction about where and how he will corner his victim; his tactics at developing complementary skills (What is the precise manner in which he will carry the axe? What strategies will help him avoid detection?) are revealed.

But since Raskolnikov is making decisions under primary ignorance, his determined rationality is tightly “bounded.” He “construct[s] a simplified model of the real situation in order to deal with it; … behaves rationally with respect to this model, [but] such behavior is not even approximately optimal with respect to the real world” (Simon 1957). The second-guessing, fear, and delirium at the heart of Raskolnikov’s thinking as he struggles to gain a foothold in his inner world show the impact of a cascade of Consequential Amazing Developments (CADs), none predicted, none even contemplated. Raskolnikov anticipated an outcome in which he would dispatch the pawnbroker and slip quietly out of her apartment. He could not have possibly predicted that her sister would show up, a characteristic CAD that challenges what Taleb (2012) calls our “illusion of predictability.”

Joy and Zeckhauser argue we can draw two conclusions.

First, we tend to downplay the role of unanticipated events, preferring instead to expect simple causal relationships and linear developments. Second, when we do encounter a CAD, we often counter with knee-jerk, impulsive decisions, the equivalent of Raskolnikov committing a second impetuous murder.

References: Ignorance: Lessons from the Laboratory of Literature (Joy and Zeckhauser).
