
Avoiding Falling Victim to The Narrative Fallacy

The narrative fallacy leads us to see events as stories, with logical chains of cause and effect. Stories help us make sense of the world. However, if we’re not aware of the narrative fallacy, it can lead us to believe we understand the world better than we really do.

***

A typical biography starts by describing the subject’s young life, trying to show how the ultimate painting began as just a sketch. In Walter Isaacson’s biography of Steve Jobs, for example, Isaacson illustrates that Jobs’s success was determined to a great degree by the childhood influence of his father. Paul Jobs was a careful, detail-oriented engineer and craftsman who would finish the backs of fences and cabinets even when no one would see them; Jobs later found out that Paul was not his biological father. The combination of his adoption and his craftsman father planted the seeds of Steve’s adult personality: his penchant for design detail, his need to prove himself, his messianic zeal. The recent movie starring Michael Fassbender especially plays up the latter cause: Jobs’s feeling of abandonment drove his success. Fassbender’s emotional portrayal earned him an Oscar nomination.

Nassim Taleb describes a memorable experience of a similar type in his book The Black Swan.

In Rome, Taleb is having an animated discussion with a professor who has read his first book, Fooled by Randomness, parts of which promote the idea that our minds create more cause-and-effect links than reality would support. The professor proceeds to congratulate Taleb on his great luck in being born in Lebanon:

… had you grown up in a Protestant society where people are told that efforts are linked to rewards and individual responsibility is emphasized, you would never have seen the world in such a manner. You were able to see luck and separate cause-and-effect because of your Eastern Orthodox Mediterranean heritage.

These types of stories strike a deep chord: they give us affecting reasons on which to hang our understanding of reality. They help us make sense of our own lives. And, most importantly, they frequently cause us to believe we can predict the future. The problem is, most of them are a sham.

As flattered as he was by the professor’s praise, Nassim Taleb knew instantly that attributing his success to his background was a fallacy:

How do I know that this attribution to the background is bogus? I did my own empirical test by checking how many traders with my background who experienced the same war became skeptical empiricists, and found none out of twenty-six.

The professor who had just praised the idea that we overestimate our ability to understand cause-and-effect couldn’t stop himself from committing the very same error in conversation with Taleb himself.

Steve Jobs was similarly skeptical of the idea that his adoption had anything more than a coincidental effect on his success:

There’s some notion that because I was abandoned, I worked very hard so I could do well and make my parents wish they had me back, or some such nonsense, but that’s ridiculous […] Knowing I was adopted may have made me feel more independent, but I have never felt abandoned. I’ve always felt special. My parents made me feel special.

The Narrative Fallacy

Such is the power of the Narrative Fallacy — the backward-looking mental tripwire that causes us to attribute a linear and discernable cause-and-effect chain to our knowledge of the past. As Nassim points out, there is a deep biological basis to the problem: we are inundated with so much sensory information that our brains have no other choice; we must put things in order so we can process the world around us. It’s implicit in how we understand the world. When the coffee cup falls, we need to know why it fell. (We knocked it over.) If someone gets the job instead of us, we need to know why they were deemed better. (They had more experience, they were more likeable.) Without a deep search for reasons, we would go around with blinders on, one thing simply happening after another. The world does not make sense without cause-and-effect.

This necessary mental function serves us well, in general. But we also must come to terms with the types of situations where our broadly useful “ordering” function causes us to make errors.

What You Don’t See

We fall for narrative regularly and in a wide variety of contexts. Sports are the most obvious example: Try to recall the last time you watched a profile of a famous athlete — the rise from obscurity, the rags-to-riches story, the dream-turned-reality. How did ESPN portray the success of the athlete?

If it was like most profiles, you’d most likely see some combination of the following: Parents or a coach that pushed him/her to strive for excellence; a natural gift for the sport, or at the very least, a strong inborn athleticism; an impactful life event and/or some form of adversity; and a hard work ethic.

The copy might read as follows:

It was clear from a young age that Steven was destined for greatness. He was taller than his whole class, had skills that none of his peers had, and a mother who never let him indulge laziness or sloth. Losing his father at a young age pushed him to work harder than ever, knowing he’d have to support his family. And once he met someone willing to tutor him, someone like Central High basketball coach Ed Johnson, the future was all but assured — Steven was going to be an NBA player come hell or high water.

If you were to read this story about how a tall, strong, fast, skilled young man with good coaching and a hard work ethic came to dominate the NBA, would you stop for even a second to question that those were the total causes of his success? If you’re like most people, the answer is no. We hear about similar stories over and over again.

The problem is, these stories are subject to a deep narrative fallacy. Here’s why: think again about the supposed causes of Steven’s success — work ethic, great parents, strong coaching, a formative life event. How many young men in the United States alone have the exact same background and yet failed to achieve their dreams of NBA stardom? The question answers itself: there are probably thousands of them.

That’s the problem with narrative: it lures us into believing that we can explain the past through cause and effect when we hear a story that supports our prior beliefs. To take the case of our fictitious basketball player, we have been conditioned to believe that hard work, pushy parents and coaches, and natural gifts lead to fame and success.

To be clear, many of these factors do contribute in important ways. There aren’t many short, slow guys with no work ethic and bad hand-eye coordination playing in the NBA. And passion goes a long way toward deserved success. But if it’s true that there are a thousand times as many similarly qualified men who do not end up playing professional basketball, then our diagnosis must be massively incomplete. There is no other sensible explanation.

What might we be missing? The list is endless, but some combination of luck, opportunism, and timing must have played into Steven’s and any other NBA player’s success. It is very difficult to understand cause-and-effect chains, and this is a simple example compared to a complex case like a war or an economic crisis, which would have had multiple causes working in a variety of directions and enough red herrings to make a proper analysis difficult.

When it comes to understanding success in basketball, the problem might seem benign. (Although if you’re an NBA executive, maybe not so benign.) But the rest of us deal with it in a practical way all the time. Taleb points out in his book that you can see how narratives influence our decision making by giving a friend a great detective novel and, before they reach the final reveal, asking them to give the odds that each individual suspect is the culprit. Unless you let them write down the odds as they go along, the figures will almost certainly add up to more than 100%; the better the novel, the higher the total. This is nonsense from a probability standpoint, since the odds across all suspects must add to 100%, but each suspect’s narrative is so strong that we lose our bearings.
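Here is a minimal sketch of that incoherence, with invented suspects and odds. The numbers are made up purely for illustration; the point is only that story-driven estimates, elicited one suspect at a time, can exceed 100% and would have to be scaled down to be coherent.

```python
# A minimal sketch of the detective-novel point, with invented suspects and odds.
# Asked about each suspect in isolation, a reader caught up in each suspect's
# story might give numbers like these:
elicited_odds = {
    "the butler": 0.60,
    "the heiress": 0.55,
    "the gardener": 0.40,
    "the detective himself": 0.35,
}

total = sum(elicited_odds.values())
print(f"Sum of elicited odds: {total:.2f}")   # 1.90, far more than the 1.00 allowed

# Coherent probabilities over mutually exclusive suspects must sum to 1.00,
# so the story-driven numbers would have to be scaled down:
coherent = {suspect: odds / total for suspect, odds in elicited_odds.items()}
print({suspect: round(p, 2) for suspect, p in coherent.items()})
```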

Of course, the more salient the given reasons, the more likely we are to attribute them as causes. By salient, we mean evidence which is mentally vivid and seemingly relevant to the situation — as in the case of the young basketball prodigy losing his father, a common, Disney-made plot point. We cling to these highly available images and they cause us to make avoidable errors in judgment, as with those of us afraid to board an airplane because of the thought of a harrowing airplane crash, one that is statistically unlikely to occur in many thousands of lifetimes.

Less is More

Daniel Kahneman makes this point wonderfully in his book Thinking, Fast and Slow, in a chapter titled “Linda: Less is More”.

The best-known and most controversial of our experiments involved a fictitious lady called Linda. Amos and I made up the Linda problem to provide conclusive evidence of the role of heuristics in judgment and of their incompatibility with logic. This is how we described Linda:

Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.

[…]

In what we later described as an “increasingly desperate” attempt to eliminate the error, we introduced large groups of people to Linda and asked them this simple question:

Which alternative is more probable?
Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.

This stark version of the problem made Linda famous in some circles, and it earned us years of controversy. About 85% to 90% of undergraduates at several major universities chose the second option, contrary to logic.

This again demonstrates the power of narrative. We are so willing to take a description and mentally categorize the person it describes — in this case, “feminist bank teller” is much more salient than simply “bank teller” — that we will violate probabilities and logic in order to uphold our first conclusion. The extra description makes our mental picture much more vivid, and we conclude that the vivid picture must be the correct one. This error very likely contributes to our tendency to stereotype based on limited sample sizes, a remnant of our primate days that probably served us well in a very different environment.
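The rule being violated is the conjunction rule: two things being true together can never be more probable than either one alone. A minimal sketch, using entirely made-up counts, shows why the “feminist bank teller” answer cannot be right:

```python
# A minimal sketch of the conjunction rule, using made-up counts.
# Every "feminist bank teller" is also a "bank teller", so the joint category
# can never be larger (or more probable) than the single category.

population = 100_000          # hypothetical number of people like Linda's cohort
bank_tellers = 1_000          # hypothetical: how many are bank tellers
feminist_bank_tellers = 50    # hypothetical: the subset who are also feminists

p_teller = bank_tellers / population                          # P(bank teller)
p_teller_and_feminist = feminist_bank_tellers / population    # P(bank teller AND feminist)

assert p_teller_and_feminist <= p_teller   # the conjunction is never more probable
print(f"P(bank teller)              = {p_teller:.4f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.4f}")
```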

The point is made again by Kahneman as he explains how the corporate performance of companies included in famous business books like In Search of Excellence and Good to Great regressed to the mean after the books were written, a now well-known phenomenon:

You are probably tempted to think of causal explanations for these observations: perhaps the successful firms became complacent, the less successful firms tried harder. But this is the wrong way to think about what happened. The average gap must shrink, because the original gap was due in good part to luck, which contributed both to the success of the top firms and to the lagging performance of the rest. We have already encountered this statistical fact of life: regression to the mean.

Stories of how businesses rise and fall strike a chord with readers by offering what the human mind needs: a simple message of triumph and failure that identifies clear causes and ignores the determinative power of luck and the inevitability of regression. These stories induce and maintain an illusion of understanding, imparting lessons of enduring value to readers who are all too eager to believe them.
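Kahneman’s regression point lends itself to a quick simulation. The sketch below uses invented numbers and a deliberately crude “skill plus luck” model of corporate performance; the firms that look best in one period look more ordinary in the next, with no complacency story required.

```python
import random

# A minimal simulation of regression to the mean, with made-up numbers.
# Assume each firm's measured performance is persistent "skill" plus one-off "luck".
random.seed(0)

firms = [{"skill": random.gauss(0, 1)} for _ in range(10_000)]
for firm in firms:
    firm["period1"] = firm["skill"] + random.gauss(0, 1)   # skill + luck, period 1
    firm["period2"] = firm["skill"] + random.gauss(0, 1)   # skill + fresh luck, period 2

# Select the "great" firms: the top 5% of period-1 performers.
top = sorted(firms, key=lambda f: f["period1"], reverse=True)[: len(firms) // 20]

avg1 = sum(f["period1"] for f in top) / len(top)
avg2 = sum(f["period2"] for f in top) / len(top)
print(f"Top firms' average, period 1: {avg1:.2f}")
print(f"Top firms' average, period 2: {avg2:.2f}  (closer to the mean, no story needed)")
```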

What’s the harm, you say? Aren’t we just making our lives a little more interesting with these stories? Very true. Stories serve many wonderful functions: teaching, motivating, inspiring. The problem, though, is that we too often believe our stories are predictive. We make them more real than they are. The writers of the business case-study books certainly believed that the explanations of success they put forth would be predictive of future success (the title Built to Last certainly implies as much), yet a good many of the companies soon became shells of their former selves – Citigroup, Hewlett-Packard, Motorola, and Sony among them.

Is a good corporate culture helpful in generating success? Certainly! As it is with height and NBA success. But it’s far more difficult to determine cause and effect than simply recognizing the no-brainers. Just as many tall, talented, hard-working basketball players have failed to make it, many corporate cultures that met all of the Built to Last criteria have subsequently failed. The road to success was simply more complicated than the reductive narrative of the book would allow. Strategic choices, luck, circumstance, and the contributions of specific individual personalities may have all played a role. It’s hard to say. And unless we recognize the Narrative Fallacy for what it is, a simplified and often incorrect view of past causality, we carry an arrogance about our knowledge of the past and its usefulness in predicting the future.

Reason-Respecting Tendency

A close cousin of the narrative fallacy is what Charlie Munger refers to as Reason-Respecting Tendency in Poor Charlie’s Almanack. Here’s how Charlie describes the tendency:

There is in man, particularly one in an advanced culture, a natural love of accurate cognition and a joy in its exercise. This accounts for the widespread popularity of crossword puzzles, other puzzles, and bridge and chess columns, as well as all games requiring mental skill.

This tendency has an obvious implication. It makes man especially prone to learn well when a would-be teacher gives correct reasons for what is being taught, instead of simply laying out the desired belief ex-cathedra with no reasons given. Few practices, therefore, are wiser than not only thinking through reasons before giving orders but also communicating these reasons to the recipient of the order.

[…]

Unfortunately, Reason-Respecting Tendency is so strong that even a person’s giving of meaningless or incorrect reasons will increase compliance with his orders and requests. This has been demonstrated in psychology experiments wherein “compliance practitioners” successfully jump to the head of lines in front of copying machines by explaining their reason: “I have to make some copies.” This sort of unfortunate byproduct of Reason-Respecting Tendency is a conditioned reflex, based on a widespread appreciation of the importance of reasons. And, naturally, the practice of laying out various claptrap reasons is much used by commercial and cult “compliance practitioners” to help them get what they don’t deserve.

The deep structure of the mind is such that the things that point an arrow in the direction of Why (stories, reasons, causes) are the ones that stick most deeply. Our need to look for cause-and-effect chains in anything we encounter is simply an extension of our inbuilt pattern recognition software, which can deepen and broaden as we learn new things. It has been shown, for example, that a master chess player cannot remember the pieces on a randomly assembled chessboard any better than a complete novice can. But a master chess player can memorize the pieces on a board that represents an actual game in progress. If you take the pieces away, the master can replicate their positions with very high fidelity, whereas a novice cannot. The difference is that the chess player’s pattern recognition software has been developed to a high degree through deliberate practice — they have been in a thousand game situations just like it in the past. And while most of us may not be able to memorize chess games, we all have brains that perform the same function in other contexts.

Taleb hits on the same idea in The Black Swan:

Consider a collection of words glued together to constitute a 500-page book. If the words are purely random, picked up from the dictionary in a totally unpredictable way, you will not be able to summarize, transfer, or reduce the dimensions of that book without losing something significant from it. You need 100,000 words to carry the exact message of a random 100,000 words with you on your next trip to Siberia. Now consider the opposite: a book filled with the repetition of the following sentence: “The chairman of [insert here your company’s name] is a lucky fellow who happened to be in the right place at the right time and claims credit for the company’s success, without making a single allowance for luck,” running ten times per page for 500 pages. The entire book can be accurately compressed, as I have just done, into 34 words (out of 100,000); you could reproduce it with total fidelity out of such a kernel.
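Taleb’s compressibility contrast can be sketched directly in code. The snippet below uses invented text and a standard compression library; the exact byte counts will vary, and only the contrast between the two cases matters.

```python
import random
import string
import zlib

# A minimal sketch of the compressibility contrast Taleb describes, with made-up text.
random.seed(0)

# 100,000 "words" of random letters: little pattern for a compressor to exploit.
random_text = " ".join(
    "".join(random.choices(string.ascii_lowercase, k=7)) for _ in range(100_000)
)

# Text made of one repeated sentence: almost pure pattern.
repeated_text = (
    "The chairman is a lucky fellow who was in the right place at the right time. "
    * 100_000
)

for label, text in [("random words", random_text), ("repeated sentence", repeated_text)]:
    raw = text.encode()
    packed = zlib.compress(raw)
    print(f"{label}: {len(raw):,} bytes -> {len(packed):,} bytes after compression")
# The random text shrinks only modestly; the repeated sentence collapses to a tiny kernel.
```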

If we combine the ideas of Reason-Respecting Tendency and the mind’s deep craving for order, the interesting truth is that the best teaching, learning, and storytelling methods — those involving reasons and narrative, which let our brains store information more usefully and efficiently — are also the ones that cause us to make some of the worst mistakes. Our craving for order betrays us.

What Can We Do?

The first step, clearly, is to become aware of the problem. Once we understand our brain’s craving for narrative, we begin to see narratives every day, all the time, especially as we consume news. The key question we must ask ourselves is “Of the population of X subject to the same initial conditions, how many turned out similarly to Y? What hard-to-measure causes might have played a role?” This is what we did when we unraveled Steven’s narrative above. How many kids just like him, with the same stated conditions — tall, skilled, good parents, good coaches, etc. — achieved the same result? We don’t have to run an empirical test to understand that our narrative sense is providing some misleading cause-and-effect. Common sense tells us there are likely to be many more failures than successes in that pool, leading us to understand that there must have been other unrealized factors at play; luck being a crucial one. Some identified factors were necessary but not sufficient — height, talent, and coaching among them — and some factors might have been negligible or even negative. (Would it have helped or hurt Steven’s NBA chances if he had not lost his father? Impossible to say.)
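A back-of-the-envelope version of that question, using entirely hypothetical numbers, shows why the stated causes cannot be the whole explanation:

```python
# A back-of-the-envelope base-rate check with entirely hypothetical numbers.
# Of all the kids who share "Steven's" narrative ingredients, how many reach the NBA?

kids_with_the_same_story = 50_000   # hypothetical: tall, skilled, well-coached, hard-working
kids_who_reach_the_nba = 60         # hypothetical: how many of that pool make it

base_rate = kids_who_reach_the_nba / kids_with_the_same_story
print(f"P(NBA | same narrative ingredients) = {base_rate:.4%}")
# About 0.12% on these invented numbers: the stated causes cannot be the whole explanation.
```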

Modern scientific thought is built on just this sort of questioning to solve the cause-and-effect problem. A thousand years ago, much of what we thought we knew was based on naïve backward-looking causality. (Steve put leeches on his skin and then survived the plague = leeches cure the plague.) Only when we learned to take the notion of [leeches = cure for plague] and call it a hypothesis did we begin to understand the physical world. Only by downgrading our naïve assumptions to the status of hypotheses to be tested with rigorous experiment – give 100 plague victims leeches, leave another 100 leech-less, and tally the results – did we find a method to parse actual cause and effect.

And it is just as relevant to ask ourselves the inverse of the question posed above: “Of the population not subject to X, how many still ended up with the results of Y?” This is where we ask: which basketball players had intact families and easy childhoods, yet ended up in the NBA anyway? Which corporations lacked the traits described in Good to Great but achieved greatness anyway? When we are willing to ask both types of questions and try our best to answer them, we can start to see which elements are simply part of the story rather than causal contributors.

A second way we can circumvent narrative is to simply avoid or reinterpret the sources of information most subject to the bias. Turn the TV news off. Stop reading so many newspapers. Be skeptical of biographies, memoirs, and personal histories. Be careful of writers who are incredibly talented at painting a narrative but claim to be writing facts. (I love Malcolm Gladwell’s books, but he would be an excellent example here.) We learned above that narrative is so powerful it can overcome basic logic, so we must be rigorous about what kinds of information we allow to pass through our filters. Strong narrative flow is exactly why we enjoy a fictional story, but when we enter the non-fiction world of understanding and decision making, the power of narrative is not always on our side. We want to use narrative to our advantage — to teach ourselves or others useful concepts — but be wary of where it can mislead.

One way to assess how narrative affects your decision-making is to start keeping a journal of your decisions or predictions in any arena that you consider important. It’s important to note the why behind your prediction or decision. If you’re going to invest in your cousin’s new social media startup — sure to succeed — explain to yourself exactly why you think it will work. Be detailed. Whether the venture succeeds or fails, you will now have a forward-looking document to refer to later, so that when you have the benefit of hindsight, you can evaluate your original assumptions instead of finding convenient reasons to justify the success or failure. The more you’re able to do this exercise, the more you’ll come to understand how complicated cause-and-effect factors are when we look ahead rather than behind.

Munger also gives us a few prescriptions in Poor Charlie’s Almanack after describing the Availability Bias, another kissing cousin of the Narrative Fallacy and the Reason-Respecting Tendency. We are wise to take heed of them as we approach our fight with narrative:

The main antidote to miscues from the Availability-Misweighing Tendency often involves procedures, including use of checklists, which are almost always helpful.

Another antidote is to behave somewhat like Darwin did when he emphasized disconfirming evidence. What should be done is to especially emphasize factors that don’t produce reams of easily available numbers, instead of drifting mostly or entirely into considering factors that do produce such numbers. Still another antidote is to find and hire some skeptical, articulate people with far-reaching minds to act as advocates for notions that are opposite to the incumbent notions. [Ed: Or simply ask a smart friend to do the same.]

One consequence of this tendency is that extra-vivid evidence, being so memorable and thus more available in cognition, should often consciously be underweighed, while less vivid evidence should be overweighed.

Munger’s prescriptions are almost certainly as applicable to solving the narrative problem as the closely related Availability problem, especially regarding the issue of vivid evidence we discussed earlier.

The final prescription comes from Taleb himself, the progenitor of the idea of our problem with narrative: when searching for real truth, favor experimentation over storytelling (data over anecdote), experience over history (which can be cherry-picked), and clinical knowledge over grand theories. Figure out what you know and what’s a guess, and become humble about your understanding of the past.

Recognizing and respecting the power of our minds to invent and love stories can help us reduce our misunderstanding of the world.
