
The Lucretius Problem: How History Blinds Us

The Lucretius Problem is a mental defect through which we assume the worst-case event that has happened is the worst-case event that can happen. In doing so, we forget that the worst event on record was itself, when it occurred, worse than anything that had come before it. Only the fool believes all he can see is all there is to see.

***

It’s always good to re-read books and to dip back into them periodically. When reading a book for the first time, I often miss crucial information (especially in books that are hard to categorize with one descriptive sentence). When you come back to a book after reading hundreds of others, you can’t help but make new connections with the old book and see it anew. The book hasn’t changed, but you have.

It has been a while since I read Antifragile. In the past, I’ve talked about an Antifragile Way of Life, Learning to Love Volatility, the Definition of Antifragility, and the Noise and the Signal.

But upon re-reading Antifragile, I came across the Lucretius Problem and thought I’d share an excerpt. (Titus Lucretius Carus was a Roman poet and philosopher, best known for his poem On the Nature of Things.)

In Antifragile, Nassim Taleb writes:

Indeed, our bodies discover probabilities in a very sophisticated manner and assess risks much better than our intellects do. To take one example, risk management professionals look in the past for information on the so-called worst-case scenario and use it to estimate future risks – this method is called “stress testing.” They take the worst historical recession, the worst war, the worst historical move in interest rates, or the worst point in unemployment as an exact estimate for the worst future outcome. But they never notice the following inconsistency: this so-called worst-case event, when it happened, exceeded the worst [known] case at the time.

I have called this mental defect the Lucretius problem, after the Latin poetic philosopher who wrote that the fool believes that the tallest mountain in the world will be equal to the tallest one he has observed. We consider the biggest object of any kind that we have seen in our lives or hear about as the largest item that can possibly exist. And we have been doing this for millennia.

Taleb brings up an interesting point: our documented history can blind us. All we know is what we have been able to record, and there is an uncertainty beyond that record that we don’t seem to grasp.

We think that because we have sophisticated data-collection techniques, we can capture all the data necessary to make decisions. We use current statistical techniques to draw trends from historical data without acknowledging that past record-keepers had fewer tools, leaving a dark figure of unreported events. We also overestimate the validity of what has been recorded; the trends we draw might tell a different story if that unreported data were included.
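To make the inconsistency Taleb describes concrete, here is a minimal sketch in Python (the numbers and the choice of a Pareto distribution are mine for illustration, not from the book). When outcomes are heavy-tailed, the “worst case on record” keeps getting broken, so a stress test pinned to it is repeatedly proven too optimistic:

    import random

    random.seed(42)  # reproducible illustration

    # Simulate 1,000 yearly "shocks" from a heavy-tailed (Pareto) distribution.
    # Each time a shock exceeds everything seen before, the previous
    # "worst case on record" is revealed to have been an underestimate.
    worst_on_record = 0.0
    times_record_broken = 0
    for year in range(1000):
        shock = random.paretovariate(1.5)  # heavy tail: large surprises possible
        if shock > worst_on_record:
            times_record_broken += 1
            worst_on_record = shock  # last year's stress test just failed

    print(f"The 'worst case on record' was broken {times_record_broken} times.")
    print(f"Final record: {worst_on_record:.1f}")

However many times the record is broken, each new record-breaker, by definition, exceeds everything the historical data contained.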

Taleb continues:

The same can be seen in the Fukushima nuclear reactor, which experienced a catastrophic failure in 2011 when a tsunami struck. It had been built to withstand the worst past historical earthquake, with the builders not imagining much worse—and not thinking that the worst past event had to be a surprise, as it had no precedent. Likewise, the former chairman of the Federal Reserve, Fragilista Doctor Alan Greenspan, in his apology to Congress offered the classic “It never happened before.” Well, nature, unlike Fragilista Greenspan, prepares for what has not happened before, assuming worse harm is possible.

Dealing with Uncertainty

Taleb provides an answer: develop layers of redundancy, that is, a margin of safety, to act as a buffer against our own blindness. We overvalue what we have recorded and assume it bounds the worst and best possible outcomes. Redundant layers protect us from our tendency to treat what has been recorded as a map of the whole terrain. A rainy day fund is one such layer: it acts as insurance against something catastrophic, such as a job loss, and allows you to survive and fight another day.
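As a back-of-the-envelope illustration (hypothetical figures and a hypothetical margin_of_safety multiplier, not from the book), the difference between sizing a rainy day fund to the recorded past and sizing it with a redundant layer might look like this:

    # Hypothetical monthly expenses observed over the past year.
    observed = [2900, 3100, 3400, 3000, 4200, 3300,
                3150, 3600, 2950, 3050, 3500, 3250]

    worst_month = max(observed)  # the recorded worst case: 4200

    # Fund sized to the recorded past: six months at the worst observed rate.
    fund_from_history = 6 * worst_month

    # Redundant layer: assume the future worst can exceed the recorded worst.
    margin_of_safety = 1.5  # hypothetical multiplier
    fund_with_redundancy = 6 * margin_of_safety * worst_month

    print(f"Worst observed month:       ${worst_month:,}")
    print(f"Fund sized to history:      ${fund_from_history:,}")
    print(f"Fund with margin of safety: ${fund_with_redundancy:,.0f}")

The extra layer looks wasteful right up until the month that exceeds anything in the record.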

Antifragile is a great book to read if you want to learn something about yourself and the world.
