# Tag: Probability

## The Lucretius Problem: How History Blinds Us

The Lucretius Problem is a mental defect where we assume the worst-case event that has happened is the worst-case event that can happen. In so doing, we fail to understand that the worst event that has happened in the past surpassed the worst event that came before it. Only the fool believes all he can see is all there is to see.

***

It’s always good to re-read books and to dip back into them periodically. When reading a new book, I often miss out on crucial information (especially books that are hard to categorize with one descriptive sentence). When you come back to a book after reading hundreds of others, you can’t help but make new connections with the old book and see it anew. The book hasn’t changed, but you have.

It has been a while since I read Antifragile. In the past, I’ve talked about an Antifragile Way of Life, Learning to Love Volatility, the Definition of Antifragility, and the Noise and the Signal.

But upon re-reading Antifragile, I came across the Lucretius Problem, and I thought I’d share an excerpt. (Titus Lucretius Carus was a Roman poet and philosopher, best known for his poem On the Nature of Things).

In Antifragile, Nassim Taleb writes:

Indeed, our bodies discover probabilities in a very sophisticated manner and assess risks much better than our intellects do. To take one example, risk management professionals look in the past for information on the so-called worst-case scenario and use it to estimate future risks – this method is called “stress testing.” They take the worst historical recession, the worst war, the worst historical move in interest rates, or the worst point in unemployment as an exact estimate for the worst future outcome. But they never notice the following inconsistency: this so-called worst-case event, when it happened, exceeded the worst [known] case at the time.

I have called this mental defect the Lucretius problem, after the Latin poetic philosopher who wrote that the fool believes that the tallest mountain in the world will be equal to the tallest one he has observed. We consider the biggest object of any kind that we have seen in our lives or heard about as the largest item that can possibly exist. And we have been doing this for millennia.

Taleb brings up an interesting point, which is that our documented history can blind us. All we know is what we have been able to record. There is an uncertainty that we don’t seem to grasp.

We think that because we have sophisticated data-collection techniques, we can capture all the data necessary to make decisions. We use current statistical techniques to draw trends from historical data without acknowledging that past record-keepers had fewer tools to capture the dark figure of unreported data. We also overestimate the validity of what was recorded, so the trends we draw might tell a different story if we had that dark figure of unreported data.

Taleb continues:

The same can be seen in the Fukushima nuclear reactor, which experienced a catastrophic failure in 2011 when a tsunami struck. It had been built to withstand the worst past historical earthquake, with the builders not imagining much worse— and not thinking that the worst past event had to be a surprise, as it had no precedent. Likewise, the former chairman of the Federal Reserve, Fragilista Doctor Alan Greenspan, in his apology to Congress offered the classic “It never happened before.” Well, nature, unlike Fragilista Greenspan, prepares for what has not happened before, assuming worse harm is possible.

## Dealing with Uncertainty

Taleb provides an answer, which is to develop layers of redundancy, that is, a margin of safety, to act as a buffer against oneself. We overvalue what we have recorded and assume it tells us the worst and best possible outcomes. Redundant layers are a buffer against our tendency to think that what has been recorded is a map of the whole terrain. One example of a redundant layer is a rainy-day fund: an insurance policy against something catastrophic, such as a job loss, that allows you to survive and fight another day.

Antifragile is a great book to read, to learn something about yourself and the world.

## Fooled By Randomness: My Notes

I loved Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Taleb. This is the first popular book he wrote, the book that helped propel him to intellectual celebrity. Interestingly, Fooled by Randomness contains semi-explored gems of the ideas that would later become the best-selling books The Black Swan and Antifragile.

Here are some of my notes from the book.

## Hindsight Bias

Part of the argument that Fooled by Randomness presents is that when we look back at things that have happened we see them as less random than they actually were.

It is as if there were two planets: the one in which we actually live and the one, considerably more deterministic, on which people are convinced we live. It is as simple as that: Past events will always look less random than they were (it is called the hindsight bias). I would listen to someone’s discussion of his own past realizing that much of what he was saying was just backfit explanations concocted ex post by his deluded mind.

## The Courage of Montaigne

Writing on Montaigne as the role model for the modern thinker, Taleb also addresses his courage:

It certainly takes bravery to remain skeptical; it takes inordinate courage to introspect, to confront oneself, to accept one’s limitations— scientists are seeing more and more evidence that we are specifically designed by mother nature to fool ourselves.

## Probability

Fooled by Randomness is about probability, not in a mathematical way but as skepticism.

In this book probability is principally a branch of applied skepticism, not an engineering discipline. …

Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).

“Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem,” which is fascinating given how we tend to solve problems. In decisions under uncertainty, I discussed how risk and uncertainty are different things, which creates two types of ignorance.

Most decisions are not risk-based; they are uncertainty-based. You either know you are ignorant or you have no idea you are ignorant. There is a big distinction between the two. Trust me, you’d rather know you are ignorant.

## Randomness Disguised as Non-Randomness

The core of the book is about luck that we understand as skill or “randomness disguised as non-randomness (that is determinism).”

This problem manifests itself most frequently in the lucky fool, “defined as a person who benefited from a disproportionate share of luck but attributes his success to some other, generally very precise, reason.”

Such confusion crops up in the most unexpected areas, even science, though not in such an accentuated and obvious manner as it does in the world of business. It is endemic in politics, as it can be encountered in the shape of a country’s president discoursing on the jobs that “he” created, “his” recovery, and “his predecessor’s” inflation.

These lucky fools are often fragilistas — they have no idea they are lucky fools. For example:

[W]e often have the mistaken impression that a strategy is an excellent strategy, or an entrepreneur a person endowed with “vision,” or a trader a talented trader, only to realize that 99.9% of their past performance is attributable to chance, and chance alone. Ask a profitable investor to explain the reasons for his success; he will offer some deep and convincing interpretation of the results. Frequently, these delusions are intentional and deserve to bear the name “charlatanism.”

This does not mean that all success is luck or randomness. There is a difference between “it is more random than we think” and “it is all random.”

Let me make it clear here: Of course chance favors the prepared! Hard work, showing up on time, wearing a clean (preferably white) shirt, using deodorant, and some such conventional things contribute to success— they are certainly necessary but may be insufficient as they do not cause success. The same applies to the conventional values of persistence, doggedness and perseverance: necessary, very necessary. One needs to go out and buy a lottery ticket in order to win. Does it mean that the work involved in the trip to the store caused the winning? Of course skills count, but they do count less in highly random environments than they do in dentistry.

No, I am not saying that what your grandmother told you about the value of work ethics is wrong! Furthermore, as most successes are caused by very few “windows of opportunity,” failing to grab one can be deadly for one’s career. Take your luck!

That last paragraph connects to something Charlie Munger once said: “Really good investment opportunities aren’t going to come along too often and won’t last too long, so you’ve got to be ready to act. Have a prepared mind.”

Taleb thinks of success in terms of degrees: mild success might be explained by skill and labor, but outrageous success “is attributable to variance.”

## Luck Makes You Fragile

One thing Taleb hits on that really stuck with me is that “that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.” How Antifragile.

Taleb argues this is the problem of induction, “it does not matter how frequently something succeeds if failure is too costly to bear.”

## Noise and Signal

We are confused between noise and signal.

…the literary mind can be intentionally prone to the confusion between noise and meaning, that is, between a randomly constructed arrangement and a precisely intended message. However, this causes little harm; few claim that art is a tool of investigation of the Truth— rather than an attempt to escape it or make it more palatable. Symbolism is the child of our inability and unwillingness to accept randomness; we give meaning to all manner of shapes; we detect human figures in inkblots.

[…]

All my life I have suffered the conflict between my love of literature and poetry and my profound allergy to most teachers of literature and “critics.” The French thinker and poet Paul Valery was surprised to listen to a commentary of his poems that found meanings that had until then escaped him (of course, it was pointed out to him that these were intended by his subconscious).

If we’re concerned about situations where randomness is confused with non-randomness, should we also be concerned with situations where non-randomness is mistaken for randomness, which would result in the signal being ignored?

First, I am not overly worried about the existence of undetected patterns. We have been reading lengthy and complex messages in just about any manifestation of nature that presents jaggedness (such as the palm of a hand, the residues at the bottom of Turkish coffee cups, etc.). Armed with home supercomputers and chained processors, and helped by complexity and “chaos” theories, the scientists, semiscientists, and pseudoscientists will be able to find portents. Second, we need to take into account the costs of mistakes; in my opinion, mistaking the right column for the left one is not as costly as an error in the opposite direction. Even popular opinion warns that bad information is worse than no information at all.

If you haven’t yet, pick up a copy of Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. Don’t make the same mistake I did and wait to read this important book.

## Leonard Mlodinow: The Three Laws of Probability

“These three laws, simple as they are, form much of the basis of probability theory. Properly applied, they can give us much insight into the workings of nature and the everyday world.”

***

In his book, The Drunkard’s Walk, Leonard Mlodinow outlines the three key “laws” of probability.

The first law of probability is the most basic of all. But before we get to that, let’s look at this question.

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Which is more probable?

1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.

To Kahneman and Tversky’s surprise, 87 percent of the subjects in the study judged “Linda is a bank teller and is active in the feminist movement” to be more probable than “Linda is a bank teller.”

1. The probability that two events will both occur can never be greater than the probability that each will occur individually.

This is the conjunction fallacy.

Mlodinow explains:

Why not? Simple arithmetic: the chances that event A will occur = the chances that events A and B will occur + the chance that event A will occur and event B will not occur.

The interesting thing that Kahneman and Tversky discovered was that we don’t tend to make this mistake unless we know something about the subject.

“For example,” Mlodinow muses, “suppose Kahneman and Tversky had asked which of these statements seems most probable:”

1. Linda owns an International House of Pancakes franchise.
2. Linda had a sex-change operation and is now known as Larry.
3. Linda had a sex-change operation, is now known as Larry, and owns an International House of Pancakes franchise.

In this case, it’s unlikely you would choose the last option.

If the details we are given fit our mental picture of something, then the more details in a scenario, the more real it seems and hence the more probable we consider it to be—even though any act of adding less-than-certain details to a conjecture makes the conjecture less probable.

Or as Kahneman and Tversky put it, “A good story is often less probable than a less satisfactory…”
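Mlodinow’s “simple arithmetic” is easy to verify directly. A minimal sketch, with made-up numbers for the Linda case (the specific values are illustrative, not from the study):

```python
# Conjunction rule: P(A) = P(A and B) + P(A and not-B),
# so P(A and B) can never exceed P(A).
# Illustrative (made-up) numbers:
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(feminist | bank teller)

p_teller_and_feminist = p_teller * p_feminist_given_teller
p_teller_and_not_feminist = p_teller * (1 - p_feminist_given_teller)

# The two disjoint pieces add back up to P(A)...
assert abs(p_teller - (p_teller_and_feminist + p_teller_and_not_feminist)) < 1e-12
# ...so the conjunction is necessarily no more probable than the single event.
assert p_teller_and_feminist <= p_teller
```

Whatever numbers you plug in, the conjunction can only lose probability relative to the single event, which is exactly why adding vivid details makes a story feel more likely while making it less so.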

2. If two possible events, A and B, are independent, then the probability that both A and B will occur is equal to the product of their individual probabilities.

Suppose a married person has on average roughly a 1 in 50 chance of getting divorced each year. On the other hand, a police officer has about a 1 in 5,000 chance each year of being killed on the job. What are the chances that a married police officer will be divorced and killed in the same year? According to the above principle, if those events were independent, the chances would be roughly 1⁄50 × 1⁄5,000, which equals 1⁄250,000. Of course the events are not independent; they are linked: once you die, darn it, you can no longer get divorced. And so the chance of that much bad luck is actually a little less than 1 in 250,000.

Why multiply rather than add? Suppose you make a pack of trading cards out of the pictures of those 100 guys you’ve met so far through your Internet dating service, those men who in their Web site photos often look like Tom Cruise but in person more often resemble Danny DeVito. Suppose also that on the back of each card you list certain data about the men, such as honest (yes or no) and attractive (yes or no). Finally, suppose that 1 in 10 of the prospective soul mates rates a yes in each case. How many in your pack of 100 will pass the test on both counts? Let’s take honest as the first trait (we could equally well have taken attractive). Since 1 in 10 cards lists a yes under honest, 10 of the 100 cards will qualify. Of those 10, how many are attractive? Again, 1 in 10, so now you are left with 1 card. The first 1 in 10 cuts the possibilities down by 1⁄10, and so does the next 1 in 10, making the result 1 in 100. That’s why you multiply. And if you have more requirements than just honest and attractive, you have to keep multiplying, so . . . well, good luck.
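The trading-card count above follows the multiplication rule for independent events, P(A and B) = P(A) × P(B). A quick sketch reproducing both of Mlodinow’s numbers:

```python
# 100 cards; 1 in 10 rates "honest", and of those, 1 in 10 rates "attractive".
cards = 100
honest = cards // 10       # 10 cards survive the first cut
both = honest // 10        # 1 card survives both cuts
assert both == 1

# The divorced-and-killed example, treating the events as independent:
p_divorce = 1 / 50
p_killed = 1 / 5000
p_both = p_divorce * p_killed
assert abs(p_both - 1 / 250_000) < 1e-15   # roughly 1 in 250,000
```

Each extra independent requirement multiplies in another factor below 1, which is why the pool of candidates shrinks so fast.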

And there are situations where probabilities should be added. That’s the next law.

“These occur when we want to know the chances of either one event or another occurring, as opposed to the earlier situation, in which we wanted to know the chance of one event and another event happening.”

3. If an event can have a number of different and distinct possible outcomes, A, B, C, and so on, then the probability that either A or B will occur is equal to the sum of the individual probabilities of A and B, and the sum of the probabilities of all the possible outcomes (A, B, C, and so on) is 1 (that is, 100 percent).

When you want to know the chances that two independent events, A and B, will both occur, you multiply; if you want to know the chances that either of two mutually exclusive events, A or B, will occur, you add. Back to our airline: when should the gate attendant add the probabilities instead of multiplying them? Suppose she wants to know the chances that either both passengers or neither passenger will show up. In this case she should add the individual probabilities, which according to what we calculated above, would come to 55 percent.
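The airline calculation in the excerpt leans on numbers worked out earlier in Mlodinow’s book, which aren’t reproduced here. As a self-contained sketch with an assumed show-up probability (9/10 per passenger, chosen only for illustration), using exact rational arithmetic:

```python
from fractions import Fraction

# Assumed for illustration: each of two passengers independently
# shows up with probability 9/10.
p = Fraction(9, 10)

p_both = p * p                   # both show up        -> 81/100
p_neither = (1 - p) * (1 - p)    # neither shows up    -> 1/100
p_exactly_one = 2 * p * (1 - p)  # exactly one shows   -> 18/100

# Law 3: the mutually exclusive outcomes exhaust the possibilities.
assert p_both + p_neither + p_exactly_one == 1

# "Both or neither" are mutually exclusive, so their probabilities add.
print(p_both + p_neither)  # 41/50
```

Note the division of labor: independence licenses the multiplications that produce each outcome’s probability, and mutual exclusivity licenses the final addition.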

These three simple laws form the basis of probability. “Properly applied,” Mlodinow writes, “they can give us much insight into the workings of nature and the everyday world.” We use them all the time; we just don’t use them properly.

## Nassim Taleb on the Notion of Alternative Histories

We see what’s visible and available. Often this is nothing more than randomness and yet we wrap a narrative around it. The trader who is rich must know what he is doing. A good outcome means we made the right decisions, right? Not so quick. If we were wise we would not judge the quality of a decision on its outcome. There are alternative histories worth considering.

***

Writing in Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, Nassim Taleb hits on the notion of alternative histories.

Taleb argues that we should judge people by the costs of the alternative, that is if history played out in another way. These “substitute courses of events are called alternative histories.”

Taleb writes:

Clearly, the quality of a decision cannot be solely judged based on its outcome, but such a point seems to be voiced only by people who fail (those who succeed attribute their success to the quality of their decision).

[…]

One can illustrate the strange concept of alternative histories as follows. Imagine an eccentric (and bored) tycoon offering you \$ 10 million to play Russian roulette, i.e., to put a revolver containing one bullet in the six available chambers to your head and pull the trigger. Each realization would count as one history, for a total of six possible histories of equal probabilities. Five out of these six histories would lead to enrichment; one would lead to a statistic, that is, an obituary with an embarrassing (but certainly original) cause of death. The problem is that only one of the histories is observed in reality; and the winner of \$ 10 million would elicit the admiration and praise of some fatuous journalist (the very same ones who unconditionally admire the Forbes 500 billionaires). Like almost every executive I have encountered during an eighteen-year career on Wall Street (the role of such executives in my view being no more than a judge of results delivered in a random manner), the public observes the external signs of wealth without even having a glimpse at the source (we call such source the generator.) Consider the possibility that the Russian roulette winner would be used as a role model by his family, friends, and neighbors.

While the remaining five histories are not observable, the wise and thoughtful person could easily make a guess as to their attributes. It requires some thoughtfulness and personal courage. In addition, in time, if the roulette-betting fool keeps playing the game, the bad histories will tend to catch up with him. Thus, if a twenty-five-year-old played Russian roulette, say, once a year, there would be a very slim possibility of his surviving until his fiftieth birthday— but, if there are enough players, say thousands of twenty-five-year-old players, we can expect to see a handful of (extremely rich) survivors (and a very large cemetery).

[…]

The reader can see my unusual notion of alternative accounting: \$ 10 million earned through Russian roulette does not have the same value as \$ 10 million earned through the diligent and artful practice of dentistry. They are the same, can buy the same goods, except that one’s dependence on randomness is greater than the other.

Reality is different from roulette. In the example above, the result is unknown but the odds are known; most of life, by contrast, is dealing with uncertainty. Bullets are infrequent, “like a revolver that would have hundreds, even thousands, of chambers instead of six.” After a while you forget about the bullet. You can’t see the chamber, and we generally think of risk in terms of what is visible.

Interestingly, this is the core of the black swan, which is really the problem of induction: no amount of evidence can allow the inference that something is true, whereas one counterexample can refute a conclusion. This idea is also related to the “denigration of history,” where we think things that happen to others would not happen to us.
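Taleb’s “very slim possibility” is easy to quantify: each pull is survived with probability 5/6, and independent annual rounds compound multiplicatively. A quick check of the arithmetic (the 10,000-player count is my own illustrative number, standing in for Taleb’s “thousands”):

```python
p_survive_once = 5 / 6          # one bullet in six chambers
rounds = 25                     # once a year from age 25 to 50

# Independent survivals multiply:
p_survive_all = p_survive_once ** rounds
print(round(p_survive_all, 4))  # 0.0105 -- roughly 1 in 100

# With many players, a handful of (very rich) survivors is expected:
players = 10_000
print(round(players * p_survive_all))  # 105 expected survivors
```

The survivors are the only histories we observe; the roughly 9,900 unobserved alternative histories are the point of the thought experiment.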

## Richard Zeckhauser on Improving our Ability to Make Decisions

Richard Zeckhauser, aka Mr. Probability, was recently interviewed in Outlook Business.

Zeckhauser is a champion bridge player and the Frank P. Ramsey Professor of Political Economy at Harvard University.

When asked how companies can avoid overpaying for acquisitions, Zeckhauser responded:

There is this tremendous optimism bias built into acquisitions. Synergies in my experience are frequently overstated. If I were looking at a large merger, I would hire a team in my corporation to present arguments to the board as to why we should not do it. The idea is to have a countervailing team to poke holes in the logic. Organisations have this tremendous tendency to get behind the boss and do what he thinks should be done, but you have to get away from that and motivate people to bring to the table something contrary to what is being said.

That bit of wisdom applies to more than just corporate acquisitions.

The problem is that people often blindly follow the boss and what he or she thinks should be done. This is the HiPPO problem: the Highest Paid Person’s Opinion carries the day even when it is wrong. They are, after all, the authority figure.

Stanley Milgram demonstrated our obedience to authority through a series of experiments. Milgram summarized his most famous experiment in a 1974 article, The Perils of Obedience, writing:

I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist. Stark authority was pitted against the subjects’ [participants’] strongest moral imperatives against hurting others, and, with the subjects’ [participants’] ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.

Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority.

Zeckhauser was also asked how we can make better decisions:

One part of decision-making is about how to place your priorities. Let me tell you what I said to a group of investment professionals recently. They were making investments and were being introduced to five fund managers. I said, “You have \$50 million to invest and you have five potential managers; that does not mean you have to give \$10 million to each of these managers. If you really think that manager A is much better, you should probably give him 25 and the others much smaller amounts.” Then, you improve your odds.

Here’s another example out of what I see in everyday life. You get 50 e-mails during the day and you answer 30 of them. On the one that you answer the most, you take 3 minutes. In all the others, you take 45 seconds. You should take 25 minutes to answer the one that is important, but you don’t. Once that is pointed out to you, you will say that is really obvious. In other words, you should decide what is really important and make your choices accordingly.

The other thing is about distinguishing between various probabilities. I think of making decisions the way I play tennis. I have taken many tennis lessons and my trainer always tells me the same three or four things. Keep your eye on the ball, get into position, swing your racquet back and swing the ball. I pay him \$75 to tell me “keep your eye on the ball” and he tells me the same thing over and over again because the natural tendency when you are playing tennis is to take your eye off the ball. The natural tendency when you are thinking about probabilistic situations is to marginalise probabilities — treat 1%, 5%, 10% and 15% probabilities all as low probabilities. I think it is worth your while before you take a decision to figure out whether it is going to be 1%, 5%, 10% or 20%. And when it is worthwhile and when it is not. But most people don’t bother to do that.

I am writing a paper today where we start off talking about President Obama’s assessment of the likelihood that Osama bin Laden was in the hideout where we found him to be. He had a variety of assessments and he eventually concluded well it was 50% likely that we were going to go get him. Now, there is nothing magical about 50%. It might be that it is perfectly worthwhile to go and raid that compound if the probability is only 30%. And maybe it is not worthwhile even if it is 70%. Think about that. But people feel that 50% is magical and they don’t like to do things where they don’t have 50% odds. I know that is not a good idea, so I am willing to make some bets where you say it is 20% likely to work but you get a big pay-off if it works, and only has a small cost if it does not. I will take that gamble. Most successful investments in new companies are where the odds are against you but, if you succeed, you will succeed in a big way.

To improve your ability to make better decisions and think probabilistically Zeckhauser recommends reading Thinking, Fast and Slow. If you’re looking for something less mainstream but equally insightful try Max Bazerman’s Judgment in Managerial Decision Making, which has been a favorite of mine for years.

***

Still curious? Zeckhauser is the author of a fascinating paper: Investing in the Unknown and Unknowable.

## Erik Hollnagel: The Search For Causes

A great passage from Erik Hollnagel’s Barriers And Accident Prevention on the search for causes:

Whenever an accident happens there is a natural concern to find out in detail exactly what happened and to determine the causes of it. Indeed, whenever the result of an action or event falls significantly short of what was expected, or whenever something unexpected happens, people try to find an explanation for it. This trait of human nature is so strong that we try to find causes even when they do not exist, such as in the case of misleading or spurious correlations. For a number of reasons humans seem to be extremely reluctant to accept that something can happen by chance. One very good reason is that we have created a way of living that depends heavily on the use of technology, and that technological systems are built to function in a deterministic, hence reliable manner. If therefore something fails, we are fully justified in trying to find the reason for it. A second reason is that our whole understanding of the world is based on the assumption of specific relations between causes and effects, as amply illustrated by the Laws of Physics. (Even in quantum physics there are assumptions of more fundamental relations that are deterministic.) A third reason is that most humans find it very uncomfortable when they do not know what to expect, i.e., when things happen in an unpredictable manner. This creates a sense of being out of control, something that is never desirable since – from an evolutionary perspective – it means that the chances of survival are reduced.

This was described by Friedrich Nietzsche when he wrote:

[T]o trace something unknown back to something known is alleviating, soothing, gratifying and gives moreover a feeling of power. Danger, disquiet, anxiety attend the unknown – the first instinct is to eliminate these distressing states. First principle: any explanation is better than none … The cause-creating drive is thus conditioned and excited by the feeling of fear.

Hollnagel continues:

A well-known example of this is provided by the phenomenon called the gambler’s fallacy. The name refers to the fact that gamblers often seem to believe that a long row of events of one type increases the probability of the complementary event. Thus if a series of ‘red’ events occur on a roulette wheel, the gambler’s fallacy leads people to believe that the probability of ‘black’ increases. … Rather than accepting that the underlying mechanism may be random, people invent all kinds of explanations to reduce the uncertainty of future events.
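The fallacy is straightforward to test empirically. A small simulation, assuming an idealized two-color wheel with no zero pockets (so red and black are equally likely on every spin):

```python
import random

random.seed(7)

# Simulate fair red/black spins and measure how often 'black' follows
# a run of five consecutive 'red' results.
n = 200_000
spins = [random.choice("RB") for _ in range(n)]

after_red_streak = [
    spins[i + 5]
    for i in range(n - 5)
    if spins[i:i + 5] == ["R"] * 5
]

# The conditional frequency stays near 1/2: the streak carries no signal.
freq = after_red_streak.count("B") / len(after_red_streak)
print(round(freq, 2))
```

However long the run of reds, the conditional frequency of black hovers around one half, which is exactly what independence means and exactly what the gambler’s fallacy denies.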