Category: Decision Making

The Biological Bases of Human Resilience

In Stronger: Develop the Resilience You Need to Succeed, there is a fascinating section on the Biological Bases of Human Resilience.

It turns out the key to developing resilience at the biological level is to interpret experience in a way that increases performance and facilitates homeostasis. Attitudes can be changed and resilience developed through training. Optimism is your friend.

In 1975, neurologist Paul MacLean coined the term triune brain to describe the brain’s three functional levels: The neocortex is the most sophisticated component of the human brain, representing its highest functioning level. Not only does the neocortex interpret sensory signals and communications and provide gross proprioceptive-based control of motor (musculoskeletal) behaviors, but part of it—the ventromedial prefrontal cortex (vmPFC)—presides over imagination, logic, decision making, problem solving, planning, apprehension, and, most important, the interpretation of experience.

It is the vmPFC that labels an experience (real or imagined) as threatening, punishing, or rewarding. It finds solutions to problems, it sees the opportunity in danger, and it sees the glass as half full rather than half empty. Based on the nature of the interpretation of experience, the vmPFC then activates the second level of the triune brain: the limbic system.

The limbic system is relevant in any discussion of stress and resilience because of its role as the human brain’s emotional control center. The limbic system is believed to be just that, a system, consisting of numerous highly connected neural structures, for example, the hypothalamus, hippocampus, septum, cingulate gyrus, and the amygdala. The amygdala is the primary anatomic center for fear, anger, trauma, and aggression. It’s also the center of the fight-or-flight response, a term coined by physiologist Walter Cannon in 1915. The amygdala serves as the primary survival mechanism in the human body. Thus, it’s a key anatomic component in the biology of resilience.

The brain stem and spinal cord represent the lowest level of the triune brain. The major functions of this level are the maintenance of so-called vegetative roles such as heartbeat, respiration, vasomotor activity, and the conduction of impulses to many higher levels of the brain. The spinal cord represents the central pathway for neurons as they conduct signals to and from the brain. The brain stem is the basic engine that drives the machinery of the human body.

Human resilience represents a most elegant and ongoing dance between the vmPFC and the amygdala. When faced with danger, the vmPFC activates the amygdala so as to prepare you to fight, flee, or otherwise resolve the threat. Highly resilient people appear to be able to effectively regulate the amygdala so as to benefit from its activation but then allow it to quickly recover its baseline activity. This process of recovery to a steady state is what Cannon called “the reestablishment of homeostasis.”

Consequently, the bodies of resilient people are supercharged with moderate increases in hormones and neurotransmitters such as adrenaline, noradrenaline, gamma-aminobutyric acid, neuropeptide Y, and cortisol, which allow you to do “superhuman” things for short periods of time. When these hormones surge, your strength and perception increase, your memory improves, your eyesight may get better, your tolerance for pain increases, and you react to stimuli faster. In other words, you’re better prepared to meet any challenge successfully.

The person who is not resilient experiences homeostatic failure, during which the vmPFC interpretations either overstimulate or understimulate the limbic system. The result of overstimulation can be anxiety, panic attacks, confusion, reduced problem-solving capacity, irritability, anger, even violence (for example, road rage, airline rage), and seizures. The result of understimulation may be hopelessness, depression, resentment, and a lack of motivation. With highly frequent or chronic overstimulation, the amygdala can develop a state of chronic hypersensitivity at the cellular level. Amygdaloid nerve cells literally become highly irritable and will over-respond to experiences that would not otherwise have caused excitation. It’s like having 10 cups of coffee.

Keeping Things Simple and Tuning out Folly

Keeping things simple makes a huge difference and yet we are drawn to the sexiness of complexity. Einstein was a master of sifting the essential from the non-essential.

And consider this from Charlie Munger: The Complete Investor:

Peter Bevelin’s book Seeking Wisdom: From Darwin to Munger has a section on the importance of simplicity.

Bevelin advised: “Turn complicated problems into simple ones. Break down a problem into its components, but look at the problem holistically.” Keeping things as simple as possible, but no more so, is a constant theme in Munger’s public statements. In a joint letter to shareholders, Munger and Buffett once wrote: “Simplicity has a way of improving performance through enabling us to better understand what we are doing.”

[…]

By focusing on finding decisions and bets that are easy, avoiding what is hard, and stripping away anything that is extraneous, Munger believes that an investor can make better decisions. By “tuning out folly” and swatting away unimportant things “so your mind isn’t cluttered with them … you’re better able to pick up a few sensible things to do,” said Munger. Focus enables both simplicity and clarity of thought, which in Munger’s view leads to a more positive investing result.

“If something is too hard, we move on to something else. What could be simpler than that?”

— Charlie Munger

There is a compelling advantage in life to be found in exploiting unrecognized simplicities, something Peter Thiel tries to tease out in interviews. Essential to recognizing simplicity is scheduling time to think.

“We have three baskets: in, out, and too tough… We have to have a special insight, or we’ll put it in the too tough basket.”

— Charlie Munger

Simplicity is Filtering

William James said: “The art of being wise is the art of knowing what to overlook.” Truer words have never been spoken.

In Arthur Conan Doyle’s The Reigate Puzzle, Sherlock Holmes says: “It is of the highest importance in the art of detection to be able to recognize, out of a number of facts, which are incidental and which vital.”

And part of filtering is understanding what you know and what you don’t know, that is, understanding your circle of competence.

“We have a passion for keeping things simple.”

— Charlie Munger

In an interview with Jason Zweig, Munger said:

Confucius said that real knowledge is knowing the extent of one’s ignorance. Aristotle and Socrates said the same thing. Is it a skill that can be taught or learned? It probably can, if you have enough of a stake riding on the outcome. Some people are extraordinarily good at knowing the limits of their knowledge, because they have to be. Think of somebody who’s been a professional tightrope walker for 20 years—and has survived. He couldn’t survive as a tightrope walker for 20 years unless he knows exactly what he knows and what he doesn’t know. He’s worked so hard at it, because he knows if he gets it wrong he won’t survive. The survivors know.

Another time he offered:

Part of that [having uncommon sense], I think, is being able to tune out folly, as distinguished from recognizing wisdom. You’ve got whole categories of things you just bat away so your brain isn’t cluttered with them. That way, you’re better able to pick up a few sensible things to do.

Warren Buffett, the CEO of Berkshire Hathaway, agrees:

Yeah, we don’t consider many stupid things. I mean, we get rid of ’em fast. Just getting rid of the nonsense — just figuring out that if people call you and say, “I’ve got this great, wonderful idea”, you don’t spend 10 minutes once you know in the first sentence that it isn’t a great, wonderful idea… Don’t be polite and go through the whole process.

And Peter Bevelin, writing in Seeking Wisdom, offers:

Often we try to get too much information, including misinformation, or information of no use to explain or predict. We also focus on details and what’s irrelevant or unknowable and overlook the obvious truths. Dealing with what’s important forces us to prioritize. There are often just a few actions that produce most of what we are trying to achieve. There are only a few decisions of real importance.

More information doesn’t equal more knowledge or better decisions. And remember that today we not only have access to more information, but also misinformation.

And the harder we work at something, the more confident we become.

It’s worth pausing to reflect on three things at this point: 1) understanding and seeking simplicity; 2) dealing with the easy problems first; and 3) honing your skills by learning what to overlook and getting rid of bad ideas quickly (how many organizations do that!?)… this goes hand in hand with understanding your circle of competence.

Focusing Illusions


My favorite chapter in the book Rapt: Attention and the Focused Life by Winifred Gallagher is called ‘Decisions: Focusing Illusions.’ It’s a really great summary of how focusing on the wrong things affects the weights we use to make decisions. There is a lot of great content packed into this chapter but I’ll attempt to highlight a few points.

***
Bounded Rationality

According to the principle of ‘bounded rationality,’ which (Daniel) Kahneman first applied to economic decisions and more recently to choices concerning quality of life, we are reasonable-enough beings but sometimes liable to focus on the wrong things. Our thinking gets befuddled not so much by our emotions as by our ‘cognitive illusions,’ or mistaken intuitions, and other flawed, fragmented mental constructs.

***
Loss/Risk Aversion

If you’re pondering a choice that involves risk, you might focus too much on the threat of possible loss, thereby obscuring an even likelier potential benefit. Where this common scenario is concerned, research shows that we aren’t so much risk-averse as loss-averse, in that we’re generally much more sensitive to what we might have to give up than to what we might gain.
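
Gallagher is describing what Kahneman and Tversky called loss aversion. One common way to formalize the asymmetry (an illustration on my part, not something from Rapt) is a prospect-theory-style value function in which a loss is weighted roughly twice as heavily as a gain of the same size. A minimal Python sketch, using the commonly cited parameter estimates:

# Illustrative prospect-theory-style value function. The parameters are the
# commonly cited Kahneman-Tversky estimates, assumed here for illustration.
def subjective_value(outcome, curvature=0.88, loss_aversion=2.25):
    if outcome >= 0:
        return outcome ** curvature
    return -loss_aversion * ((-outcome) ** curvature)

print(round(subjective_value(100), 1))   # a $100 gain "feels" like about +57.5
print(round(subjective_value(-100), 1))  # a $100 loss "feels" like about -129.4

On these assumptions, the prospect of losing $100 carries more than twice the weight of the prospect of gaining $100, which is why a choice framed around a possible loss can crowd out an even likelier benefit.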

***
The Focusing Illusion

The key to understanding why you pay more attention to your thoughts about living than to life itself is neatly summed up by what Kahneman proudly calls his ‘fortune cookie maxim’ (a.k.a. the focusing illusion): ‘Nothing in life is as important as you think it is while you are thinking about it.’ Why? ‘Because you’re thinking about it!’

In one much-cited illustration of the focusing illusion, Kahneman asked some people if they would be happier if they lived in California. Because the climate is often delightful there, most subjects thought so. For the same reason, even Californians assume they’re happier than people who live elsewhere. When Kahneman actually measured their well-being, however, Michiganders and others were just as contented as Californians. The reason is that 99 percent of the stuff of life – relationships, work, home, recreation – is the same no matter where you are, and once you settle in a place, no matter how salubrious, you don’t think about its climate very much. If you’re prompted to evaluate it, however, the weather immediately looms large, simply because you’re paying attention to it. This illusion inclines you to accentuate the difference between Place A and Place B, making it seem to matter much more than it really does, which is marginal.

To test the fortune cookie rule, you have only to ask yourself how happy you are. The question automatically summons your remembering self, which will focus on any recent change in your life – marriage or divorce, new job or home. You’ll then think about this novel event, which in turn will increase its import and influence your answer. If you’re pleased that you’ve just left the suburbs for the city, say, you’ll decide that life is pretty good. If you regret the move, you’ll be dissatisfied in general. Fifteen years on, however, the change that looms so large now will pale next to a more recent event – a career change, perhaps, or becoming a grandparent – which will draw your focus and, simply because you’re thinking about it, bias your evaluation of your general well-being.

***
The Effects of Adaptation

Like focusing too much on the opinions of your remembering self, overlooking the effects of adaptation – the process of becoming used to a situation – can obstruct wise decisions about how to live. As Kahneman says, ‘when planning for the future, we don’t consider that we will stop paying attention to a thing.’

The tendency to stop focusing on a particular event or experience over time, no matter how wonderful or awful, helps explain why the differences in well-being between groups of people in very different circumstances tend to be surprisingly small – sometimes astoundingly so. The classic examples are paraplegics and lottery winners, who respectively aren’t nearly as miserable or happy as you’d think. ‘That’s where attention comes in,’ says Kahneman. ‘People think that if they win the lottery, they’ll be happy forever. Of course, they will not. For a while, they are happy because of the novelty, and because they think about winning all the time. Then they adapt and stop paying attention to it.’ Similarly, he says, ‘Everyone is surprised by how happy paraplegics can be, but they are not paraplegic full-time. They do other things. They enjoy their meals, their friends, the newspaper. It has to do with the allocation of attention.’

Like couples who’ve just fallen in love, professionals starting a career, or children who go to camp for the first time, paraplegics and lottery winners initially pay a lot of attention to their new situation. Then, like everybody else, they get used to it and shift their focus to the next big thing. Their seemingly blasé attitude surprises us, because when we imagine ourselves in their place, we focus on how we’d feel at the moment of becoming paralyzed or wildly rich, when such an event utterly monopolizes one’s focus. We forget that we, too, would get used to wealth, a wheelchair, and most other things under the sun, then turn our attention elsewhere.

***
Good Enough

Finally, don’t worry if the choice you made wasn’t the absolute best, as long as it meets your needs. Offering the single most important lesson from his research, Schwartz says, ‘Good enough is almost always good enough. If you have that attitude, many problems about decisions and much paralysis melt away.’

Charlie Munger and the Pursuit of Worldly Wisdom

Charlie Munger, the billionaire business partner of Warren Buffett and a major inspiration behind this site, is not only one of the best investors the world has witnessed, but he’s also one of the best thinkers. A quick recap is in order.

Munger has illuminated timeless wisdom on topics such as the mind’s search algorithm, academic economics, stretch goals, mental models, the value of thinking backward and forward, problem-solving, partnerships, and inversion.

“What is elementary, worldly wisdom? Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ‘em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.”

— Charlie Munger

He’s even offered two sets of book recommendations.

In his book Charlie Munger: The Complete Investor, which has become my new go-to recommendation for people interested in an introduction to Munger’s thinking, Tren Griffin lays out Munger’s path to worldly wisdom.

Munger has adopted an approach to business and life that he refers to as worldly wisdom. Munger believes that by using a range of different models from many different disciplines—psychology, history, mathematics, physics, philosophy, biology, and so on—a person can use the combined output of the synthesis to produce something that has more value than the sum of its parts. Robert Hagstrom wrote a wonderful book on worldly wisdom entitled Investing: The Last Liberal Art, in which he states that “each discipline entwines with, and in the process strengthens, every other. From each discipline the thoughtful person draws significant mental models, the key ideas that combine to produce a cohesive understanding. Those who cultivate this broad view are well on their way to achieving worldly wisdom.”

It is clear that Munger loves to learn. He actually has fun when he is learning, and that makes the worldly wisdom investing process enjoyable for him. This is important because many people do not find investing enjoyable, especially when compared to gambling, which science has shown can generate pleasure via chemicals (e.g., dopamine) even though it is an activity with a negative net present value. What Munger has done is create a system—worldly wisdom—that allows him to generate the same chemical rewards in an activity that has a positive net present value. When you learn something new, your brain gives itself a chemical reward, which motivates you to do the work necessary to be a successful investor. If you do this work and adopt a worldly wisdom mindset, Munger believes you will create an investing edge over other investors.

Munger used a latticework of mental models in developing his approach. Herbert Simon, in his autobiography, Models of My Life, captured the idea of a mental model when he said:

A large part of the difference between the experienced decision maker and the novice in these situations is not any particular intangible like “judgment” or “intuition.” If one could open the lid, so to speak, and see what was in the head of the experienced decision maker, one would find that he had at his disposal repertoires of possible actions; that he had checklists of things to think about before he acted; and that he had mechanisms in his mind to evoke these, and bring these to his conscious attention when the situations for decisions arose.

“You’ve got to have models in your head. And you’ve got to array your experience—both vicarious and direct—on this latticework of models.”

— Charlie Munger

Latticework

Munger carefully chose the latticework model to convey the idea that things are interconnected. We need more than a deep understanding of one segment; we need a working knowledge of all of them and how they interact and link. This is conveyed by the Japanese proverb “The frog in a well knows nothing of the mighty ocean.”

In Charlie Munger: The Complete Investor, Griffin continues:

Understanding the worldly wisdom methodology is made easier if you see it applied in an example. To illustrate the method, Munger gave the example of a business that raises the price of its product and yet sells more of that product. This would appear to violate the rule of supply and demand as taught in economics. However, if one thinks about the discipline of psychology, one might conclude that the product is a Giffen good, which people desire more of at higher prices. Or one could conclude that low prices signal poor quality to buyers and that raising prices will result in more sales. Alternatively, you can look for bias caused by incentives and discover that what has actually happened in his example is that the seller has bribed the purchasing agents of the purchasers.

Munger described a situation in which this actually happens:

Suppose you’re the manager of a mutual fund, and you want to sell more. People commonly come to the following answer: You raise the commissions which, of course, reduces the number of units of real investments delivered to the ultimate buyer, so you’re increasing the price per unit of real investment that you’re selling the ultimate customer. And you’re using that extra commission to bribe the customer’s purchasing agent. You’re bribing the broker to betray his client and put the client’s money into the high-commission product.
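
The arithmetic behind this example is worth making concrete. Here is a small, purely hypothetical sketch (the purchase size and commission rates are invented for illustration, not Munger’s figures):

# Hypothetical illustration of Munger's point: a higher sales commission means
# fewer dollars of real investment delivered to the buyer, so the price per
# dollar of real investment rises.
def price_per_real_dollar(gross_purchase, commission_rate):
    real_investment = gross_purchase * (1 - commission_rate)
    return gross_purchase / real_investment

print(round(price_per_real_dollar(10_000, 0.05), 3))  # ~1.053 at a 5% commission
print(round(price_per_real_dollar(10_000, 0.08), 3))  # ~1.087 at an 8% commission

The customer pays the same $10,000 either way, but at the higher commission less of it ends up invested, and the extra margin is what funds the incentive to push the product.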

While we can’t know everything, we can know the big ideas from multiple disciplines. We don’t want to be the frog. This is one way we can add value in the decision-making process and it allows for the effective use of what I call the Munger two-step, which is something we talk about at Re:Think Decision Making.

“Simply put,” Griffin writes, “Munger believes that people who think very broadly and understand many different models from many different disciplines make better decisions.”

In a 2003 speech at the University of California, Santa Barbara, entitled Academic Economics: Strengths and Faults After Considering Interdisciplinary Needs, Munger said:

You’ve got a complex system and it spews out a lot of wonderful numbers that enable you to measure some factors. But there are other factors that are terribly important, [yet] there’s no precise numbering you can put to these factors. You know they’re important, but you don’t have the numbers. Well, practically (1) everybody overweighs the stuff that can be numbered, because it yields to the statistical techniques they’re taught in academia, and (2) doesn’t mix in the hard-to-measure stuff that may be more important. That is a mistake I’ve tried all my life to avoid, and I have no regrets for having done that.

Worldly Wise

Griffin continues:

In Munger’s view, it is better to be worldly wise than to spend lots of time working with a single model that is precisely wrong.

A multiple-model approach that is only approximately right will produce a far better outcome in anything that involves people or a social system. While making the case for a lattice of mental models approach (described here shortly), Robert Hagstrom pointed out that Munger is providing support for those who advocate for a wide-ranging liberal arts education.

Munger would be what the ancient Greek poet Archilochus called a fox: “The fox knows many things; the hedgehog one great thing.”

“The theory of modern education is that you need a general education before you specialize. And I think to some extent, before you’re going to be a great stock picker, you need some general education.”

— Charlie Munger

Commenting on Munger, former Microsoft CEO Bill Gates said, “(he) is truly the broadest thinker I have ever encountered.” Buffett added that Munger has “the best 30-second mind in the world. He goes from A to Z in one move. He sees the essence of everything before you even finish the sentence.”

How does Munger do this? That’s a question worth slowing down and thinking about.

“You have to realize the truth of biologist Julian Huxley’s idea that ‘Life is just one damn relatedness after another.’ So you must have the models, and you must see the relatedness and the effects from the relatedness.”

— Charlie Munger

Where do we get these ideas? We let history be our guide. If one way to ensure you make poor decisions is to use a small sample size, we can reason that we should seek out the biggest sample sizes we can.

What crosses most of history? Biology, Chemistry, Physics. And of course, throw in some Psychology so we can better understand how we are led astray.

Thinking

Griffin continues:

Munger’s breadth of knowledge is something that is naturally part of his character but also something that he intentionally cultivates. In his view, to know nothing about an important subject is to invite problems. Both Munger and Buffett set aside plenty of time each day to just think. Anyone reading the news is provided with constant reminders of the consequences of not thinking. Thinking is a surprisingly underrated activity. Researchers published a study in 2014 that revealed that approximately a quarter of women and two-thirds of men chose electric shocks over spending time alone with their own thoughts.

If you can’t be alone with your thoughts, you don’t deserve them. We try to run away from the pain of thinking. Instead, we turn to the instant gratification of Netflix.

Munger’s speeches and essays are filled with the thoughts of great people from the past and present from many different domains. Munger is also careful to set aside a lot of time in his schedule for reading. To say he loves books is an understatement. Buffett has said that Munger has read hundreds of biographies, as just one example. He is very purposeful in his approach to worldly wisdom, preferring not to fill his calendar with appointments and meetings.

When talking about how to get smarter, Buffett said: “You could hardly find a partnership in which two people settle on reading more hours of the day than in ours.” He added, “Look, my job is essentially just corralling more and more and more facts and information, and occasionally seeing whether that leads to some action.”

This is where we go astray so easily. It’s easy to think about our own discipline, the one we live in on a daily basis. This comes naturally. However, it’s likely to lead to problems. We become the proverbial man with a hammer: “To the man with a hammer everything looks like a nail. If you only have one model you will fit whatever problem you face to the model you have.” It’s hard work to think. That’s why so few people seem to take this path. But it should be hard; otherwise, it would be too easy.

“I believe in the discipline of mastering the best that other people have ever figured out. I don’t believe in just sitting down and trying to dream it all up yourself. Nobody’s that smart.”

— Charlie Munger

A Multidisciplinary Approach

The tagline for this website, “Mastering the best of what other people have already figured out,” comes from the Munger quote above.

Griffin continues:

It is critical for a person who desires to be wise to think broadly and learn from others. Munger has said many times that someone who is really smart but has devoted all of their time to being an expert in a narrow area may be dangerous to themselves and others. Examples of this include macroeconomists who study the economy but are disastrous when investing their own portfolios and marketing experts who may think that almost all business problems can be solved through marketing. Financiers tend to think similarly about their own profession. Too many people believe that what they do at work is hard and what others do is easy.

The best approach is the multi-disciplinary one, Munger argues:

You may say, “My God, this is already getting way too tough.” But, fortunately, it isn’t that tough—because eighty or ninety important models will carry about 90 percent of the freight in making you a worldly wise person. And, of those, only a mere handful really carry very heavy freight.

“This reference to eighty or ninety important models,” Griffin writes, “has caused people to ask Munger for a complete list of these important models.”

While Munger identified many models in the discipline of psychology in his famous The Psychology of Human Misjudgment speech and mentioned other models on an ad-hoc basis, he has never prepared a complete list covering all disciplines.

Munger believes that by learning to recognize certain dysfunctional decision-making processes, an investor can learn to make fewer mistakes. He also believes that no matter how hard someone works and learns, mistakes cannot be completely eliminated. The best one can hope for is to reduce their frequency and, hopefully, their magnitude.

Munger elaborates in a 1995 speech at Harvard University:

Man’s imperfect, limited-capacity brain easily drifts into working with what’s easily available to it. And the brain can’t use what it can’t remember or when it’s blocked from recognizing because it’s heavily influenced by one or more psychological tendencies bearing strongly on it … the deep structure of the human mind requires that the way to full scope competency of virtually any kind is to learn it all to fluency—like it or not.

In a 2007 speech at the USC Law School, Munger says:

I constantly see people rise in life who are not the smartest, sometimes not even the most diligent, but they are learning machines. They go to bed every night a little wiser than they were when they got up, and boy, does that help, particularly when you have a long run ahead of you. … So if civilization can progress only with an advanced method of invention, you can progress only when you learn the method of learning. Nothing has served me better in my long life than continuous learning. I went through life constantly practicing (because if you don’t practice it, you lose it) the multidisciplinary approach and I can’t tell you what that’s done for me. It’s made life more fun, it’s made me more constructive, it’s made me more helpful to others, and it’s made me enormously rich. You name it, that attitude really helps.

Back to Griffin, writing in Charlie Munger: The Complete Investor, who says:

In looking at a decision, Munger believes that it is wise to ask questions. Have dysfunctional decision-making heuristics from psychology caused an error? Are there approaches one can use to find those mistakes? Munger likes to use a model from algebra and invert problems to find a solution. Looking for models that can reveal and explain mistakes so one can accumulate worldly wisdom is actually lots of fun. It is like a puzzle to be solved.

A lattice approach is, in effect, a double-check on the investing process. But instead of just two checks, you are checking the result over and over. Munger believes that by going over your decision-making process and carefully using skills, ideas, and models from many disciplines, you can more consistently not be stupid. You will always make some bone-headed mistakes even if you’re careful, but his process is designed to decrease the probability of those mistakes.

To make sure he is taking advantage of as many models as possible, Munger likes checklists. At the 2002 Berkshire Hathaway annual meeting, Munger said:

“You need a different checklist and different mental models for different companies. I can never make it easy by saying, ‘Here are three things.’ You have to derive it yourself to ingrain it in your head for the rest of your life.”

— Charlie Munger

Learning From Mistakes

An aspect of Munger’s approach is learning from your mistakes. I’ve made more than my fair share of them.

“I like people admitting they were complete stupid horses’ asses. I know I’ll perform better if I rub my nose in my mistakes. This is a wonderful trick to learn.”

— Charlie Munger

Griffin writes:

Munger has said repeatedly that he made more mistakes earlier in life than he is making now. One of his early mistakes was to own a company that made electrical transformers. He has also said that he has found himself in real estate ventures that would only be enjoyed by a masochist. He seems to have more tolerance for mistakes in real estate than other areas of business. The idea of building things as opposed to just trading stocks has a particular appeal to Munger.

Munger believes that one great way to avoid mistakes is to own a business that is simple to understand, given your education and experience. He pointed out: “Where you have complexity, by nature you can have fraud and mistakes.” This approach echoes the view of Buffett, who likes challenges that are the business equivalent of netting fish in a barrel.

Buffett has said that if you cannot explain why you failed after you have made a mistake, the business was too complex for you. In other words, Munger and Buffett like to understand why they made a mistake so they can learn from the experience. If you cannot understand the business, then you cannot determine what you did wrong. If you cannot determine what you did wrong, then you cannot learn. If you cannot learn, you will not know what you’re doing, which is the real cause of risk.

“Forgetting your mistakes is a terrible error if you’re trying to improve your cognition. Reality doesn’t remind you. Why not celebrate stupidities in both categories?”

— Charlie Munger

Griffin concludes:

Munger has chosen the word wisdom purposefully because he believes that mere knowledge, especially from only one domain, is not enough. To be wise, one must also have experience, common sense, and good judgment. How one actually applies these things in life is what makes a person wise.

Charlie Munger: The Complete Investor is about more than investing; it’s about the pursuit of wisdom across boundaries.

The Two Types of Knowledge: The Max Planck/Chauffeur Test

Charlie Munger, the billionaire business partner of Warren Buffett, frequently tells the story below to illustrate how to distinguish between the two types of knowledge: real knowledge and pretend knowledge.

At the 2007 USC Law School commencement, Munger explained it this way:

I frequently tell the apocryphal story about how Max Planck, after he won the Nobel Prize, went around Germany giving the same standard lecture on the new quantum mechanics.

Over time, his chauffeur memorized the lecture and said, “Would you mind, Professor Planck, because it’s so boring to stay in our routine. [What if] I gave the lecture in Munich and you just sat in front wearing my chauffeur’s hat?” Planck said, “Why not?” And the chauffeur got up and gave this long lecture on quantum mechanics. After which a physics professor stood up and asked a perfectly ghastly question. The speaker said, “Well I’m surprised that in an advanced city like Munich I get such an elementary question. I’m going to ask my chauffeur to reply.”

The point of the story is not the quick-wittedness of the protagonist, but rather — to echo Richard Feynman — it’s about making a distinction between knowing the name of something and knowing something.

Two Types of Knowledge

Munger continues:

In this world we have two kinds of knowledge. One is Planck knowledge, the people who really know. They’ve paid the dues, they have the aptitude. And then we’ve got chauffeur knowledge. They’ve learned the talk. They may have a big head of hair, they may have fine timbre in the voice, they’ll make a hell of an impression.

But in the end, all they have is chauffeur knowledge. I think I’ve just described practically every politician in the United States.

And you are going to have the problem in your life of getting the responsibility into the people with the Planck knowledge and away from the people with the chauffeur knowledge.

And there are huge forces working against you. My generation has failed you a bit… but you wouldn’t like it to be too easy now would you?

Real knowledge comes when people do the work. This is so important that Elon Musk tries to tease it out in interviews.

On the other hand, we have the people who don’t do the work — they pretend. While they’ve learned to put on a good show, they lack understanding. They can’t answer questions that don’t rely on memorization. They can’t explain things without using jargon or vague terms. They have no idea how things interact. They can’t predict the consequences.

“Any fool can know. The point is to understand.”

— Albert Einstein

The problem is that it’s difficult to separate the two. This is the Batesian mimicry problem. One way to tease out the difference between Planck and chauffeur knowledge is to ask them why.

In The Art of Thinking Clearly, Rolf Dobelli offers some commentary on distinguishing fake from real knowledge:

With journalists, it is more difficult. Some have acquired true knowledge. Often they are veteran reporters who have specialized for years in a clearly defined area. They make a serious effort to understand the complexity of a subject and to communicate it. They tend to write long articles that highlight a variety of cases and exceptions. The majority of journalists, however, fall into the category of chauffeur. They conjure up articles off the tops of their heads or, rather, from Google searches. Their texts are one-sided, short, and—often as compensation for their patchy knowledge—snarky and self-satisfied in tone.

The same superficiality is present in business. The larger a company, the more the CEO is expected to possess “star quality.” Dedication, solemnity, and reliability are undervalued, at least at the top. Too often shareholders and business journalists seem to believe that showmanship will deliver better results, which is obviously not the case.

One way to guard against this is to understand your circle of competence.

Dobelli concludes with some advice worth taking to heart.

Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliché generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know. If they find themselves outside their circle of competence, they keep quiet or simply say, “I don’t know.” This they utter unapologetically, even with a certain pride. From chauffeurs, we hear every line except this.

Making Decisions in a Complex Adaptive System

[Image: the three parts of a complex adaptive system]

In Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin does a good job adding to the work we’ve already done on complex adaptive systems:

You can think of a complex adaptive system in three parts (see the image at the top of this post). First, there is a group of heterogeneous agents. These agents can be neurons in your brain, bees in a hive, investors in a market, or people in a city. Heterogeneity means each agent has different and evolving decision rules that both reflect the environment and attempt to anticipate change in it. Second, these agents interact with one another, and their interactions create structure— scientists often call this emergence. Finally, the structure that emerges behaves like a higher-level system and has properties and characteristics that are distinct from those of the underlying agents themselves. … The whole is greater than the sum of the parts.
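
To make those three parts tangible, here is a toy simulation in Python (my own sketch, not an example from Think Twice): a population of heterogeneous agents, each with its own simple decision rule, interacts through a shared price, and the resulting price path is an emergent, system-level property that no individual rule describes on its own.

import random

random.seed(42)

class Agent:
    """One heterogeneous agent with its own simple decision rule."""
    def __init__(self):
        self.style = random.choice(["momentum", "contrarian"])
        self.sensitivity = random.uniform(0.5, 2.0)

    def order(self, price_history):
        # Buy or sell based on the most recent price move.
        if len(price_history) < 2:
            return random.choice([-1, 1])
        trend = price_history[-1] - price_history[-2]
        signal = 1 if trend > 0 else -1
        if self.style == "contrarian":
            signal = -signal
        return signal * self.sensitivity

agents = [Agent() for _ in range(200)]
prices = [100.0]

for step in range(50):
    # Interaction: aggregate orders move the shared price; the price path
    # that emerges is a property of the system, not of any single agent.
    net_demand = sum(agent.order(prices) for agent in agents)
    prices.append(prices[-1] + 0.01 * net_demand)

print([round(p, 2) for p in prices[-5:]])

With a rough balance of momentum and contrarian agents, orders largely offset one another and the price wanders; a second sketch, after Mauboussin’s three suggestions below, reuses these agents to show what happens when that diversity disappears.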

***

The inability to understand the system based on its components prompted Nobel Prize winner and physicist Philip Anderson to draft the essay “More Is Different.” Anderson wrote, “The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of the simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear.”

Mauboussin comments that we are fooled by randomness:

The problem goes beyond the inscrutable nature of complex adaptive systems. Humans have a deep desire to understand cause and effect, as such links probably conferred humans with evolutionary advantage. In complex adaptive systems, there is no simple method for understanding the whole by studying the parts, so searching for simple agent-level causes of system-level effects is useless. Yet our minds are not beyond making up a cause to relieve the itch of an unexplained effect. When a mind seeking links between cause and effect meets a system that conceals them, accidents will happen.

***
Misplaced Focus on the Individual

One mistake we make is extrapolating the behavior of an individual component (say, a single person) to explain the entire system. Yet when we have to solve a problem dealing with a complex system, we often address an individual component. In so doing, we ignore Garrett Hardin’s first law of ecology, “You can never do merely one thing,” and we become fragilistas.

That unintended system-level consequences arise from even the best-intentioned individual-level actions has long been recognized. But the decision-making challenge remains for a couple of reasons. First, our modern world has more interconnected systems than before. So we encounter these systems with greater frequency and, most likely, with greater consequence. Second, we still attempt to cure problems in complex systems with a naïve understanding of cause and effect.

***

When I speak with executives from around the world going through a period of poor performance, it doesn’t take long for them to mention they want to hire a star from another company. “If only we had Kate,” they’ll say, “we could smash the competition and regain our footing.”

At first, poaching stars from competitors or even teams within the same organization seems like a winning strategy. But once the star comes over, the results often fail to materialize.

What we fail to grasp is that the star’s performance is part of an ecosystem; isolating their individual performance from that ecosystem is incredibly hard to do without properly considering the ecosystem as a whole. (Reversion to the mean also likely accounts for some of the star’s fading.)

Three Harvard professors concluded, “When a company hires a star, the star’s performance plunges, there is a sharp decline in the functioning of the group or team the person works with, and the company’s market value falls.”

If it sounds like a lot of work to think this through at many levels, it should be. Why should it be easy?

Another example of this at an organizational level has to do with innovation. Most people want to solve the innovation problem. Ignoring for a second that that is the improper framing, how do most organizations go about this? They copy what the most successful organizations do. I can’t count the number of times the solution to an organization’s “innovation problem” is to be more like Google. Well-intentioned executives blindly copy approaches by others such as 20% innovation time, without giving an ounce of thought to the role the ecosystem plays.

Isolating and focusing on an individual part of a complex adaptive system without an appreciation and understanding of that system itself is sure to lead to disaster.

***
What Should We Do?

This raises the question: what should we do when we find ourselves dealing with a complex adaptive system? Mauboussin provides three pieces of advice:

1. Consider the system at the correct level.

Remember the phrase “more is different.” The most prevalent trap is extrapolating the behavior of individual agents to gain a sense of system behavior. If you want to understand the stock market, study it at the market level. Consider what you see and read from individuals as entertainment, not as education. Similarly, be aware that the function of an individual agent outside the system may be very different from that function within the system. For instance, mammalian cells have the same metabolic rates in vitro, whether they are from shrews or elephants. But the metabolic rate of cells in small mammals is much higher than the rate of those in large mammals. The same structural cells work at different rates, depending on the animals they find themselves in.

2. Watch for tightly coupled systems.

A system is tightly coupled when there is no slack between items, allowing a process to go from one stage to the next without any opportunity to intervene. Aircraft, space missions, and nuclear power plants are classic examples of complex, tightly coupled systems. Engineers try to build in buffers or redundancies to avoid failure, but frequently don’t anticipate all possible contingencies. Most complex adaptive systems are loosely coupled, where removing or incapacitating one or a few agents has little impact on the system’s performance. For example, if you randomly remove some investors, the stock market will continue to function fine. But when the agents lose diversity and behave in a coordinated fashion, a complex adaptive system can behave in a tightly coupled fashion. Booms and crashes in financial markets are an illustration.

3. Use simulations to create virtual worlds.

Dealing with complex systems is inherently tricky because the feedback is equivocal, information is limited, and there is no clear link between cause and effect. Simulation is a tool that can help our learning process. Simulations are low cost, provide feedback, and have proved their value in other domains like military planning and pilot training.
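
Points 2 and 3 can be illustrated together by continuing the toy market sketched earlier (again, my own illustration rather than an example from the book). When the agents are diverse, their orders largely offset; strip out that diversity so every agent follows the same rule and the orders become coordinated, the system behaves as if tightly coupled, and the price moves in large, one-directional steps.

# Continuing the toy market above: remove diversity so every agent chases the
# trend, and the aggregate behavior changes dramatically.
uniform_agents = [Agent() for _ in range(200)]
for agent in uniform_agents:
    agent.style = "momentum"   # every agent now uses the same rule

uniform_prices = [100.0, 101.0]  # seed a small uptrend
for step in range(50):
    net_demand = sum(agent.order(uniform_prices) for agent in uniform_agents)
    uniform_prices.append(uniform_prices[-1] + 0.01 * net_demand)

print(round(uniform_prices[-1], 2))  # a far larger move than in the diverse market

Running the two versions side by side is the kind of cheap virtual experiment Mauboussin is recommending: the feedback is immediate, and the link between a loss of diversity and the system-level outcome is easy to see.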

Still Curious? Think Twice: Harnessing the Power of Counterintuition.