Tag: Peter Bevelin

Peter Bevelin on Seeking Wisdom, Mental Models, Learning, and a Lot More

One of the most impactful books we’ve ever come across is the wonderful Seeking Wisdom: From Darwin to Munger, written by the Swedish investor Peter Bevelin. In the spirit of multidisciplinary learning, Seeking Wisdom is a compendium of ideas from biology, psychology, statistics, physics, economics, and human behavior.

Mr. Bevelin is out with a new book full of wisdom from Warren Buffett & Charlie Munger: All I Want to Know is Where I’m Going to Die So I Never Go There. We were fortunate enough to have a chance to interview Peter recently, and the result is the wonderful discussion below.

***

What was the original impetus for writing these books?

The short answer: To improve my thinking. And when I started writing what later became Seeking Wisdom I can express it even more simply: “I was dumb and wanted to be less dumb.” As Munger says: “It’s ignorance removal…It’s dishonorable to stay stupider than you have to be.” And I had done some stupid things and I had seen a lot of stupidity being done by people in life and in business.

A seed was first planted when I read Charlie Munger’s worldly wisdom speech and another one where he referred to Darwin as a great thinker. So I said to myself: I am 42 now. Why not take some time off business and spend a year learning, reflecting and writing about the subject Munger introduced to me – human behavior and judgments.

None of my writings started out as a book project. I wrote my first book – Seeking Wisdom – as a memorandum for myself with the expectation that I could transfer some of its essentials to my children. I learn and write because I want to be a little wiser day by day. I don’t want to be a great problem-solver. I want to avoid problems – prevent them from happening and do things right from the beginning. And I focus on consequential decisions. To paraphrase Buffett and Munger – decision-making is not about making brilliant decisions, but avoiding terrible ones. Mistakes and dumb decisions are a fact of life and I’m going to make more, but as long as I can avoid the big or “fatal” ones I’m fine.

So I started to read and write to learn what works and not and why. And I liked Munger’s “All I want to know is where I’m going to die so I’ll never go there” approach. And as he said, “You understand it better if you go at it the way we do, which is to identify the main stupidities that do bright people in and then organize your patterns for thinking and developments, so you don’t stumble into those stupidities.” Then I “only” had to a) understand the central “concept” and its derivatives and describe it in as simple a way as possible for me and b) organize what I learnt in a way that was logical and useful for me.

And what better way was there to learn this from those who already knew this?

After I learnt some things about our brain, I understood that thinking doesn’t come naturally to us humans – most is just unconscious automatic reactions. Therefore I needed to set up the environment and design a system that made it easier for me to know what to do and to prevent and avoid harm. Things like simple rules of thumb, tricks and filters. Of course, I could only do that if I first had the foundation. And as the years have passed, I’ve found that filters are a great way to save time and misery. As Buffett says, “I process information very quickly since I have filters in my mind.” And they have to be simple – as the proverb says, “Beware of the door that has too many keys.” The more complicated a process is, the less effective it is.

Why do I write? Because it helps me understand and learn better. And if I can’t write something down clearly, then I have not really understood it. As Buffett says, “I learn while I think when I write it out. Some of the things, I think I think, I find don’t make any sense when I start trying to write them down and explain them to people … And if it can’t stand applying pencil to paper, you’d better think it through some more.”

My own test is one that a physicist friend of mine told me many years ago: ‘You haven’t really understood an idea if you can’t in a simple way describe it to almost anyone.’ Luckily, I don’t have to understand zillions of things to function well.

And even if some of my own and others’ thoughts ended up as books, they are all living documents and new starting points for further learning, un-learning and simplifying/clarifying. To quote Feynman, “A great deal of formulation work is done in writing the paper, organizational work, organization. I think of a better way, a better way, a better way of getting there, of proving it. I never do much — I mean, it’s just cleaner, cleaner and cleaner. It’s like polishing a rough-cut vase. The shape, you know what you want and you know what it is. It’s just polishing it. Get it shined, get it clean, and everything else.”

Which book did you learn the most from the experience of writing/collecting?

Seeking Wisdom because I had to do a lot of research – reading, talking to people etc. Especially in the field of biology and brain science since I wanted to first understand what influences our behavior. I also spent some time at a Neurosciences Institute to get a better understanding of how our anatomy, physiology and biochemistry constrained our behavior.

And I had to work it out my own way and write it down in my own words so I really could understand it. It took a lot of time but it was a lot of fun to figure it out and I learnt much more and it stuck better than if I had just tried to memorize what somebody else had already written. I may not have gotten everything letter perfect but good enough to be useful for me.

As I said, the expectation wasn’t to create a book. In fact, that would have removed a lot of my motivation. I did it because I had an interest in becoming better. It goes back to the importance of intrinsic motivation. As I wrote in Seeking Wisdom: “If we reward people for doing what they like to do anyway, we sometimes turn what they enjoy doing into work. The reward changes their perception. Instead of doing something because they enjoy doing it, they now do it because they are being paid. The key is what a reward implies. A reward for our achievements makes us feel that we are good at something, thereby increasing our motivation. But a reward that feels controlling and makes us feel that we are only doing it because we’re paid to do it, decreases the appeal.”

It may sound like a cliché but the joy was in the journey – reading, learning and writing – not the destination – the finished book. Has the book made a difference for some people? Yes, I hope so, but often people revert to their old behavior. Some of them are the same people who – to paraphrase something that is attributed to Churchill – occasionally should check their intentions and strategies against their results. But the reality is what Munger once said: “Everyone’s experience is that you teach only what a reader almost knows, and that seldom.” But I am happy that my books had an impact and made a difference to a few people. That’s enough.

Why did the new book (All I Want To Know Is Where I’m Going To Die So I’ll Never Go There) have a vastly different format?

It was more fun to write about what works and not in a dialogue format. But also because vivid and hopefully entertaining “lessons” are easier to remember and recall. And you will find a lot of quotes in there that most people haven’t read before.

I wanted to write a book like this to reinforce a couple of concepts in my head. So even if some of the text sometimes comes out like advice to the reader, I always think about what the mathematician Gian-Carlo Rota once said, “The advice we give others is the advice that we ourselves need.”

How do you define Mental Models?

Some kind of representation that describes how reality is (as it is known today) – a principle, an idea, a basic concept, something that works or not – that I have in my head and that helps me know what to do or not. Something that has stood the test of time.

For example some timeless truths are:

  • Reality is that complete competitors – same product/niche/territory – cannot coexist (Competitive exclusion principle). What works is going where there is no or very weak competition + differentiation/advantages that others can’t copy (assuming of course we have something that is needed/wanted now and in the future)
  • Reality is that we get what we reward for. What works is making sure we reward for what we want to achieve.

I favor underlying principles and notions that I can apply broadly to different and relevant situations. Since some models don’t resemble reality, the word “model” for me is more of an illustration/story of an underlying concept, trick, method, what works etc. that agrees with reality (as Munger once said, “Models which underlie reality”) and help me remember and more easily make associations.

But I don’t judge or care how others label it or do it – models, concepts, default positions … The important thing is that whatever we use, it reflects and agrees with reality and that it works for us to help us understand or explain a situation or know what to do or not do. Useful and good enough guide me. I am pretty pragmatic – whatever works is fine. I follow Deng Xiaoping, “I don’t care whether the cat is black or white as long as it catches mice.” As Feynman said, “What is the best method to obtain the solution to a problem? The answer is, any way that works.”

I’ll tell you about a thing Feynman said on education which I remind myself of from time to time in order not to complicate things (from Richard P. Feynman, Michael A. Gottlieb, Ralph Leighton, Feynman’s Tips on Physics: A Problem-Solving Supplement to the Feynman Lectures on Physics):

“There’s a round table on three legs. Where should you lean on it, so the table will be the most unstable?”
The student’s solution was, “Probably on top of one of the legs, but let me see: I’ll calculate how much force will produce what lift, and so on, at different places.”
Then I said, “Never mind calculating. Can you imagine a real table?”
“But that’s not the way you’re supposed to do it!”
“Never mind how you’re supposed to do it; you’ve got a real table here with the various legs, you see? Now, where do you think you’d lean? What would happen if you pushed down directly over a leg?”
“Nothin’!”
I say, “That’s right; and what happens if you push down near the edge, halfway between two of the legs?”
“It flips over!”
I say, “OK! That’s better!”
The point is that the student had not realized that these were not just mathematical problems; they described a real table with legs. Actually, it wasn’t a real table, because it was perfectly circular, the legs were straight up and down, and so on. But it nearly described, roughly speaking, a real table, and from knowing what a real table does, you can get a very good idea of what this table does without having to calculate anything – you know darn well where you have to lean to make the table flip over. So, how to explain that, I don’t know! But once you get the idea that the problems are not mathematical problems but physical problems, it helps a lot.
Anyway, that’s just two ways of solving this problem. There’s no unique way of doing any specific problem. By greater and greater ingenuity, you can find ways that require less and less work, but that takes experience.

Which mental models “carry the most freight?” (Related follow up: Which concepts from Buffett/Munger/Mental Models do you find yourself referring to or appreciating most frequently?)

Ideas from biology and psychology since many stupidities are caused by not understanding human nature (and you get illustrations of this nearly every day). And most of our tendencies were already known by the classic writers (Publilius Syrus, Seneca, Aesop, Cicero etc.)

Others that I find very useful both in business and private life are the ideas of Quantification (without the fancy math), Margin of safety, Backups, Trust, Constraints/Weakest link, Good or Bad Economics/Competitive advantage, Opportunity cost, and Scale effects. I also think Keynes’s idea of changing your mind when you get new facts or information is very useful.

But since reality isn’t divided into different categories but involves a lot of factors interacting, I need to synthesize many ideas and concepts.

Are there any areas of the mental models approach you feel are misunderstood or misapplied?

I don’t know about that but what I often see among many smart people agrees with Munger’s comment: “All this stuff is really quite obvious and yet most people don’t really know it in a way where they can use it.”

Anyway, I believe if you really understand an idea and what it means – not only memorizing it – you should be able to work out its different applications and functional equivalents. Take a simple big idea – think on it – and after a while you see its wider applications. To use Feynman’s advice, “It is therefore of first-rate importance that you know how to “triangulate” – that is, to know how to figure something out from what you already know.” As a good friend says, “Learn the basic ideas, and the rest will fill itself in. Either you get it or you don’t.”

Most of us learn and memorize a specific concept or method etc. and learn about its application in one situation. But when the circumstances change we don’t know what to do and we don’t see that the concept may have a wider application and can be used in many situations.

Take for example one big and useful idea – Scale effects. That the scale of size, time and outcomes changes things – characteristics, proportions, effects, behavior…and what is good or not must be tied to scale. This is a very fundamental idea from math. Munger described some of this idea’s usefulness in his worldly wisdom speech. One effect from this idea I often see people miss and I believe is important is group size and behavior. That trust, feelings of affection and altruistic actions break down as group size increases, which of course is important to know in business settings. I wrote about this in Seeking Wisdom (you can read more by searching for the Dunbar Number). I know of some businesses that understand the importance of this and split up companies into smaller ones when they get too big (one example is Semco).

Another general idea is “Gresham’s Law” that can be generalized to any process or system where the bad drives out the good. Like natural selection or “We get what we select for” (and as Garrett Hardin writes, “The more general principle is: We get whatever we reward for.”).

While we are on the subject of mental models etc., let me bring up another thing that distinguishes the great thinkers from us ordinary mortals. Their ability to quickly assess and see the essence of a situation – the critical things that really matter and what can be ignored. They have a clear notion of what they want to achieve or avoid and then they have this ability to zoom in on the key factor(s) involved.

One reason they can do that is that they have a large repertoire of stored personal and vicarious experiences and concepts in their heads. They are masters at pattern recognition and connection. Some call it intuition but as Herbert Simon once said, “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”

It is about making associations. For example, roughly like this:
Situation X → Association (what does this remind me of?) to experience, concept, metaphor, analogy, trick, filter… (assuming of course we are able to see the essence of the situation) → What counts and what doesn’t? What works or not? What to do or what to explain?

Let’s take employing someone as an example (or looking at a business proposal). This reminds me of one key factor – trustworthiness and Buffett’s story, “If you’re looking for a manager, find someone who is intelligent, energetic and has integrity. If he doesn’t have the last, make sure he lacks the first two.”

I believe Buffett and Munger excel at this – they have seen and experienced so much about what works and not in business and behavior.

Buffett referred to the issue of trust, chain letters and pattern recognition at the latest annual meeting:

You can get into a lot of trouble with management that lacks integrity… If you’ve got an intelligent, energetic guy or woman who is pursuing a course of action which gets put on the front page, it could make you very unhappy. You can get into a lot of trouble… We’ve seen patterns… Pattern recognition is very important in evaluating humans and businesses. Pattern recognition isn’t one hundred percent and none of the patterns exactly repeat themselves, but there are certain things in business and securities markets that we’ve seen over and over and frequently come to a bad end but frequently look extremely good in the short run. One which I talked about last year was the chain letter scheme. You’re going to see chain letters for the rest of your life. Nobody calls them chain letters because that’s a connotation that will scare you off but they’re disguised as chain letters and many of the schemes on Wall Street, which are designed to fool people, have that particular aspect to it… There were patterns at Valeant certainly… if you go and watch the Senate hearings, you will see there are patterns that should have been picked up on.

This is what he wrote on chain letters in the 2014 annual report:

In the late 1960s, I attended a meeting at which an acquisitive CEO bragged of his “bold, imaginative accounting.” Most of the analysts listening responded with approving nods, seeing themselves as having found a manager whose forecasts were certain to be met, whatever the business results might be. Eventually, however, the clock struck twelve, and everything turned to pumpkins and mice. Once again, it became evident that business models based on the serial issuances of overpriced shares – just like chain-letter models – most assuredly redistribute wealth, but in no way create it. Both phenomena, nevertheless, periodically blossom in our country – they are every promoter’s dream – though often they appear in a carefully-crafted disguise. The ending is always the same: Money flows from the gullible to the fraudster. And with stocks, unlike chain letters, the sums hijacked can be staggering.

And of course, the more prepared we are or the more relevant concepts and “experiences” we have in our heads, the better we all will be at this. How do we get there? Reading, learning and practice so we know it “fluently.” There are no shortcuts. We have to work at it and apply it to the real world.

As a reminder to myself so I understand my limitations and “circle”, I keep a paragraph from Munger’s USC Gould School of Law Commencement Address handy so when I deal with certain issues, I don’t fool myself into believing I am Max Planck when I’m really the Chauffeur:

In this world I think we have two kinds of knowledge: One is Planck knowledge, that of the people who really know. They’ve paid the dues, they have the aptitude. Then we’ve got chauffeur knowledge. They have learned to prattle the talk. They may have a big head of hair. They often have fine timbre in their voices. They make a big impression. But in the end what they’ve got is chauffeur knowledge masquerading as real knowledge.

Which concepts from Buffett/Munger/Mental Models do you find most counterintuitive?

One trick or notion I see many of us struggling with because it goes against our intuition is the concept of inversion – learning to think “in negatives,” which goes against our normal tendency to concentrate on, for example, what we want to achieve or confirmations instead of what we want to avoid and disconfirmations. Another example of this is the importance of missing confirming evidence (I call it the “Sherlock trick”) – that negative evidence, and events that don’t happen, matter when something implies they should be present or happen.

Another example that is counterintuitive is Newton’s third law that forces work in pairs. One object exerts a force on a second object, but the second object also exerts a force equal in size and opposite in direction back on the first object. As Newton wrote, “If you press a stone with your finger, the finger is also pressed by the stone.” The same goes for revenge (reciprocation).

Who are some of the non-obvious, or under-the-radar thinkers that you greatly admire?

One that immediately comes to mind – and one I have mentioned in the introduction to two of my books – is someone I am fortunate to have as a friend: Peter Kaufman. An outstanding thinker and a great businessman and human being. On a scale of 1 to 10, he is a 15.

What have you come to appreciate more with Buffett/Munger’s lessons as you’ve studied them over the years?

Their ethics and their ethos of clarity, simplicity and common sense. These two gentlemen are outstanding in their instant ability to exclude bad ideas, what doesn’t work, bad people, scenarios that don’t matter, etc. so they can focus on what matters. Also my amazement that their ethics and ideas haven’t been more replicated. But I assume the answer lies in what Munger once said, “The reason our ideas haven’t spread faster is they’re too simple.”

This reminds me of something my father-in-law once told me (a man I learnt a lot from) – the curse of knowledge and the curse of the academic title. My now deceased father-in-law was an inventor and manager. He did not have any formal education but was largely self-taught. Once a big corporation asked for his services to solve a problem their 60 highly educated engineers could not solve. He solved the problem. The engineers said, “It can’t be that simple.” It was like they were saying, “Here we have 6 years of school, an academic title, lots of follow-up education. Therefore an engineering problem must be complicated.” As Buffett once said of Ben Graham’s ideas, “I think that it comes down to those ideas – although they sound so simple and commonplace that it kind of seems like a waste to go to school and get a PhD in Economics and have it all come back to that. It’s a little like spending eight years in divinity school and having somebody tell you that the 10 commandments were all that counted. There is a certain natural tendency to overlook anything that simple and important.”

(I must admit that in the past I had a tendency to be drawn to elegant concepts, which distracted me from the simple truths.)

What things have you come to understand more deeply in the past few years?

  • That I don’t need hundreds of concepts, methods or tricks in my head – there are a few basic, time-filtered fundamental ones that are good enough. As Munger says, “The more basic knowledge you have the less new knowledge you have to get.” And when I look at something “new”, I try to connect it to something I already understand and if possible get a wider application of an already existing basic concept that I already have in my head.
  • Neither do I have to learn everything to cover every single possibility – not only is it impossible, but the big reason is well explained by the British statistician George Box. He said that we shouldn’t be preoccupied with optimal or best procedures but with ones good enough over a range of possibilities likely to happen in practice – the circumstances which the world really presents to us.
  • The importance of “Picking my battles” and focusing on the long-term consequences of my actions. As Munger said, “A majority of life’s errors are caused by forgetting what one is really trying to do.”
  • How quick most of us are in drawing conclusions. For example, I am often too quick in being judgmental and forget how I myself behaved or would have behaved if put in another person’s shoes (and the importance of seeing things from many views).
  • That I have to “pick my poison” since there is always a set of problems attached to any system or approach – it can’t be perfect. The key is to try to move to a better set of problems one can accept after comparing what appear to be the consequences of each.
  • How efficient and simplified life is when you deal with people you can trust. This includes the importance of the right culture.
  • The extreme importance of the right CEO – a good operator, business person and investor.
  • That luck plays a big role in life.
  • That most predictions are wrong and that prevention, robustness and adaptability are way more important. I can’t help myself – I have to add one thing about the people who give out predictions on all kinds of things. Often these are the people who live in a world where their actions have no consequences and where their ideas and theories don’t have to agree with reality.
  • That people or businesses that are foolish in one setting often are foolish in another one (“The way you do anything, is the way you do everything”).
  • Buffett’s advice that “A checklist is no substitute for thinking.” And that sometimes it is easy to overestimate one’s competency in a) identifying or picking what the dominant or key factors are and b) evaluating them including their predictability. That I believe I need to know factor A when I really need to know B – the critical knowledge that counts in the situation with regards to what I want to achieve.
  • Close to this is that I sometimes get too involved in details and can’t see the forest for the trees and I get sent up too many blind alleys. Just as in medicine where a whole body scan sees too much and sends the doctor up blind alleys.
  • The wisdom in Buffett’s advice that “You only have to be right on a very, very few things in your lifetime as long as you never make any big mistakes…An investor needs to do very few things right as long as he or she avoids big mistakes.”

What’s the best investment of time/effort/money that you’ve ever made?

The best thing I have done is marrying my wife. As Buffett says and it is so so true, “Choosing a spouse is the most important decision in your life…You need everything to be stable, and if that decision isn’t good, it may affect every other decision in life, including your business decisions…If you are lucky on health and…on your spouse, you are a long way home.”

A good “investment” is taking the time to continuously improve. It just takes curiosity and a desire to know and understand – real interest. And for me this is fun.

What does your typical day look like? (How much time do you spend reading… and when?)

Every day is a little different but I read every day.

What book has most impacted your life?

There is not one single book or one single idea that has done it. I have picked up things from different books (still do). And there are different books and articles that made a difference during different periods of my life. Meeting and learning from certain people and my own practical experiences have been more important in my development. As an example – when I was in my 30s a good friend told me something that has been very useful in looking at products and businesses. He said I should always ask who the real customer is: “Who ultimately decides what to buy and what are their decision criteria and how are they measured and rewarded and who pays?”

But looking back, if I had had a book like Poor Charlie’s Almanack when I was younger I would have saved myself some misery. And of course, when it comes to business, managing and investing, nothing beats learning from Warren Buffett’s Letters to Berkshire Hathaway Shareholders.

Another thing I have found is that it is way better to read and reread fewer books but good and timeless ones and then think. Unfortunately many people absorb too many new books and information without thinking.

Let me finish this with some quotes from my new book that I believe we all can learn from:

  • “There’s no magic to it…We haven’t succeeded because we have some great, complicated systems or magic formulas we apply or anything of the sort. What we have is just simplicity itself.” – Buffett
  • “Our ideas are so simple that people keep asking us for mysteries when all we have are the most elementary ideas…There’s nothing remarkable about it. I don’t have any wonderful insights that other people don’t have. Just slightly more consistently than others, I’ve avoided idiocy…It is remarkable how much long-term advantage people like us have gotten by trying to be consistently not stupid, instead of trying to be very intelligent.” – Munger
  • “It really is simple – just avoid doing the dumb things. Avoiding the dumb things is the most important.” – Buffett

Finally, I wish you and your readers an excellent day – every day!

 

Keeping Things Simple and Tuning out Folly

“We have a passion for keeping things simple.”

— Charlie Munger

This reminds me of how Einstein sifted the essential from the non-essential. And this from Charlie Munger: The Complete Investor:

Peter Bevelin’s book Seeking Wisdom: From Darwin to Munger has a section on the importance of simplicity.

Bevelin advised: “Turn complicated problems into simple ones. Break down a problem into its components, but look at the problem holistically.” Keeping things as simple as possible, but no more so, is a constant theme in Munger’s public statements. In a joint letter to shareholders, Munger and Buffett once wrote: “Simplicity has a way of improving performance through enabling us to better understand what we are doing.”

[…]

By focusing on finding decisions and bets that are easy, avoiding what is hard, and stripping away anything that is extraneous, Munger believes that an investor can make better decisions. By “tuning out folly” and swatting away unimportant things “so your mind isn’t cluttered with them … you’re better able to pick up a few sensible things to do,” said Munger. Focus enables both simplicity and clarity of thought, which in Munger’s view leads to a more positive investing result.

“If something is too hard, we move on to something else. What could be simpler than that?”

— Charlie Munger

There is a compelling advantage in life to be found in exploiting unrecognized simplicities, something Peter Thiel tries to tease out in interviews. Essential to recognizing simplicity is scheduling time to think.

“We have three baskets: in, out, and too tough… We have to have a special insight, or we’ll put it in the too tough basket.”

— Charlie Munger

Part of Simplicity is Filtering

William James said: “The art of being wise is the art of knowing what to overlook.” Truer words have seldom been spoken.

In Arthur Conan Doyle’s The Reigate Puzzle, Sherlock Holmes says: “It is of the highest importance in the art of detection to be able to recognize, out of a number of facts, which are incidental and which vital.”

And part of filtering is understanding what you know and what you don’t know, that is, understanding your circle of competence.

In an interview with Jason Zweig, Munger said:

Confucius said that real knowledge is knowing the extent of one’s ignorance. Aristotle and Socrates said the same thing. Is it a skill that can be taught or learned? It probably can, if you have enough of a stake riding on the outcome. Some people are extraordinarily good at knowing the limits of their knowledge, because they have to be. Think of somebody who’s been a professional tightrope walker for 20 years—and has survived. He couldn’t survive as a tightrope walker for 20 years unless he knows exactly what he knows and what he doesn’t know. He’s worked so hard at it, because he knows if he gets it wrong he won’t survive. The survivors know.

Another time he offered:

Part of that [having uncommon sense], I think, is being able to tune out folly, as distinguished from recognizing wisdom. You’ve got whole categories of things you just bat away so your brain isn’t cluttered with them. That way, you’re better able to pick up a few sensible things to do.

Warren Buffett, the CEO of Berkshire Hathaway, agrees:

Yeah, we don’t consider many stupid things. I mean, we get rid of ’em fast… Just getting rid of the nonsense — just figuring out that if people call you and say, “I’ve got this great, wonderful idea,” you don’t spend 10 minutes once you know in the first sentence that it isn’t a great, wonderful idea… Don’t be polite and go through the whole process.

And Peter Bevelin, writing in Seeking Wisdom, offers:

Often we try to get too much information, including misinformation, or information of no use to explain or predict. We also focus on details and what’s irrelevant or unknowable and overlook the obvious truths. Dealing with what’s important forces us to prioritize. There are often just a few actions that produce most of what we are trying to achieve. There are only a few decisions of real importance.

More information doesn’t equal more knowledge or better decisions. And remember that today we not only have access to more information, but also misinformation.

And the harder we work at something the more confident we become.

It’s worth pausing to reflect on three things at this point: 1) understanding and seeking simplicity; 2) dealing with the easy problems first; and 3) honing your skills by learning what to overlook and getting rid of bad ideas quickly (how many organizations do that!?)… this goes hand in hand with understanding your circle of competence.

Regression Toward the Mean: An Introduction with Examples


It is important to minimize instances of bad judgment and address the weak spots in our reasoning. Learning about regression to the mean can help us.

Nobel prize-winning psychologist Daniel Kahneman wrote a book about biases that cloud our reasoning and distort our perception of reality. It turns out there is a whole set of logical errors that we commit because our intuition and brains do not deal well with simple statistics. One of the errors that he examines in Thinking, Fast and Slow is the infamous regression toward the mean.

The notion of regression to the mean was first worked out by Sir Francis Galton. The rule goes that, in any series with complex phenomena that are dependent on many variables, where chance is involved, extreme outcomes tend to be followed by more moderate ones.

In Seeking Wisdom, Peter Bevelin offers the example of John, who was dissatisfied with the performance of new employees so he put them into a skill-enhancing program where he measured the employees’ skill:

Their scores are now higher than they were on the first test. John’s conclusion: “The skill-enhancing program caused the improvement in skill.” This isn’t necessarily true. Their higher scores could be the result of regression to the mean. Since these individuals were measured as being on the low end of the scale of skill, they would have shown an improvement even if they hadn’t taken the skill-enhancing program. And there could be many reasons for their earlier performance — stress, fatigue, sickness, distraction, etc. Their true ability perhaps hasn’t changed.

Our performance always varies around some average true performance. Extreme performance tends to get less extreme the next time. Why? Testing measurements can never be exact. All measurements are made up of one true part and one random error part. When the measurements are extreme, they are likely to be partly caused by chance. Chance is likely to contribute less on the second time we measure performance.
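To make the mechanism concrete, here is a minimal Python sketch of the “John” scenario above, with made-up numbers of my own: every measured score is a true skill plus a random error, and the bottom decile on the first test scores higher on a retest even though nobody took any program.

```python
# A minimal sketch (toy numbers, not from the book): each score is a true
# skill plus random error, so the lowest scorers on a first test improve on
# a retest even with no skill-enhancing program at all.
import random

random.seed(42)
N = 10_000

true_skill = [random.gauss(100, 10) for _ in range(N)]
test1 = [s + random.gauss(0, 10) for s in true_skill]  # first measurement
test2 = [s + random.gauss(0, 10) for s in true_skill]  # retest, no intervention

# The "new employees" John worries about: the bottom 10% on the first test
cutoff = sorted(test1)[N // 10]
weak = [i for i in range(N) if test1[i] <= cutoff]

avg_first = sum(test1[i] for i in weak) / len(weak)
avg_retest = sum(test2[i] for i in weak) / len(weak)
print(f"Bottom decile, first test: {avg_first:.1f}")
print(f"Same people on the retest: {avg_retest:.1f}  (higher, without any program)")
```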

If we switch from one way of doing something to another merely because we are unsuccessful, it’s very likely that we do better the next time even if the new way of doing something is equal or worse.

This is one of the reasons it’s dangerous to extrapolate from small sample sizes, as the data might not be representative of the distribution. It’s also why James March argues that the longer someone stays in their job, “the less the probable difference between the observed record of performance and actual ability.” Anything can happen in the short run, especially in any effort that involves a combination of skill and luck. (The ratio of skill to luck also impacts regression to the mean.)

“Regression to the mean is not a natural law. Merely a statistical tendency. And it may take a long time before it happens.”

— Peter Bevelin

Regression to the Mean

The effects of regression to the mean can frequently be observed in sports, where the effect causes plenty of unjustified speculations.

In Thinking, Fast and Slow, Kahneman recalls watching men’s ski jumping, a discipline where the final score is a combination of two separate jumps. Aware of the regression to the mean, Kahneman was startled to hear the commentator’s predictions about the second jump. He writes:

“Norway had a great first jump; he will be tense, hoping to protect his lead and will probably do worse” or “Sweden had a bad first jump and now he knows he has nothing to lose and will be relaxed, which should help him do better.”

Kahneman points out that the commentator had noticed the regression to the mean and come up with a story for which there was no causal evidence (see narrative fallacy). This is not to say that his story could not be true. Maybe, if we measured the heart rates before each jump, we would see that they are more relaxed if the first jump was bad. However, that’s not the point. The point is, regression to the mean happens when luck plays a role, as it did in the outcome of the first jump.

The lesson from sports applies to any activity where chance plays a role. We often explain progress, or the lack of it, by our own influence over the process.

In reality, the science of performance is complex and situation-dependent, and much of what we think is within our control is often truly random.

In the case of ski jumps, a strong wind against the jumper will lead to even the best athlete showing mediocre results. Similarly, a strong wind and ski conditions in favor of a mediocre jumper may lead to a considerable but temporary bump in his results. These effects, however, will disappear once the conditions change and the results will regress back to normal.

This can have serious implications for coaching and performance tracking. The rules of regression suggest that when evaluating performance or hiring, we must rely on track records more than outcomes of specific situations. Otherwise, we are prone to be disappointed.

When Kahneman was giving a lecture to the Israeli Air Force about the psychology of effective training, one of the officers shared his experience that extending praise to his subordinates led to worse performance, whereas scolding led to an improvement in subsequent efforts. As a consequence, he had grown to be generous with negative feedback and had become rather wary of giving too much praise.

Kahneman immediately spotted that it was regression to the mean at work. He illustrated the misconception with a simple exercise you may want to try yourself. He drew a circle on a blackboard and then asked the officers one by one to throw a piece of chalk at the center of the circle with their backs facing the blackboard. He then repeated the experiment and recorded each officer’s performance in the first and second trials.

Naturally, those that did incredibly well on the first try tended to do worse on their second try and vice versa. The fallacy immediately became clear: the change in performance occurs naturally. That again is not to say that feedback does not matter at all – maybe it does, but the officer had no evidence to conclude it did.
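Here is a rough sketch of that chalk exercise under the assumption that each throw is pure chance, so nothing carries over between attempts; the “praised” group then looks worse and the “scolded” group looks better on the second throw, with no feedback involved at all.

```python
# A rough sketch assuming each throw is pure luck (distance from the center
# is just noise): the best first-throwers regress downward and the worst
# regress upward on the second throw, regardless of praise or scolding.
import random

random.seed(7)
officers = 1_000

throw1 = [abs(random.gauss(0, 1)) for _ in range(officers)]  # distance from center
throw2 = [abs(random.gauss(0, 1)) for _ in range(officers)]  # independent of throw 1

best = sorted(range(officers), key=lambda i: throw1[i])[:100]    # the "praised" tenth
worst = sorted(range(officers), key=lambda i: throw1[i])[-100:]  # the "scolded" tenth

def mean(idx, xs):
    return sum(xs[i] for i in idx) / len(idx)

print(f"Best tenth:  {mean(best, throw1):.2f} -> {mean(best, throw2):.2f}  (worse after 'praise')")
print(f"Worst tenth: {mean(worst, throw1):.2f} -> {mean(worst, throw2):.2f}  (better after 'scolding')")
```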

The Imperfect Correlation and Chance

At this point, you might be wondering why the regression to the mean happens and how we can make sure we are aware of it when it occurs.

In order to understand regression to the mean, we must first understand correlation.

The correlation coefficient between two measures, which varies between -1 and 1, is a measure of the relative weight of the factors they share. For example, two phenomena with few shared factors, such as bottled water consumption versus suicide rate, should have a correlation coefficient close to 0. That is to say, if we looked at all countries in the world and plotted the suicide rate of a specific year against per capita consumption of bottled water, the plot would show no pattern at all.

No Correlation

By contrast, there are measures which depend solely on the same factor. A good example of this is temperature. The only factor determining temperature – the velocity of molecules – is shared by all scales, hence each value in Celsius has exactly one corresponding value in Fahrenheit. Therefore temperature in Celsius and Fahrenheit will have a correlation coefficient of 1 and the plot will be a straight line.

Perfect Correlation

There are few if any phenomena in human sciences that have a correlation coefficient of 1. There are, however, plenty where the association is weak to moderate and there is some explanatory power between the two phenomena. Consider the correlation between height and weight, which would land somewhere between 0 and 1. While virtually every three-year-old will be lighter and shorter than every grown man, not all grown men or three-year-olds of the same height will weigh the same.

Weak to Moderate Correlation

This variation and the corresponding lower degree of correlation imply that, while height is generally speaking a good predictor, there clearly are factors other than height at play. When the correlation of two measures is less than perfect, we must watch out for the effects of regression to the mean.
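To make the three regimes concrete, here is a small sketch with toy data of my own (none of it from the article): two unrelated series give a correlation near 0, Celsius against Fahrenheit gives exactly 1, and a height-and-weight style pair lands somewhere in between.

```python
# A small sketch with invented data illustrating the three regimes:
# r ~ 0 (nothing shared), r = 1 (everything shared), 0 < r < 1 (partly shared).
import random

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(3)
N = 10_000

# No shared factors: two independent series
a = [random.gauss(0, 1) for _ in range(N)]
b = [random.gauss(0, 1) for _ in range(N)]

# All factors shared: the same temperatures expressed on two scales
celsius = [random.uniform(-30, 40) for _ in range(N)]
fahrenheit = [c * 9 / 5 + 32 for c in celsius]

# Partly shared factors: "height" plus other influences driving "weight"
height = [random.gauss(170, 10) for _ in range(N)]
weight = [0.9 * (h - 170) + 70 + random.gauss(0, 8) for h in height]

print(f"independent series:     r = {pearson(a, b):+.2f}")                # about 0
print(f"Celsius vs Fahrenheit:  r = {pearson(celsius, fahrenheit):+.2f}")  # exactly 1
print(f"height vs weight (toy): r = {pearson(height, weight):+.2f}")       # between 0 and 1
```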

Kahneman observed a general rule: Whenever the correlation between two scores is imperfect, there will be regression to the mean.

This at first might seem confusing and not very intuitive, but the degree of regression to the mean is directly related to the degree of correlation of the variables. This effect can be illustrated with a simple example.

Assume you are at a party and ask why it is that highly intelligent women tend to marry men who are less intelligent than they are. Most people, even those with some training in statistics, will quickly jump in with a variety of causal explanations ranging from avoidance of competition to the fear of loneliness that these women face. A topic of such controversy is likely to stir up a great debate.

Now, what if we asked why the correlation between the intelligence scores of spouses is less than perfect? This question is hardly as interesting and there is little to guess – we all know this to be true. The paradox lies in the fact that the two questions happen to be algebraically equivalent. Kahneman explains:

[…] If the correlation between the intelligence of spouses is less than perfect (and if men and women on average do not differ in intelligence), then it is a mathematical inevitability that highly intelligent women will be married to husbands who are on average less intelligent than they are (and vice versa, of course). The observed regression to the mean cannot be more interesting or more explainable than the imperfect correlation.

Assuming that the correlation is imperfect, the chance of two partners both being in the top 1% on any characteristic is far smaller than the chance of one partner being in the top 1% and the other being somewhere in the remaining 99%.
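A quick simulation shows the arithmetic at work. The numbers below are illustrative assumptions of mine (equal averages for both spouses and a correlation of 0.4), not data from Kahneman.

```python
# A hedged sketch of the point above: give spouses equal average scores and an
# imperfect correlation, and the husbands of the top-1% wives must, on average,
# score lower than their wives (by arithmetic alone, no causal story needed).
import random

random.seed(1)
N = 100_000
r = 0.4  # assumed imperfect correlation between spouses' scores

wife = [random.gauss(100, 15) for _ in range(N)]
husband = [100 + r * (w - 100) + (1 - r**2) ** 0.5 * random.gauss(0, 15) for w in wife]

top = sorted(range(N), key=lambda i: wife[i])[-N // 100:]  # top 1% of wives
avg_w = sum(wife[i] for i in top) / len(top)
avg_h = sum(husband[i] for i in top) / len(top)
print(f"Top-1% wives, average score:   {avg_w:.1f}")
print(f"Their husbands, average score: {avg_h:.1f}  (regressed toward 100)")
```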

The Cause, Effect, and Treatment

We should be especially wary of the regression to the mean phenomenon when trying to establish causality between two factors. Whenever correlation is imperfect, the best will always appear to get worse and the worst will appear to get better over time, regardless of any additional treatment. This is something that the general media and sometimes even trained scientists fail to recognize.

Consider the example Kahneman gives:

Depressed children treated with an energy drink improve significantly over a three-month period. I made up this newspaper headline, but the fact it reports is true: if you treated a group of depressed children for some time with an energy drink, they would show a clinically significant improvement. It is also the case that depressed children who spend some time standing on their head or hug a cat for twenty minutes a day will also show improvement.

Whenever coming across such headlines it is very tempting to jump to the conclusion that energy drinks, standing on the head or hugging cats are all perfectly viable cures for depression. These cases, however, once again embody the regression to the mean:

Depressed children are an extreme group, they are more depressed than most other children—and extreme groups regress to the mean over time. The correlation between depression scores on successive occasions of testing is less than perfect, so there will be regression to the mean: depressed children will get somewhat better over time even if they hug no cats and drink no Red Bull.

We often mistakenly attribute a specific policy or treatment as the cause of an effect, when the change in the extreme groups would have happened anyway. This presents a fundamental problem: how can we know if the effects are real or simply due to variability?

Luckily there is a way to tell a real improvement from regression to the mean: the introduction of a so-called control group, which is expected to improve by regression alone. The aim of the research is to determine whether the treated group improves more than regression can explain.
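A toy simulation, with assumptions of my own (including a treatment that does nothing at all), shows why the comparison matters: both extreme groups regress, so only the treated group’s improvement beyond the control group’s could count as evidence for the treatment.

```python
# A minimal sketch (assumptions mine): select the most "depressed" children by
# their first score, split them at random, give the treated half a treatment
# with zero effect, and both halves still "improve" by about the same amount.
import random

random.seed(2)
N = 10_000
TREATMENT_EFFECT = 0.0  # assume the "energy drink" does nothing

true_level = [random.gauss(50, 10) for _ in range(N)]
score1 = [t + random.gauss(0, 10) for t in true_level]  # first depression score

# Pick the lowest-scoring tenth and split it at random into two groups
cutoff = sorted(score1)[N // 10]
selected = [i for i in range(N) if score1[i] <= cutoff]
random.shuffle(selected)
treated, control = set(selected[::2]), set(selected[1::2])

score2 = [true_level[i] + random.gauss(0, 10) + (TREATMENT_EFFECT if i in treated else 0.0)
          for i in range(N)]

def mean(idx, scores):
    return sum(scores[i] for i in idx) / len(idx)

print(f"Treated: {mean(treated, score1):.1f} -> {mean(treated, score2):.1f}")
print(f"Control: {mean(control, score1):.1f} -> {mean(control, score2):.1f}")
# Both groups rise by regression alone; only a treated-minus-control difference
# would be evidence the drink did anything.
```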

In real-life situations involving the performance of specific individuals or teams, where the only real benchmark is past performance and no control group can be introduced, the effects of regression can be difficult if not impossible to disentangle. We can compare against the industry average, peers in the cohort group or historical rates of improvement, but none of these are perfect measures.

***

Luckily, awareness of the regression to the mean phenomenon is itself a great first step towards a more careful approach to understanding luck and performance.

If there is anything to be learned from the regression to the mean it is the importance of track records rather than relying on one-time success stories. I hope that the next time you come across an extreme quality in part governed by chance you will realize that the effects are likely to regress over time and will adjust your expectations accordingly.


Do Something Syndrome: Why We Feel Compelled to Act


“We trained hard, but it seemed that every time we were beginning to form into teams we would be reorganized. I was to learn later in life that we tend to meet any new situation by reorganizing, and what a wonderful method it can be for creating the illusion of progress while producing confusion, inefficiency, and demoralization.”

— Roman satirist Petronius Arbiter

***

I was flipping back through one of my favorite books, Seeking Wisdom, when I came across this quote in the section where Bevelin talks about ‘Do Something Syndrome.’

There is something almost poetic in the way that Petronius (27 AD — 66 AD) so succinctly captures a phenomenon that most of us have been through.

There are numerous reasons why someone may choose action over the more logical course of inaction — some conscious and others subconscious. We may, for instance, act on bad advice when we haven’t done the work to understand a problem, we may succumb to peer pressure, the idea that ‘everyone is doing it,’ we may follow our hearts (and buy that fancy car we really want instead of keeping the reliable one we have), blindly follow the lead of an expert, or, perhaps most dangerously, we may simply want to appear like we are doing something.

Maybe we just can’t sit still. This idea isn’t new.

We all have moments where we fall victim to the curse of Do Something Syndrome. In fact, the modern organization is full of do something syndrome. The key is to try and realize when we are doing it and back away.

So next time you feel the urge to do something for the sake of doing something remember what Thoreau said: “It is not enough to be busy; so are the ants. The question is: What are we busy about?”

Don’t confuse activity and results. They are not the same thing. There is no point working hard at something you shouldn’t be doing in the first place.

One of the great advantages in decision making is the ability not to do something just to be active.

Still curious? Most of what you’re going to do today is not essential.

Charlie Munger: Adding Mental Models to Your Mind’s Toolbox

In The Art of War, Sun Tzu said, “The general who wins a battle makes many calculations in his temple before the battle is fought.”

Those ‘calculations’ are the tools we have available to think better. One of the best questions you can ask is how we can make our mental processes work better.

Charlie Munger says that “developing the habit of mastering the multiple models which underlie reality is the best thing you can do.”

Those models are mental models.

They fall into two categories: (1) ones that help us simulate time (and predict the future) and better understand how the world works (e.g., understanding a useful idea such as autocatalysis), and (2) ones that help us better understand how our mental processes lead us astray (e.g., availability bias).

When our mental models line up with reality, they help us avoid problems. When they don’t line up with reality, they cause problems, because we end up believing something that isn’t true.

Your Mind’s Toolbox

In Peter Bevelin’s Seeking Wisdom, he highlights Munger talking about autocatalysis:

If you get a certain kind of process going in chemistry, it speeds up on its own. So you get this marvellous boost in what you’re trying to do that runs on and on. Now, the laws of physics are such that it doesn’t run on forever. But it runs on for a goodly while. So you get a huge boost. You accomplish A – and, all of a sudden, you’re getting A + B + C for awhile.

He continues telling us how this idea can be applied:

Disney is an amazing example of autocatalysis … They had those movies in the can. They owned the copyright. And just as Coke could prosper when refrigeration came, when the videocassette was invented, Disney didn’t have to invent anything or do anything except take the thing out of the can and stick it on the cassette.

***

This leads us to an interesting problem. The world is always changing so which models should we prioritize learning?

How we prioritize our learning has implications beyond the day-to-day. Often we focus on things that change quickly. We chase the latest study, the latest findings, the most recent best-sellers. We do this to keep up-to-date with the latest-and-greatest.

Despite our intentions, learning in this way fails to account for cumulative knowledge. Instead, we consume all of our time keeping up to date.

If we are to prioritize learning, we should focus on things that change slowly.

The models that come from hard science and engineering are the most reliable models on this Earth. And engineering quality control – at least the guts of it that matters to you and me and people who are not professional engineers – is very much based on the elementary mathematics of Fermat and Pascal: It costs so much and you get so much less likelihood of it breaking if you spend this much…

And, of course, the engineering idea of a backup system is a very powerful idea. The engineering idea of breakpoints – that’s a very powerful model, too. The notion of a critical mass – that comes out of physics – is a very powerful model.

After we learn a model we have to make it useful. We have to integrate it into our existing knowledge.

Our world is multi-dimensional and our problems are complicated. Most problems cannot be solved using one model alone. The more models we have, the better able we are to rationally solve problems.

But if we don’t have the models we become the proverbial man with a hammer. To the man with a hammer, everything looks like a nail. If you only have one model you will fit whatever problem you face to the model you have. If you have more than one model, however, you can look at the problem from a variety of perspectives and increase the odds you come to a better solution.

“Since no single discipline has all the answers,” Peter Bevelin writes in Seeking Wisdom, “we need to understand and use the big ideas from all the important disciplines: Mathematics, physics, chemistry, engineering, biology, psychology, and rank and use them in order of reliability.”

Charles Munger illustrates the importance of this:

Suppose you want to be good at declarer play in contract bridge. Well, you know the contract – you know what you have to achieve. And you can count up the sure winners you have by laying down your high cards and your invincible trumps.

But if you’re a trick or two short, how are you going to get the other needed tricks? Well, there are only six or so different, standard methods: You’ve got long-suit establishment. You’ve got finesses. You’ve got throw-in plays. You’ve got cross-ruffs. You’ve got squeezes. And you’ve got various ways of misleading the defense into making errors. So it’s a very limited number of models. But if you only know one or two of those models, then you’re going to be a horse’s patoot in declarer play…

If you don’t have the full repertoire, I guarantee you that you’ll overutilize the limited repertoire you have – including use of models that are inappropriate just because they’re available to you in the limited stock you have in mind.

As for how we can use different ideas, Munger again shows the way …

Have a full kit of tools … go through them in your mind checklist-style … you can never make any explanation that can be made in a more fundamental way in any other way than the most fundamental way. And you always take with full attribution to the most fundamental ideas that you are required to use. When you’re using physics, you say you’re using physics. When you’re using biology, you say you’re using biology.

But ideas alone are not enough. We need to understand how they interact and combine. This leads to lollapalooza effects.

You get lollapalooza effects when two, three or four forces are all operating in the same direction. And, frequently, you don’t get simple addition. It’s often like a critical mass in physics where you get a nuclear explosion if you get to a certain point of mass – and you don’t get anything much worth seeing if you don’t reach the mass.

Sometimes the forces just add like ordinary quantities and sometimes they combine on a break-point or critical-mass basis … More commonly, the forces coming out of … models are conflicting to some extent. And you get huge, miserable trade-offs … So you [must] have the models and you [must] see the relatedness and the effects from the relatedness.

Peter Bevelin: A Few Lessons From Sherlock Holmes

“Peter Bevelin is one of the wisest people on the planet.”
— Nassim Taleb

***

Peter Bevelin‘s first book, Seeking Wisdom: From Darwin to Munger, is one of the best books you’ve never heard of. He’s just released another book, A Few Lessons from Sherlock Holmes (Kindle), aimed at those who want to improve their thinking.

I’m a big fan of Sherlock Holmes, and Peter is not the first person to explore the wisdom that can be drawn from the famous detective.

Maria Konnikova’s book, Mastermind: How To Think Like Sherlock Holmes, takes a deep look at Sherlock Holmes’s methodology to develop the habits of mind that will allow us to mindfully engage the world.


Peter’s book is shorter and encourages you to draw your own conclusions. He’s distilled Arthur Conan Doyle’s famous detective Sherlock Holmes into principles and quotes.

This book will appeal both to Sherlock fans and to those who want to think better. It contains useful and timeless methods and questions applicable to a variety of important issues in life and business. We could all benefit from A Few Lessons from Sherlock Holmes.

Let’s look at some of the lessons Bevelin brings to our attention.

“What distinguishes Holmes from most mortals,” Bevelin writes, “is that he knows where to look and what questions to ask. He pays attention to the important things and he knows where to find them.”

Many ideas over a wide range of disciplines help us gain perspective.

Breadth of view, my dear Mr. Mac, is one of the essentials of our profession. The interplay of ideas and the oblique uses of knowledge are often of extraordinary interest. (Holmes; The Valley of Fear)

The memory attic.

I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things so that he has a difficulty in laying his hands upon it.

Now the skillful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order. It is a mistake to think that that little room has elastic walls and can distend to any extent. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones. (Holmes; A Study in Scarlet)

So says the statistician.

You can, for example, never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. (Holmes; The Sign of Four)

Knowledge doesn’t make us wise.
One of the best things about Peter is how he adds outsiders to the mix. He inserts this quote from Montaigne:

Judgment can do without knowledge but not knowledge without judgment. (Montaigne)

Never jump to conclusions.

I have not all my facts yet, but I do not think there are any insuperable difficulties. Still, it is an error to argue in front of your data. You find yourself insensibly twisting them round to fit your theories. (Holmes; Wisteria Lodge)

Don’t theorize before data.

It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. (Holmes; A Scandal in Bohemia)

Make sure facts are facts.

I realize that if you ask people to account for “facts”, they usually spend more time finding reasons for them than finding out whether they are true. … They skip over the facts but carefully deduce inferences. They normally begin thus: “How does this come about?” But does it do so? That is what they ought to be asking. (Montaigne)

Don’t miss the forest for the trees.

The principal difficulty in your case … lay in the fact of there being too much evidence. What was vital was overlaid and hidden by what was irrelevant. Of all the facts which were presented to us we had to pick just those which we deemed to be essential, and then piece them together in their order, so as to reconstruct this very remarkable chain of events. (Holmes; The Naval Treaty)

Small things may be important.

The smallest point may be the most essential. (Holmes; A Study in Scarlet)

What we see.

What we see is all we think is there — what often leads us astray in an investigation is that we adopt the theory which is most likely to account for the “visible” and found facts. But what if something important is left out? What is not reported, withheld, or hidden?

Take time to think things over.

Sherlock Holmes was a man … who, when he had an unsolved problem upon his mind, would go for days, and even for a week, without rest, turning it over, rearranging his facts, looking at it from every point of view until he had either fathomed it or convinced himself that his data were insufficient. (Dr. Watson; The Man with the Twisted Lip)

***

Peter’s books tend to become very hard to find a few months after they are released. Used editions often sell well above cover price, so if you’re interested, I’d encourage you to order A Few Lessons From Sherlock Holmes today.
