Category: Thinking

A Few Useful Mental Tools from Richard Feynman

We’ve covered the brilliant physicist Richard Feynman (1918-1988) many times here before. He was a genius. A true genius. But there have been many geniuses — physics has been fortunate to attract some of them — and few of them are as well known as Feynman. Why is Feynman so well known? It’s likely because he had tremendous range outside of pure science, and although he won a Nobel Prize for his work in quantum mechanics, he’s probably best known for other things, primarily his wonderful ability to explain and teach.

This ability was on display in a series of non-technical lectures in 1963, memorialized in a short book called The Meaning of It All: Thoughts of a Citizen-Scientist. The lectures are a wonderful example of how well Feynman’s brain worked outside of physics, talking through basic reasoning and some of the problems of his day.

Particularly useful are a series of “tricks of the trade” he gives in a section called This Unscientific Age. These tricks show Feynman taking the method of thought he learned in pure science and applying it to the more mundane topics most of us have to deal with every day. They’re wonderfully instructive. Let’s check them out.

Mental Tools from Richard Feynman

Before we start, it’s worth noting that Feynman takes pains to mention that not everything needs to be held to scientific accuracy; these tools apply only to matters knowable by trial and error, not to life’s pleasures and pursuits. Let’s start with a deep breath:

Now, that there are unscientific things is not my grief. That’s a nice word. I mean, that is not what I am worrying about, that there are unscientific things. That something is unscientific is not bad; there is nothing the matter with it. It is just unscientific. And scientific is limited, of course, to those things that we can tell about by trial and error. For example, there is the absurdity of the young these days chanting things about purple people eaters and hound dogs, something that we cannot criticize at all if we belong to the old flat foot floogie and a floy floy or the music goes down and around. Sons of mothers who sang about “come, Josephine, in my flying machine,” which sounds just about as modern as “I’d like to get you on a slow boat to China.” So in life, in gaiety, in emotion, in human pleasures and pursuits, and in literature and so on, there is no need to be scientific, there is no reason to be scientific. One must relax and enjoy life. That is not the criticism. That is not the point.

As we enter the realm of “knowable” things in a scientific sense, the first trick has to do with deciding whether someone truly knows their stuff or is mimicking:

The first one has to do with whether a man knows what he is talking about, whether what he says has some basis or not. And my trick that I use is very easy. If you ask him intelligent questions—that is, penetrating, interested, honest, frank, direct questions on the subject, and no trick questions—then he quickly gets stuck. It is like a child asking naive questions. If you ask naive but relevant questions, then almost immediately the person doesn’t know the answer, if he is an honest man. It is important to appreciate that.

And I think that I can illustrate one unscientific aspect of the world which would be probably very much better if it were more scientific. It has to do with politics. Suppose two politicians are running for president, and one goes through the farm section and is asked, “What are you going to do about the farm question?” And he knows right away— bang, bang, bang.

Now he goes to the next campaigner who comes through. “What are you going to do about the farm problem?” “Well, I don’t know. I used to be a general, and I don’t know anything about farming. But it seems to me it must be a very difficult problem, because for twelve, fifteen, twenty years people have been struggling with it, and people say that they know how to solve the farm problem. And it must be a hard problem. So the way that I intend to solve the farm problem is to gather around me a lot of people who know something about it, to look at all the experience that we have had with this problem before, to take a certain amount of time at it, and then to come to some conclusion in a reasonable way about it. Now, I can’t tell you ahead of time what conclusion, but I can give you some of the principles I’ll try to use—not to make things difficult for individual farmers, if there are any special problems we will have to have some way to take care of them,” etc., etc., etc.

That’s a wonderfully useful way to figure out whether someone is Max Planck or the chauffeur.

The second trick regards how to deal with uncertainty:

People say to me, “Well, how can you teach your children what is right and wrong if you don’t know?” Because I’m pretty sure of what’s right and wrong. I’m not absolutely sure; some experiences may change my mind. But I know what I would expect to teach them. But, of course, a child won’t learn what you teach him.

I would like to mention a somewhat technical idea, but it’s the way, you see, we have to understand how to handle uncertainty. How does something move from being almost certainly false to being almost certainly true? How does experience change? How do you handle the changes of your certainty with experience? And it’s rather complicated, technically, but I’ll give a rather simple, idealized example.

You have, we suppose, two theories about the way something is going to happen, which I will call “Theory A” and “Theory B.” Now it gets complicated. Theory A and Theory B. Before you make any observations, for some reason or other, that is, your past experiences and other observations and intuition and so on, suppose that you are very much more certain of Theory A than of Theory B—much more sure. But suppose that the thing that you are going to observe is a test. According to Theory A, nothing should happen. According to Theory B, it should turn blue. Well, you make the observation, and it turns sort of a greenish. Then you look at Theory A, and you say, “It’s very unlikely,” and you turn to Theory B, and you say, “Well, it should have turned sort of blue, but it wasn’t impossible that it should turn sort of greenish color.” So the result of this observation, then, is that Theory A is getting weaker, and Theory B is getting stronger. And if you continue to make more tests, then the odds on Theory B increase. Incidentally, it is not right to simply repeat the same test over and over and over and over, no matter how many times you look and it still looks greenish, you haven’t made up your mind yet. But if you find a whole lot of other things that distinguish Theory A from Theory B that are different, then by accumulating a large number of these, the odds on Theory B increase.

Feynman is talking about Grey Thinking here, the ability to put things on a gradient from “probably true” to “probably false” and how we deal with that uncertainty. He isn’t proposing a method of figuring out absolute, doctrinaire truth.

Another term for what he’s proposing is Bayesian updating — starting with a priori odds, based on earlier understanding, and “updating” the odds of something based on what you learn thereafter. An extremely useful tool.
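The arithmetic behind this updating is simple enough to sketch. Here is a minimal illustration in Python; the prior odds and likelihoods are invented numbers for the sake of the example, not anything from the lectures:

```python
# A minimal sketch of Bayesian updating for Feynman's two-theory example.
# All probabilities below are illustrative assumptions, not measured values.

def update_odds(prior_odds_a, likelihood_a, likelihood_b):
    """Multiply the prior odds (A:B) by the likelihood ratio of an observation."""
    return prior_odds_a * (likelihood_a / likelihood_b)

# Start out "very much more certain" of Theory A: odds of 100:1 in its favor.
odds_a = 100.0

# The test turns "sort of greenish".  Under Theory A (nothing should happen)
# that is very unlikely; under Theory B (it should turn blue) it is merely
# somewhat unlikely.
p_greenish_given_a = 0.01
p_greenish_given_b = 0.30

# Feynman warns that repeating the *same* test doesn't count; each pass here
# stands in for a different, independent test that distinguishes A from B.
for trial in range(3):
    odds_a = update_odds(odds_a, p_greenish_given_a, p_greenish_given_b)
    print(f"after test {trial + 1}: odds A:B = {odds_a:.4f}")
```

With these made-up numbers, three distinguishing observations swing the odds from 100:1 in favor of Theory A to roughly 270:1 in favor of Theory B, which is exactly the accumulation Feynman describes.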

Feynman’s third trick is the realization that as we investigate whether something is true or not, new evidence and new methods of experimentation should show the effect of getting stronger and stronger, not weaker. He uses an excellent example here by analyzing mental telepathy:

I give an example. A professor, I think somewhere in Virginia, has done a lot of experiments for a number of years on the subject of mental telepathy, the same kind of stuff as mind reading. In his early experiments the game was to have a set of cards with various designs on them (you probably know all this, because they sold the cards and people used to play this game), and you would guess whether it’s a circle or a triangle and so on while someone else was thinking about it. You would sit and not see the card, and he would see the card and think about the card and you’d guess what it was. And in the beginning of these researches, he found very remarkable effects. He found people who would guess ten to fifteen of the cards correctly, when it should be on the average only five. More even than that. There were some who would come very close to a hundred percent in going through all the cards. Excellent mind readers.

A number of people pointed out a set of criticisms. One thing, for example, is that he didn’t count all the cases that didn’t work. And he just took the few that did, and then you can’t do statistics anymore. And then there were a large number of apparent clues by which signals inadvertently, or advertently, were being transmitted from one to the other.

Various criticisms of the techniques and the statistical methods were made by people. The technique was therefore improved. The result was that, although five cards should be the average, it averaged about six and a half cards over a large number of tests. Never did he get anything like ten or fifteen or twenty-five cards. Therefore, the phenomenon is that the first experiments are wrong. The second experiments proved that the phenomenon observed in the first experiment was nonexistent. The fact that we have six and a half instead of five on the average now brings up a new possibility, that there is such a thing as mental telepathy, but at a much lower level. It’s a different idea, because, if the thing was really there before, having improved the methods of experiment, the phenomenon would still be there. It would still be fifteen cards. Why is it down to six and a half? Because the technique improved. Now it still is that the six and a half is a little bit higher than the average of statistics, and various people criticized it more subtly and noticed a couple of other slight effects which might account for the results.

It turned out that people would get tired during the tests, according to the professor. The evidence showed that they were getting a little bit lower on the average number of agreements. Well, if you take out the cases that are low, the laws of statistics don’t work, and the average is a little higher than the five, and so on. So if the man was tired, the last two or three were thrown away. Things of this nature were improved still further. The results were that mental telepathy still exists, but this time at 5.1 on the average, and therefore all the experiments which indicated 6.5 were false. Now what about the five? . . . Well, we can go on forever, but the point is that there are always errors in experiments that are subtle and unknown. But the reason that I do not believe that the researchers in mental telepathy have led to a demonstration of its existence is that as the techniques were improved, the phenomenon got weaker. In short, the later experiments in every case disproved all the results of the former experiments. If remembered that way, then you can appreciate the situation.

This echoes Feynman’s dictum about not fooling oneself: We must refine our process for probing and experimenting if we’re to get at real truth, always watching out for little troubles. Otherwise, we torture the world so that results fit our expectations. If we carefully refine and re-test and the effect gets weaker all the time, it’s likely not true, or at least not of the magnitude originally hoped for.
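The “throw away the tired runs” error is easy to demonstrate with a small simulation. This is my own illustration, not anything from the book; the setup assumes the standard 25-card, 5-design deck Feynman alludes to, where pure guessing averages 5 hits per run:

```python
import random

# Sketch of the selection-bias error: with 25 cards of 5 designs, pure
# guessing averages 5 hits per run.  Discarding the low-scoring "tired"
# runs pushes the average above 5 with no telepathy involved at all.

random.seed(0)

def run_session(n_cards=25, n_designs=5):
    """Count correct guesses in one run of pure random guessing."""
    return sum(random.randrange(n_designs) == random.randrange(n_designs)
               for _ in range(n_cards))

scores = [run_session() for _ in range(10_000)]
honest_avg = sum(scores) / len(scores)

# "If the man was tired, the last two or three were thrown away":
# keep only the runs that score at or above chance level.
kept = [s for s in scores if s >= 5]
biased_avg = sum(kept) / len(kept)

print(f"all runs kept:     {honest_avg:.2f}")  # close to 5
print(f"low runs dropped:  {biased_avg:.2f}")  # noticeably above 5
```

The honest average sits at chance; the censored average drifts above it, which is precisely why “the laws of statistics don’t work” once you discard cases after looking at them.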

The fourth trick is to ask the right question, which is not “Could this be the case?” but “Is this actually the case?” Many get so caught up with the former that they forget to ask the latter:

That brings me to the fourth kind of attitude toward ideas, and that is that the problem is not what is possible. That’s not the problem. The problem is what is probable, what is happening. It does no good to demonstrate again and again that you can’t disprove that this could be a flying saucer. We have to guess ahead of time whether we have to worry about the Martian invasion. We have to make a judgment about whether it is a flying saucer, whether it’s reasonable, whether it’s likely. And we do that on the basis of a lot more experience than whether it’s just possible, because the number of things that are possible is not fully appreciated by the average individual. And it is also not clear, then, to them how many things that are possible must not be happening. That it’s impossible that everything that is possible is happening. And there is too much variety, so most likely anything that you think of that is possible isn’t true. In fact that’s a general principle in physics theories: no matter what a guy thinks of, it’s almost always false. So there have been five or ten theories that have been right in the history of physics, and those are the ones we want. But that doesn’t mean that everything’s false. We’ll find out.

The fifth trick is a very, very common one, even 50 years after Feynman pointed it out. You cannot judge the probability of something happening after it’s already happened. That’s cherry-picking. You have to run the experiment forward for it to mean anything:

I now turn to another kind of principle or idea, and that is that there is no sense in calculating the probability or the chance that something happens after it happens. A lot of scientists don’t even appreciate this. In fact, the first time I got into an argument over this was when I was a graduate student at Princeton, and there was a guy in the psychology department who was running rat races. I mean, he has a T-shaped thing, and the rats go, and they go to the right, and the left, and so on. And it’s a general principle of psychologists that in these tests they arrange so that the odds that the things that happen happen by chance is small, in fact, less than one in twenty. That means that one in twenty of their laws is probably wrong. But the statistical ways of calculating the odds, like coin flipping if the rats were to go randomly right and left, are easy to work out.

This man had designed an experiment which would show something which I do not remember, if the rats always went to the right, let’s say. I can’t remember exactly. He had to do a great number of tests, because, of course, they could go to the right accidentally, so to get it down to one in twenty by odds, he had to do a number of them. And it’s hard to do, and he did his number. Then he found that it didn’t work. They went to the right, and they went to the left, and so on. And then he noticed, most remarkably, that they alternated, first right, then left, then right, then left. And then he ran to me, and he said, “Calculate the probability for me that they should alternate, so that I can see if it is less than one in twenty.” I said, “It probably is less than one in twenty, but it doesn’t count.”

He said, “Why?” I said, “Because it doesn’t make any sense to calculate after the event. You see, you found the peculiarity, and so you selected the peculiar case.”

For example, I had the most remarkable experience this evening. While coming in here, I saw license plate ANZ 912. Calculate for me, please, the odds that of all the license plates in the state of Washington I should happen to see ANZ 912. Well, it’s a ridiculous thing. And, in the same way, what he must do is this: The fact that the rat directions alternate suggests the possibility that rats alternate. If he wants to test this hypothesis, one in twenty, he cannot do it from the same data that gave him the clue. He must do another experiment all over again and then see if they alternate. He did, and it didn’t work.
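A toy simulation makes the point concrete. The pattern-hunting below is my own illustration, not the psychologist’s actual experiment: random left/right runs almost always contain some stretch that looks “remarkable” after the fact, and the only honest move is to treat it as a hypothesis and collect fresh data:

```python
import random

# Sketch of the post-hoc probability error: generate random left/right
# choices, notice whatever alternation shows up, then test that
# "discovery" on fresh data.  The run lengths here are illustrative.

random.seed(1)

def rat_runs(n):
    """n independent, fair left/right choices."""
    return [random.choice("LR") for _ in range(n)]

def longest_alternation(seq):
    """Length of the longest strictly alternating stretch (L,R,L,R,...)."""
    best = cur = 1
    for a, b in zip(seq, seq[1:]):
        cur = cur + 1 if a != b else 1
        best = max(best, cur)
    return best

first = rat_runs(20)
pattern = longest_alternation(first)
print("alternation found after the fact:", pattern)

# The honest move: treat the pattern as a *hypothesis* and rerun.
fresh = rat_runs(20)
print("at least as long in fresh data? ", longest_alternation(fresh) >= pattern)
```

Selecting the peculiarity from the same data that suggested it guarantees an impressive-looking number; only the second, independent run tells you anything.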

The sixth trick is one that’s familiar to almost all of us, yet almost all of us forget about every day: The plural of anecdote is not data. We must use proper statistical sampling to know whether or not we know what we’re talking about:

The next kind of technique that’s involved is statistical sampling. I referred to that idea when I said they tried to arrange things so that they had one in twenty odds. The whole subject of statistical sampling is somewhat mathematical, and I won’t go into the details. The general idea is kind of obvious. If you want to know how many people are taller than six feet tall, then you just pick a hundred people out at random, and you see that maybe forty of them are more than six feet, so you guess that maybe forty percent of everybody is. Sounds stupid.

Well, it is and it isn’t. If you pick the hundred out by seeing which ones come through a low door, you’re going to get it wrong. If you pick the hundred out by looking at your friends you’ll get it wrong because they’re all in one place in the country. But if you pick out a way that as far as anybody can figure out has no connection with their height at all, then if you find forty out of a hundred, then, in a hundred million there will be more or less forty million. How much more or how much less can be worked out quite accurately. In fact, it turns out that to be more or less correct to 1 percent, you have to have 10,000 samples. People don’t realize how difficult it is to get the accuracy high. For only 1 or 2 percent you need 10,000 tries.
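Feynman’s “10,000 samples for 1 percent” figure falls out of the textbook standard-error formula for a sampled proportion. A quick sketch, with numbers mirroring his forty-out-of-a-hundred example:

```python
from math import sqrt

# The standard error of a sampled proportion p with n samples is
# sqrt(p * (1 - p) / n).  A rough 95% margin of error is about twice that.

def stderr(p, n):
    return sqrt(p * (1 - p) / n)

p = 0.40  # observed fraction taller than six feet
for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9,}: margin of about +/- {2 * stderr(p, n):.3%}")
```

With 100 samples the margin is nearly 10 percent; it takes 10,000 to get it near 1 percent. Because the error shrinks only as the square root of n, tenfold better accuracy costs a hundredfold more samples, which is Feynman’s point about how hard high accuracy is to come by.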

The last trick is to realize that many errors people make simply come from lack of information. They don’t even know they’re missing the tools they need. This can be a very tough one to guard against — it’s hard to know when you’re missing information that would change your mind — but Feynman gives the simple case of astrology to prove the point:

Now, looking at the troubles that we have with all the unscientific and peculiar things in the world, there are a number of them which cannot be associated with difficulties in how to think, I think, but are just due to some lack of information. In particular, there are believers in astrology, of which, no doubt, there are a number here. Astrologists say that there are days when it’s better to go to the dentist than other days. There are days when it’s better to fly in an airplane, for you, if you are born on such a day and such and such an hour. And it’s all calculated by very careful rules in terms of the position of the stars. If it were true it would be very interesting. Insurance people would be very interested to change the insurance rates on people if they follow the astrological rules, because they have a better chance when they are in the airplane. Tests to determine whether people who go on the day that they are not supposed to go are worse off or not have never been made by the astrologers. The question of whether it’s a good day for business or a bad day for business has never been established. Now what of it? Maybe it’s still true, yes.

On the other hand, there’s an awful lot of information that indicates that it isn’t true. Because we have a lot of knowledge about how things work, what people are, what the world is, what those stars are, what the planets are that you are looking at, what makes them go around more or less, where they’re going to be in the next 2000 years is completely known. They don’t have to look up to find out where it is. And furthermore, if you look very carefully at the different astrologers they don’t agree with each other, so what are you going to do? Disbelieve it. There’s no evidence at all for it. It’s pure nonsense.

The only way you can believe it is to have a general lack of information about the stars and the world and what the rest of the things look like. If such a phenomenon existed it would be most remarkable, in the face of all the other phenomena that exist, and unless someone can demonstrate it to you with a real experiment, with a real test, took people who believe and people who didn’t believe and made a test, and so on, then there’s no point in listening to them.

***

Still Interested? Check out the (short) book: The Meaning of It All: Thoughts of a Citizen-Scientist.

Richard Feynman on Teaching Math to Kids and the Lessons of Knowledge

Legendary scientist Richard Feynman (1918-1988) was famous for his penetrating insight and clarity of thought, renowned not only for the work that garnered him a Nobel Prize but also for his lucid explanations of ordinary things such as why trains stay on the tracks as they go around a curve, how we look for new laws of science, and how rubber bands work.

Feynman knew the difference between knowing the name of something and knowing something, and he was never shy about telling the emperor he had no clothes, as this illuminating example from James Gleick’s book Genius: The Life and Science of Richard Feynman shows.

Educating his children gave him pause as to how the elements of teaching should be employed. By the time his son Carl was four, Feynman was “actively lobbying against a first-grade science book proposed for California schools.”

It began with pictures of a mechanical wind-up dog, a real dog, and a motorcycle, and for each the same question: “What makes it move?” The proposed answer—“Energy makes it move”—enraged him.

That was tautology, he argued—empty definition. Feynman, having made a career of understanding the deep abstractions of energy, said it would be better to begin a science course by taking apart a toy dog, revealing the cleverness of the gears and ratchets. To tell a first-grader that “energy makes it move” would be no more helpful, he said, than saying “God makes it move” or “moveability makes it move.”

Feynman proposed a simple test for whether one is teaching ideas or mere definitions: “Without using the new word which you have just learned, try to rephrase what you have just learned in your own language. Without using the word energy, tell me what you know now about the dog’s motion.”

The other standard explanations were equally horrible: gravity makes it fall, or friction makes it wear out. You didn’t get a pass on learning just because you were a first-grader. Feynman’s explanations not only captured the attention of his audience—from Nobel winners to first-graders—but also offered true knowledge. “Shoe leather wears out because it rubs against the sidewalk and the little notches and bumps on the sidewalk grab pieces and pull them off.” That is knowledge. “To simply say, ‘It is because of friction,’ is sad, because it’s not science.”

Richard Feynman on Teaching

Choosing Textbooks for Grade Schools

In 1964 Feynman made the rare decision to serve on a public commission for choosing mathematics textbooks for California’s grade schools. As Gleick describes it:

Traditionally this commissionership was a sinecure that brought various small perquisites under the table from textbook publishers. Few commissioners— as Feynman discovered— read many textbooks, but he determined to read them all, and had scores of them delivered to his house.

This was the era of new math in children’s textbooks: introducing high-level concepts, such as set theory and nondecimal number systems, into grade school.

Feynman was skeptical of this approach, but rather than simply let it go, he popped the balloon.

He argued to his fellow commissioners that sets, as presented in the reformers’ textbooks, were an example of the most insidious pedantry: new definitions for the sake of definition, a perfect case of introducing words without introducing ideas.

A proposed primer instructed first-graders: “Find out if the set of the lollipops is equal in number to the set of the girls.”

To Feynman this was a disease. It confused without adding precision to the normal sentence: “Find out if there are just enough lollipops for the girls.”

According to Feynman, specialized language should wait until it is needed. (In case you’re wondering, he argued the peculiar language of set theory is rarely, if ever, needed—only in understanding different degrees of infinity—which certainly wasn’t necessary at a grade-school level.)

Feynman convincingly argued this was knowledge of words without actual knowledge. He wrote:

It is an example of the use of words, new definitions of new words, but in this particular case a most extreme example because no facts whatever are given…. It will perhaps surprise most people who have studied this textbook to discover that the symbol ∪ or ∩ representing union and intersection of sets … all the elaborate notation for sets that is given in these books, almost never appear in any writings in theoretical physics, in engineering, business, arithmetic, computer design, or other places where mathematics is being used.

The point became philosophical.

It was crucial, he argued, to distinguish clear language from precise language. The textbooks placed a new emphasis on precise language: distinguishing “number” from “numeral,” for example, and separating the symbol from the real object in the modern critical fashion—pilpul for schoolchildren, it seemed to Feynman. He objected to a book that tried to teach a distinction between a ball and a picture of a ball—the book insisting on such language as “color the picture of the ball red.”

“I doubt that any child would make an error in this particular direction,” Feynman said, adding:

As a matter of fact, it is impossible to be precise … whereas before there was no difficulty. The picture of a ball includes a circle and includes a background. Should we color the entire square area in which the ball image appears all red? … Precision has only been pedantically increased in one particular corner when there was originally no doubt and no difficulty in the idea.

In the real world absolute precision can never be reached and the search for degrees of precision that are not possible (but are desirable) causes a lot of folly.

Feynman had his own ideas for teaching children mathematics.

***

Process vs. Outcome

Feynman proposed that first-graders learn to add and subtract more or less the way he worked out complicated integrals—free to select any method that seems suitable for the problem at hand. A modern-sounding notion was: the answer isn’t what matters, so long as you use the right method. To Feynman no educational philosophy could have been more wrong. The answer is all that does matter, he said.

He listed some of the techniques available to a child making the transition from being able to count to being able to add. A child can combine two groups into one and simply count the combined group: to add 5 ducks and 3 ducks, one counts 8 ducks. The child can use fingers or count mentally: 6, 7, 8. One can memorize the standard combinations. Larger numbers can be handled by making piles—one groups pennies into fives, for example—and counting the piles. One can mark numbers on a line and count off the spaces—a method that becomes useful, Feynman noted, in understanding measurement and fractions. One can write larger numbers in columns and carry sums larger than 10.

To Feynman the standard texts were flawed. The problem

29
+3

was considered a third-grade problem because it involved the concept of carrying. However, Feynman pointed out that most first-graders could easily solve it by counting 30, 31, 32.

He proposed that kids be given simple algebra problems (2 times what plus 3 is 7) and be encouraged to solve them through the scientific method, which is tantamount to trial and error. This, he argued, is what real scientists do.
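That guess-and-check approach is simple enough to sketch. A minimal, hypothetical illustration for “2 times what plus 3 is 7,” with the candidate guesses and helper name my own:

```python
# A sketch of solving "2 times what plus 3 is 7" by trial and error:
# guess, check, and move on, the way Feynman suggested children be allowed to.

def solve_by_trial(check, guesses):
    """Return the first guess that satisfies the check, or None."""
    for g in guesses:
        if check(g):
            return g
    return None

# Try the small whole numbers in turn until one works.
answer = solve_by_trial(lambda x: 2 * x + 3 == 7, range(10))
print(answer)  # 2
```

Any method that reaches the right answer counts; the check function is the only fixed part, which is the spirit of Feynman’s “bag of tricks” over one orthodox method.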

“We must,” Feynman said, “remove the rigidity of thought.” He continued: “We must leave freedom for the mind to wander about in trying to solve the problems…. The successful user of mathematics is practically an inventor of new ways of obtaining answers in given situations. Even if the ways are well known, it is usually much easier for him to invent his own way—a new way or an old way—than it is to try to find it by looking it up.”

It was better in the end to have a bag of tricks at your disposal that could be used to solve problems than one orthodox method. Indeed, part of Feynman’s genius was his ability to solve problems that were baffling others because they were using the standard method to try and solve them. He would come along and approach the problem with a different tool, which often led to simple and beautiful solutions.

***

If you give some thought to how Farnam Street helps you, one of the ways is by adding to your bag of tricks so that you can pull them out when you need them to solve problems. We call these tricks mental models, and they work kind of like Lego — interconnecting and reinforcing one another. The more pieces you have, the more things you can build.

Complement this post with Feynman’s excellent advice on how to learn anything.

Homeostasis and Why We Backslide

At some time or another, we’ve all sought to make big changes. And almost all of us have discovered, after making grand plans, that changing some aspect of our lives or organizations, whether adding a new skill or simply changing an old process, resulted in great backsliding.

Why the disconnect?

As George Leonard discusses in his classic book Mastery, based on his experiences in the patient lifelong practice of Aikido, it’s not necessary to beat ourselves up or derive a complicated psychological explanation.

The problem is due to a very simple mental model that explains how systems are regulated through feedback loops: homeostasis.

Backsliding is a universal experience. Every one of us resists significant change, no matter whether it’s for the worse or for the better. Our body, brain, and behavior have a built-in tendency to stay the same within rather narrow limits, and to snap back when changed—and it’s a very good thing they do. Just think about it: if your body temperature moved up or down by 10 percent, you’d be in big trouble. The same thing applies to your blood-sugar level and to any number of other functions of your body.

This condition of equilibrium, this resistance to change, is called homeostasis. It characterizes all self-regulating systems, from a bacterium to a frog to a human individual to a family to an organization to an entire culture—and it applies to psychological states and behavior as well as to physical functioning.

The simplest example of homeostasis can be found in your home heating system. The thermostat on the wall senses the room temperature; when the temperature on a winter’s day drops below the level you’ve set, the thermostat sends an electrical signal that turns the heater on. The heater completes the loop by sending heat to the room in which the thermostat is located. When the room temperature reaches the level you’ve set, the thermostat sends an electrical signal back to the heater, turning it off, thus maintaining homeostasis. Keeping a room at the right temperature takes only one feedback loop. Keeping even the simplest single-celled organism alive and well takes thousands. And maintaining a human being in a state of homeostasis takes billions of interweaving electrochemical signals pulsing in the brain, rushing along nerve fibers, coursing through the bloodstream. One example: each of us has about 150,000 tiny thermostats in the form of nerve endings close to the surface of the skin that are sensitive to the loss of heat from our bodies, and another sixteen thousand or so a little deeper in the skin that alert us to the entry of heat from without.

An even more sensitive thermostat resides in the hypothalamus at the base of the brain, close to branches of the main artery that brings blood from the heart to the head. This thermostat can pick up even the tiniest change of temperature in the blood. When you start getting cold, these thermostats signal the sweat glands, pores, and small blood vessels near the surface of the body to close down. Glandular activity and muscle tension cause you to shiver in order to produce more heat, and your senses send a very clear message to your brain, leading you to keep moving, to put on more clothes, to cuddle closer to someone, to seek shelter, or to build a fire.

Homeostasis seems to be the rule when it comes to systems, yet we often forget about it, or think we’re not subject to a simple law of nature. But we needn’t totally despair. Homeostasis is often quite positive: it keeps systems alive and well. Our bodies wouldn’t work without it, nor would our social systems.
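The one-feedback-loop thermostat Leonard describes is simple enough to simulate. The numbers below (set point, heating rate, leak rate) are invented for illustration:

```python
# A minimal sketch of a thermostat's single feedback loop: sense the
# temperature, switch the heater, and let the room settle around the
# set point.  All rates and temperatures here are illustrative.

def simulate(setpoint=20.0, outside=5.0, hours=48):
    """Hourly room temperatures under simple on/off thermostat control."""
    temp = outside
    history = []
    for _ in range(hours):
        heater_on = temp < setpoint            # the feedback signal
        temp += 2.0 if heater_on else 0.0      # heater warms the room
        temp -= 0.05 * (temp - outside)        # heat leaks outside
        history.append(temp)
    return history

trace = simulate()
print(f"temperature after {len(trace)} hours: {trace[-1]:.1f}")
```

However far the room starts from the set point, the loop drags it back and holds it in a narrow band, which is exactly the “snap back when changed” behavior homeostasis imposes on bodies and organizations alike.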

Homeostasis in social groups brings additional feedback loops into play. Families stay stable by means of instruction, exhortation, punishment, privileges, gifts, favors, signs of approval and affection, and even by means of extremely subtle body language and facial expressions. Social groups larger than the family add various types of feedback systems. A national culture, for example, is held together by the legislative process, law enforcement, education, the popular arts, sports and games, economic rewards that favor certain types of activity, and by a complex web of mores, prestige markers, celebrity role modeling, and style that relies largely on the media as a national nervous system. Although we might think that our culture is mad for the new, the predominant function of all this—as with the feedback loops in your body—is the survival of things as they are.

The problem is that homeostasis, like natural selection and like life itself, is undirected and does not have a “value system” — it doesn’t keep what’s good and reject what’s bad. It’s just like inertia: a simple algorithm that keeps things in motion as they were.

Let’s say, for instance, that for the last twenty years—ever since high school, in fact—you’ve been almost entirely sedentary. Now most of your friends are working out, and you figure that if you can’t beat the fitness revolution, you’ll join it. Buying the tights and running shoes is fun, and so are the first few steps as you start jogging on the high school track near your house. Then, about a third of the way around the first lap, something terrible happens. Maybe you’re suddenly sick to your stomach. Maybe you’re dizzy. Maybe there’s a strange, panicky feeling in your chest. Maybe you’re going to die. No, you’re going to die.

What’s more, the particular sensations you’re feeling probably aren’t significant in themselves. What you’re really getting is a homeostatic alarm signal—bells clanging, lights flashing. Warning! Warning!  Significant changes in respiration, heart rate, metabolism. Whatever you’re doing, stop doing it immediately. Homeostasis, remember, doesn’t distinguish between what you would call change for the better and change for the worse. It resists all change. After twenty years without exercise, your body regards a sedentary style of life as “normal”; the beginning of a change for the better is interpreted as a threat. So you walk slowly back to your car, figuring you’ll look around for some other revolution to join.

Leonard does provide a few possible solutions, or at least an approach to the homeostasis problem. The good news is that homeostasis isn’t all-powerful; it’s simply a force that we must work with. He offers five ways to approach the issue:

1. Be aware of the way homeostasis works. This might be the most important guideline of all. Expect resistance and backlash. Realize that when the alarm bells start ringing, it doesn’t necessarily mean you’re sick or crazy or lazy or that you’ve made a bad decision in embarking on the journey of mastery. In fact, you might take these signals as an indication that your life is definitely changing—just what you’ve wanted. Of course, it might be that you have started something that’s not right for you; only you can decide. But in any case, don’t panic and give up at the first sign of trouble. You might also expect resistance from friends and family and co-workers. (Homeostasis, as we’ve seen, applies to social systems as well as individuals.) Say you used to struggle out of bed at 7:30 and barely drag yourself to work at 9:00. Now that you’re on a path of mastery, you’re up at 6:00 for a three-mile run, and in the office, charged with energy, at 8:30. You might figure that your co-workers would be overjoyed, but don’t be too sure. And when you get home, still raring to go, do you think that your family will welcome the change? Maybe. Bear in mind that an entire system has to change when any part of it changes. So don’t be surprised if some of the people you love start covertly or overtly undermining your self-improvement. It’s not that they wish you harm, it’s just homeostasis at work.

2. Be willing to negotiate with your resistance to change. So what should you do when you run into resistance, when the red lights flash and the alarm bells ring? Well, you don’t back off, and you don’t bull your way through. Negotiation is the ticket to successful long-term change in everything from increasing your running speed to transforming your organization. The long-distance runner working for a faster time on a measured course negotiates with homeostasis by using pain not as an adversary but as the best possible guide to performance. The change-oriented manager keeps his or her eyes and ears open for signs of dissatisfaction or dislocation, then plays the edge of discontent, the inevitable escort of transformation. The fine art of playing the edge in this case involves a willingness to take one step back for every two forward, sometimes vice versa. It also demands a determination to keep pushing, but not without awareness. Simply turning off your awareness to the warnings deprives you of guidance and risks damaging the system. Simply pushing your way through despite the warning signals increases the possibility of backsliding. You can never be sure exactly where the resistance will pop up. A feeling of anxiety? Psychosomatic complaints? A tendency toward self-sabotage? Squabbles with family, friends, or fellow workers? None of the above? Stay alert. Be prepared for serious negotiations.

3. Develop a support system. You can do it alone, but it helps a great deal to have other people with whom you can share the joys and perils of the change you’re making. The best support system would involve people who have gone through or are going through a similar process, people who can tell their own stories of change and listen to yours, people who will brace you up when you start to backslide and encourage you when you don’t. The path of mastery, fortunately, almost always fosters social groupings. In his seminal book Homo Ludens: A Study of the Play Element in Culture, Johan Huizinga comments upon the tendency of sports and games to bring people together. The play community, he points out, is likely to continue even after the game is over, inspired by “the feeling of being ‘apart together’ in an exceptional situation, of sharing something important, of mutually withdrawing from the rest of the world and rejecting the usual norms.” The same can be said about many other pursuits, whether or not they are formally known as sports—arts and crafts, hunting, fishing, yoga, Zen, the professions, “the office.” And what if your quest for mastery is a lonely one? What if you can find no fellow voyagers on that particular path? At the least, you can let the people close to you know what you’re doing, and ask for their support.

4. Follow a regular practice. People embarking on any type of change can gain stability and comfort through practicing some worthwhile activity on a more or less regular basis, not so much for the sake of achieving an external goal as simply for its own sake. A traveler on the path of mastery is again fortunate, for practice in this sense (as I’ve said more than once) is the foundation of the path itself. The circumstances are particularly happy in case you’ve already established a regular practice in something else before facing the challenge and change of beginning a new one. It’s easier to start applying the principles of mastery to your profession or your primary relationship if you’ve already established a regular morning exercise program. Practice is a habit, and any regular practice provides a sort of underlying homeostasis, a stable base during the instability of change.

5. Dedicate yourself to lifelong learning. We tend to forget that learning is much more than book learning. To learn is to change. Education, whether it involves books, body, or behavior, is a process that changes the learner. It doesn’t have to end at college graduation or at age forty or sixty or eighty, and the best learning of all involves learning how to learn— that is, to change. The lifelong learner is essentially one who has learned to deal with homeostasis, simply because he or she is doing it all the time. The Dabbler, Obsessive, and Hacker are all learners in their own fashion, but lifelong learning is the special province of those who travel the path of mastery, the path that never ends.

Still Interested? Check out the classic (short) book in its entirety: Mastery: The Keys to Success and Long-Term Fulfillment.

The Value of Grey Thinking

One of the most common questions we receive, unsurprisingly, is along the lines of: What one piece of advice would you give to someone who wants to become a better thinker?

The question is kind of cheating. There is, of course, no one thing, and if Farnam Street is a testament to any idea, it’s that you must pull from many disciplines to achieve overall wisdom. No truly great thinker is siloed in a small territory.

But a common experience tends to occur as you rid yourself of ideology and narrowness, as you venture deeper and deeper into unfamiliar territory; and it’s worth thinking about it ahead of time. It goes by many names, but a fair one might be Grey Thinking.

Thinking in Grey

Children love torturing their parents and teachers with the relentless Why? The chain of whys can be endless — Why does the doggy pant? He’s hot. Why? I’m hot and I don’t pant. Yes, but he has fur, and doesn’t sweat. Why does he have fur? To keep him warm. Why don’t I have fur then? OK, that’s enough.

If you’re a parent, you’ve probably had this experience. It’s agitating in the moment, but it’s just a symptom of the child’s view of the world: Something to be explored. Their views are not fixed yet.

As we get older, we start to get rigid. We are forced to take tests with definite answers — A, B, C, or D? How well we do at these determines, to an extent, our position in life. The shortcomings of this system are well documented so we won’t rehash them. But a major symptom of this style of learning, combined with our natural proclivity to land on easily digestible answers, is that we start thinking in rigid categories: War is good. War is bad. Capitalism is good. Capitalism is bad. America is Socialist. America is a Free Market System. We must support our troops. College is useless. College is indispensable.

And so on. These slogans become substitutes for actual understanding, and it’s not as benign as it seems. The slogan isn’t just a shorthand: It replaces thinking for many people, because it’s hard to generate real understanding. As discussed in the Eager to be Wrong piece, it’s a lot easier to land somewhere simple and stay there. It requires less energy.

But the fact is, reality is all grey area. All of it. There are very few black and white answers and no solutions without second-order consequences.

This fundamental truth is easy to grasp in theory and hard to apply in practice, every day. It takes substantial deprogramming to realize that life is all grey, that all of reality lies on a continuum. This is why quantitative and scale-based thinking is so important. But most don’t realize that quantitative thinking isn’t really about math; it’s about the idea that the dose makes the poison.

The dose/poison idea is the opposite of the slippery slope argument favored by the ideologue. It starts with this, and then the whole thing goes to hell. Well, maybe, but not necessarily and not usually. Nearly all things are OK in some dose but not OK in another dose. That is the way of the world, and why almost everything connected to practical reality must be quantified, at least roughly.

This isn’t to say that some things shouldn’t be stamped out, hard and fast. Doing heroin even once is probably a bad idea. But make sure to use the right mental model for the right situation. We can re-frame our slogans above: War is awful but history shows it to be occasionally necessary, and a very complex phenomenon. Capitalism is enormously productive but has many limitations. Some socialist institutions actually work well in a capitalist economy, but pure socialism hasn’t tended to work at all. College has its pluses and minuses; it works for some and not for others. Support for soldiers may carry some conditions. And so on.

If any of these ruffle your feathers, then good. The first step towards thinking in grey is realizing that you hold many of your cherished positions too strongly. Most of practical reality lies outside the realm of mathematical certainty.

Lyndon Johnson

There’s a wonderful series of books on Lyndon Johnson, the 36th President of the United States. By all accounts, LBJ was not someone you’d want marrying into your family. He was a relentless politician, a climber, and a habitual liar, and he treated many people like dirt, including his wife Lady Bird. He also embroiled the country in Vietnam, for which many never forgave him.

On the other hand, LBJ was a deep Southerner who cared deeply about the rights of the poor and the rights of people of color, at a time when few whites did, and even fewer whites in power did. He used his political power to enact Civil Rights legislation that seemingly no one else could get through, and with his Great Society programs he gave millions of poor and elderly people dignity. We basically take both achievements for granted today, but they were an enormous struggle to enact.

LBJ was not popular in his time, though history has been a bit more friendly to him. But the question stands…was he a good guy? Do we admire him or can we barely contain our hatred?

To an ideologue, LBJ fits into some category or another. He’s despicable, and his crimes cannot be made up for. His lies and his personal reputation make him unforgivable. Alternatively, by passing Civil Rights, maybe LBJ is something of a dark hero — a flawed, Batman-like figure who we needed but couldn’t appreciate in his time.

The truth is, of course, in between. He’s all of these things. The problem lies with us, the categorizers. We want to place him somewhere and move on. You may fairly, on balance, think LBJ detracted more than he added. That’s fine. But that’s not what most people want to do — they want to put the black hat or the white hat on him. Villain or hero.

This is a special case of a broader mental phenomenon that we’re doing all the time. This music sucks! This music is the best thing ever created! Yoga is for weirdos. Yoga is the only way to achieve mental peace. 

It’s only once you can begin divorcing yourself from good-and-bad, black-and-white, category-X-or-category-Y thinking that your understanding of reality starts to fit together properly. Putting things on a continuum, assessing the scale of their importance and quantifying their effects, understanding both the good and the bad, is the way to do it. Understanding the other side of the argument better than your own, a theme we hammer on ad nauseam, is the way to do it. Because truth always lies somewhere in between, and the discomfort of being uncertain is preferable to the certainty of being wrong.

It isn’t easy, but it’s not supposed to be.

Architect Matthew Frederick on the Three Levels of Knowing

Three Levels of Knowing

Architect Matthew Frederick draws our attention to the three levels of knowing in 101 Things I Learned in Architecture School.

Simplicity is the world view of the child or uninformed adult, fully engaged in his own experience and happily unaware of what lies beneath the surface of immediate reality.

Complexity characterizes the ordinary adult world view. It is characterized by an awareness of complex systems in nature and society but an inability to discern clarifying patterns and connections.

Informed Simplicity is an enlightened view of reality. It is founded on an ability to discern or create clarifying patterns within complex mixtures. Pattern recognition is a crucial skill for an architect, who must create a highly ordered building amid many competing and frequently nebulous design considerations.

One approach to informed simplicity is a narrow specialization. By immersing yourself in one discipline or field, you can often begin to see things at an informed simplicity level. That is, you understand the variables at play, the probable results, what’s important and what’s not, etc.

Farnam Street takes another approach.

We’re trying to better understand how the world works so we can align ourselves with reality. We become the generalist, with a few big ideas from each discipline that we can combine to understand the forces at play.

However, we can only take you so far. Part of seeing things with informed simplicity means that you’ve done the work and chewed on the complexity yourself. If we gave you the answers – not that we have them – they’d fail you when you needed them, because you wouldn’t understand why they work, when they work, and when they don’t. You have to synthesize for yourself.

At the 2016 Daily Journal Meeting, Charlie Munger commented on this:

Saying you’re in favor of synthesis is like saying you’re in favor of reality. Synthesis is reality because we live in a world with multiple factors involved. Of course, you’ve got to have synthesis to understand the situation when two factors are intertwined. Of course, you want to be good at synthesis.

It’s easy to say you want to be good at synthesis. But it’s not what the reward system of the world pays for. They want extreme specialization. By the way, for most people extreme specialization is the way to succeed. Most people are way better off being a chiropodist than trying to understand a little bit of all the disciplines.

Eager to Be Wrong

“You know what Kipling said? Treat those two impostors just the same — success and failure. Of course, there’s going to be some failure in making the correct decisions. Nobody bats a thousand. I think it’s important to review your past stupidities so you are less likely to repeat them, but I’m not gnashing my teeth over it or suffering or enduring it. I regard it as perfectly normal to fail and make bad decisions. I think the tragedy in life is to be so timid that you don’t play hard enough so you have some reverses.”
— Charlie Munger

***

When was the last time you said to yourself I hope I’m wrong and really meant it?

Have you ever really meant it?

Here’s the thing: in our search for truth we must realize, thinking along two tracks, that we’re frequently led to wrong solutions by the workings of our natural apparatus. Uncertainty is a mentally demanding and, in a certain way, physically demanding process. The brain uses a lot of energy when it has to process conflicting information. To see this for yourself, try reading up on something contentious, like the abortion debate, with a completely open mind to either side (if you can). Pay attention as your brain twists itself into a very uncomfortable state while you explore completely opposing sides of an argument.

This mental pain is called cognitive dissonance and it’s really not that much fun. Charlie Munger calls the process of resolving this dissonance doubt avoidance tendency – the tendency to resolve conflicting information as quickly as possible to return to physical and mental comfort. To get back to your happy zone.

Combine this tendency to resolve doubt with the well-known first conclusion bias (something Francis Bacon knew about long ago), and the logical conclusion is that we land on a lot of wrong answers and stay there because it’s easier.

Let that sink in. We don’t stay there because we’re correct, but because it’s physically easier. It’s a form of laziness.

Don’t believe me? Spend a single day asking yourself this simple question: Do I know this for sure, or have I simply landed on a comfortable spot?

You’ll be surprised how many things you do and believe just because it’s easy. You might not even know how you landed there. Don’t feel bad about it — it’s as natural as breathing. You were wired that way at birth.

But there is a way to attack this problem.

Munger has a dictum that he won’t allow himself to hold an opinion unless he knows the other side of the argument better than that side does. Such an unforgiving approach means that he’s not often wrong. (It sometimes takes many years to show, but posterity has rarely shown him to be way off.) It’s a tough, wise, and correct solution.

It’s still hard, though, and it doesn’t solve the energy-expenditure problem. What can we tell ourselves to encourage that kind of work? The answer would be well known to Darwin: train yourself to be eager to be wrong.

Right to be Wrong

The advice isn’t simply to be open to being wrong, which you’ve probably been told to do your whole life. That’s nice, and correct in theory, but frequently turns into empty words on a page. Simply being open to being wrong allows you to keep the window cracked when confronted with disconfirming evidence — to say Well, I was open to it! and keep on with your old conclusion.

Eagerness implies something more. Eager implies that you actively hope there is real, true, disconfirming information proving you wrong. It implies you’d be more than glad to find it. It implies that you might even go looking for it. And most importantly, it implies that when you do find yourself in error, you don’t need to feel bad about it. You feel great about it! Imagine how much of the world this unlocks for you.

Why be so eager to prove yourself wrong? Well, do you want to be comfortable or find the truth? Do you want to say you understand the world or do you want to actually understand it? If you’re a truth seeker, you want reality the way it is, so you can live in harmony with it.

Feynman wanted reality. Darwin wanted reality. Einstein wanted reality. Even when they didn’t like it. The way to stand on the shoulders of giants is to start the day by telling yourself I can’t wait to correct my bad ideas, because then I’ll be one step closer to reality. 

*** 

Post-script: Make sure you apply this advice to things that matter. As stated above, resolving uncertainty takes great energy. Don’t waste that energy on deciding whether Nike or Reebok sneakers are better. They’re both fine. Pick the ones that feel comfortable and move on. Save your deep introspection for the stuff that matters.