
Arguments Are For Learning, Not Winning

Despite his best efforts and long hours, Nobel Prize-winning physicist and professor Carl Wieman grew frustrated by his inability to teach and his students’ failure to learn.

When I first taught physics as a young assistant professor, I used the approach that is all too common when someone is called upon to teach something. First I thought very hard about the topic and got it clear in my own mind. Then I explained it to my students so that they would understand it with the same clarity I had.

At least that was the theory. But I am a devout believer in the experimental method, so I always measure results. And whenever I made any serious attempt to determine what my students were learning, it was clear that this approach just didn’t work. An occasional student here and there might have understood my beautifully clear and clever explanations, but the vast majority of students weren’t getting them at all.

In a traditional classroom, the teacher stands at the front of the class explaining what is clear in their mind to a group of passive students.

Yet this pedagogical strategy does not improve retention of information from the lecture, deepen understanding of basic concepts, or change beliefs (that is, new information does not change what students believe about how something works).

Alison Gopnik says, “I don’t think there’s any scientist who thinks the way we typically do university courses has anything to do with the best methods for getting people to learn.”

Given that lectures were devised as a means of transferring knowledge from one person to many, you would expect us to check that people actually retain the information they consume.

Wieman mentions three studies, the last of which perfectly emphasizes the disturbing point that passive lectures do not seem to work.

In a final example, a number of times Kathy Perkins and I have presented some non-obvious fact in a lecture along with an illustration, and then quizzed the students 15 minutes later on the fact. About 10 percent usually remember it by then. To see whether we simply had mentally deficient students, I once repeated this experiment when I was giving a departmental colloquium at one of the leading physics departments in the United States. The audience was made up of physics faculty members and graduate students, but the result was about the same—around 10 percent.

Wieman argues these results are likely generic and make a lot of sense if you consider the extremely limited capacity of short-term memory.

The research tells us that the human brain can hold a maximum of about seven different items in its short-term working memory and can process no more than about four ideas at once. Exactly what an “item” means when translated from the cognitive science lab into the classroom is a bit fuzzy. But the number of new items that students are expected to remember and process in the typical hour-long science lecture is vastly greater.

The results were similarly disturbing when students were tested to determine understanding of basic concepts. More instruction wasn’t helping students advance from novice to expert. In fact, the data indicated the opposite: students had more novice-like beliefs after they completed a course than they had when they started.

We’re left with a puzzle about teaching. The teachers, unquestionably experts in their subjects, are not improving the learning outcomes: students are not learning the concepts. How can this be?

Research on learning provides some answers.

Cognitive scientists have spent a lot of time studying what constitutes expert competence in any discipline, and they have found a few basic components. The first is that experts have lots of factual knowledge about their subject, which is hardly a surprise. But in addition, experts have a mental organizational structure that facilitates the retrieval and effective application of their knowledge. Third, experts have an ability to monitor their own thinking (“metacognition”), at least in their discipline of expertise. They are able to ask themselves, “Do I understand this? How can I check my understanding?”

A traditional science instructor concentrates on teaching factual knowledge, with the implicit assumption that expert-like ways of thinking about the subject come along for free or are already present. But that is not what cognitive science tells us. It tells us instead that students need to develop these different ways of thinking by means of extended, focused mental effort. Also, new ways of thinking are always built on the prior thinking of the individual, so if the educational process is to be successful, it is essential to take that prior thinking into account.

This is basic biology. Everything that constitutes “understanding” science and “thinking scientifically” resides in the long-term memory, which is developed via the construction and assembly of component proteins. So a person who does not go through this extended mental construction process simply cannot achieve mastery of a subject.

This reminds me a lot of what Charlie Munger said on mental models:

What is elementary, worldly wisdom? Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.

You’ve got to have models in your head. And you’ve got to array your experience both vicarious and direct on this latticework of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You’ve got to hang experience on a latticework of models in your head.

What are the models? Well, the first rule is that you’ve got to have multiple models because if you just have one or two that you’re using, the nature of human psychology is such that you’ll torture reality so that it fits your models, or at least you’ll think it does…

It’s like the old saying, “To the man with only a hammer, every problem looks like a nail.”

Students are not learning the basic concepts that experts rely on to organize and apply information. And they are not being aided in developing the mental framework – the latticework – they need to improve retrieval and application of knowledge. “So it makes perfect sense,” Wieman writes, “that they are not learning to think like experts, even though they are passing science courses by memorizing facts and problem-solving recipes.”

Improved Teaching and Learning

A lot of educational and cognitive research can be reduced to this basic principle: People learn by creating their own understanding. But that does not mean they must or even can do it without assistance. Effective teaching facilitates that creation by getting students engaged in thinking deeply about the subject at an appropriate level and then monitoring that thinking and guiding it to be more expert-like.

So what are a few examples of these strategies, and how do they reflect our increasing understanding of cognition?

Reducing Cognitive Load

The first way in which one can use research on learning to create better classroom practices addresses the limited capacity of the short-term working memory. Anything one can do to reduce cognitive load improves learning. The effective teacher recognizes that giving the students material to master is the mental equivalent of giving them packages to carry. With only one package, they can make a lot of progress in a hurry. If they are loaded down with many, they stagger around, have a lot more trouble, and can’t get as far. And when they experience the mental equivalent of many packages dumped on them at once, they are squashed flat and can’t learn anything.

So anything the teacher can do to reduce that cognitive load while presenting the material will help. Some ways to do so are obvious, such as slowing down. Others include having a clear, logical, explicit organization to the class (including making connections between different ideas presented and connections to things the students already know), using figures where appropriate rather than relying only on verbal descriptions, and minimizing the use of technical jargon. All these things reduce unnecessary cognitive demands and result in more learning.

Addressing Beliefs

A second way teachers can improve instruction is by recognizing the importance of student beliefs about science. This is an area my own group studies. We see that the novice/expert-like beliefs are important in a variety of ways—for example they correlate with content learning and choice of major. However, our particular interest is how teaching practices affect student beliefs. Although this is a new area of research, we find that with rather minimal interventions, a teacher can avoid the regression mentioned above.

The particular intervention we have tried addresses student beliefs by explicitly discussing, for each topic covered, why this topic is worth learning, how it operates in the real world, why it makes sense, and how it connects to things the student already knows. Doing little more than this eliminates the usual significant decline and sometimes results in small improvements, as measured by our surveys. This intervention also improves student interest, because the beliefs measured are closely linked to that interest.

Stimulating and Guiding Thinking

My third example of how teaching and learning can be improved is by implementing the principle that effective teaching consists of engaging students, monitoring their thinking, and providing feedback. Given the reality that student-faculty interaction at most colleges and universities is going to be dominated by time together in the classroom, this means the teacher must make this happen first and foremost in the classroom.

To do this effectively, teachers must first know where the students are starting from in their thinking, so they can build on that foundation. Then they must find activities that ensure that the students actively think about and process the important ideas of the discipline. Finally, instructors must have mechanisms by which they can probe and then guide that thinking on an ongoing basis. This takes much more than just mastery of the topic—it requires, in the memorable words of Lee Shulman, “pedagogical content knowledge.”

Arguments Are For Learning, Not Winning

Is arguing the path towards learning?

I assign students to groups the first day of class (typically three to four students in adjacent seats) and design each lecture around a series of seven to 10 clicker questions that cover the key learning goals for that day. The groups are told they must come to a consensus answer (entered with their clickers) and be prepared to offer reasons for their choice. It is in these peer discussions that most students do the primary processing of the new ideas and problem-solving approaches. The process of critiquing each other’s ideas in order to arrive at a consensus also enormously improves both their ability to carry on scientific discourse and to test their own understanding.

An Ancient Lesson on Taking Responsibility For Decisions

“A decision is responsible,” wrote Charles Frankel, “when the man or group that makes it has to answer for it to those who are directly or indirectly affected by it.”

Think about that for a second.

How often does that happen today? Not very often.

In most organizations people don’t make decisions — committees do. Responsibility is diffused to a group, not the individual. Everyone is insulated from their mistakes. Everyone takes credit for success.

The ancients had a way around this. Consider Hammurabi’s Code:

If a builder builds a house for a man and does not make its construction firm, and the house which he has built collapses and causes the death of the owner of the house, that builder shall be put to death.

While extreme, that is the best risk-management rule ever. If you have the upside, you have to keep the downside.

The Roman System

The Romans had a similar system.

The guy who created the arch stood under it as the scaffolding was removed. And to some extent, we do the same thing today. No one packs your parachute for you.

Charlie Munger, the partner of Warren Buffett at Berkshire Hathaway, puts it another way:

Another thing that is never discussed any more is my idea of one of the great philosophers of America who was Charlie Frankel. He was mugged to death in due course because, after all, he lived in Manhattan in a different time. Before he was mugged to death, he created this philosophy of responsibility. He said the system is responsible in proportion to the degree that the people who make the decisions bear the consequences.

So to Charlie Frankel, you don’t create a loan system where all the people who make the loans promptly dump them on somebody else through lies and twaddle, and they don’t bear the responsibility when the loans are good or bad. To Frankel, that is amoral, that is an irresponsible system. That is like selling an automobile with bad brakes and you know the brakes are bad. You shouldn’t do it.

We’ve gotten away from responsibility for our decisions, which allows people to capture all the upside while bearing none of the downside. Is it any wonder things go wrong?

How can organizations make better decisions? Make people stand under their own arches. One effective approach is to have the person responsible for a decision sign their name to it. Simple and effective, but not easy to put into practice.

Insensitivity To Base Rates: An Introduction

Our insensitivity to base rates stems from the representativeness heuristic and is a common psychological bias.

From Smart Choices: A Practical Guide to Making Better Decisions:

Donald Jones is either a librarian or a salesman. His personality can best be described as retiring. What are the odds that he is a librarian?

When we use this little problem in seminars, the typical response goes something like this: “Oh, it’s pretty clear that he’s a librarian. It’s much more likely that a librarian will be retiring; salesmen usually have outgoing personalities. The odds that he’s a librarian must be at least 90 percent.” Sounds good, but it’s totally wrong.

The trouble with this logic is that it neglects to consider that there are far more salesmen than male librarians. In fact, in the United States, salesmen outnumber male librarians 100 to 1. Before you even considered the fact that Donald Jones is “retiring,” therefore, you should have assigned only a 1 percent chance that Jones is a librarian. That is the base rate.

Now, consider the characteristic “retiring.” Suppose half of all male librarians are retiring, whereas only 5 percent of salesmen are. That works out to 10 retiring salesmen for every retiring librarian — making the odds that Jones is a librarian closer to 10 percent than to 90 percent. Ignoring the base rate can lead you wildly astray.
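
The arithmetic behind that correction is worth making explicit. Here is a minimal sketch of the Bayes calculation in Python, using only the numbers given in the passage above:

```python
# Numbers from the passage: salesmen outnumber male librarians 100 to 1,
# half of male librarians are "retiring," and 5 percent of salesmen are.
p_librarian = 1 / 101          # the base rate: roughly 1 percent
p_salesman = 100 / 101

p_retiring_given_librarian = 0.50
p_retiring_given_salesman = 0.05

# Bayes' rule: P(librarian | retiring)
numerator = p_librarian * p_retiring_given_librarian
evidence = numerator + p_salesman * p_retiring_given_salesman
posterior = numerator / evidence

print(f"P(librarian | retiring) = {posterior:.2f}")  # about 0.09
```

Even a trait ten times more common among librarians cannot overcome a 100-to-1 base rate; it moves the odds from about 1 percent to about 9 percent, nowhere near the intuitive 90.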

* * *

Charlie Munger instructs us how to think about base rates with an example of an employee caught stealing, who claims she’s never done it before and will never do it again:

You find an isolated example of a little old lady in the See’s Candy Company, one of our subsidiaries, getting into the till. And what does she say? “I never did it before, I’ll never do it again. This is going to ruin my life. Please help me.” And you know her children and her friends, and she’d been around 30 years and standing behind the candy counter with swollen ankles. When you’re an old lady it isn’t that glorious a life. And you’re rich and powerful and there she is: “I never did it before, I’ll never do it again.” Well how likely is it that she never did it before? If you’re going to catch 10 embezzlements a year, what are the chances that any one of them — applying what Tversky and Kahneman called base rate information — will be somebody who only did it this once? And the people who have done it before and are going to do it again, what are they all going to say? Well in the history of the See’s Candy Company they always say, “I never did it before, and I’m never going to do it again.” And we cashier them. It would be evil not to, because terrible behavior spreads (Gresham’s Law).
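
Munger’s question (“what are the chances that any one of them will be somebody who only did it this once?”) is the same base-rate calculation in disguise. The quote gives no numbers, so the ones in this sketch are invented purely for illustration: suppose half of all embezzlers are genuine one-timers, but a repeat offender steals twenty times for every theft by a one-timer.

```python
# Hypothetical shares and offense rates -- invented for illustration.
one_timer_share, repeater_share = 0.5, 0.5
thefts_per_one_timer, thefts_per_repeater = 1, 20

# A caught embezzlement is a draw weighted by how often each type steals.
one_timer_thefts = one_timer_share * thefts_per_one_timer
repeater_thefts = repeater_share * thefts_per_repeater

p_first_offense = one_timer_thefts / (one_timer_thefts + repeater_thefts)
print(f"P(caught thief truly 'never did it before') = {p_first_offense:.0%}")  # ~5%
```

Because repeaters generate most of the thefts, they also generate most of the catches; on base rates alone, almost every “I never did it before” is false.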

* * *

Max Bazerman, in Judgment in Managerial Decision Making, writes:

[Our tendency to ignore base rates] is even stronger when the specific information is vivid and compelling, as Kahneman and Tversky illustrated in one study from 1972. Participants were given a brief description of a person who enjoyed puzzles and was both mathematically inclined and introverted. Some participants were told that this description was selected from a set of seventy engineers and thirty lawyers. Others were told that the description came from a list of thirty engineers and seventy lawyers. Next, participants were asked to estimate the probability that the person described was an engineer. Even though people admitted that the brief description did not offer a foolproof means of distinguishing lawyers from engineers, most tended to believe the description was of an engineer. Their assessments were relatively impervious to differences in base rates of engineers (70 percent versus 30 percent of the sample group).

Participants do use base-rate data correctly when no other information is provided. In the absence of a personal description, people use the base rates sensibly and believe that a person picked at random from a group made up mostly of lawyers is most likely to be a lawyer. Thus, people understand the relevance of base-rate information, but tend to disregard such data when individuating data are also available.

Ignoring base rates has many unfortunate implications. … Similarly, unnecessary emotional distress is caused in the divorce process because of the failure of couples to create prenuptial agreements that facilitate the peaceful resolution of a marriage. The suggestion of a prenuptial agreement is often viewed as a sign of bad faith. However, in far too many cases, the failure to create prenuptial agreements occurs when individuals approach marriage with the false belief that the high base rate for divorce does not apply to them.
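
How far should the flipped base rate have moved the estimates in the engineers-and-lawyers study above? The study doesn’t report how diagnostic the description was, so the likelihood ratio of 4 in this sketch is a made-up stand-in (assume the “puzzles, math, introverted” sketch is four times as likely to describe an engineer as a lawyer):

```python
def p_engineer(prior: float, likelihood_ratio: float) -> float:
    """Posterior P(engineer | description) via Bayes' rule in odds form."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

LR = 4.0  # hypothetical: the description fits engineers 4x better

print(f"{p_engineer(0.70, LR):.2f}")  # ~0.90 with 70 engineers / 30 lawyers
print(f"{p_engineer(0.30, LR):.2f}")  # ~0.63 with 30 engineers / 70 lawyers
```

A judgment that respects the base rate should drop by almost thirty points when the group composition flips; the participants’ estimates barely moved at all.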

* * *

Of course, this applies to investing as well. A conversation with Sanjay Bakshi speaks to the point:

One of the great lessons from studying history is to do with “base rates”. “Base rate” is a technical term for describing odds in terms of prior probabilities. The base rate of having a drunken-driving accident is higher than that of having an accident while sober.

So, what’s the base rate of investing in IPOs? When you buy a stock in an IPO, and if you flip it, you make money if it’s a hot IPO. If it’s not a hot IPO, you lose money. But what’s the base rate – the averaged out experience – the prior probability of the activity of subscribing for IPOs – in the long run?

If you do that calculation, you’ll find that the base rate of IPO investing (in fact, it’s not even investing … it’s speculating) sucks! [T]hat’s the case, not just in India, but in every market, in different time periods.

[…]

When you evaluate whether smoking is good for you or not, if you look at the average experience of 1,000 smokers and compare them with 1,000 non-smokers, you’ll see what happens.

People don’t do that. They get influenced by individual stories like a smoker who lived till he was 95. Such a smoker will force many people to ignore base rates, and to focus on his story, to fool themselves into believing that smoking can’t be all that bad for them.

What is the base rate of investing in leveraged companies in bull markets?

[…]

This is what you learn by studying history. You know that the base rate of investing in an airline business sucks. There’s this famous joke about how to become a millionaire. You start with a billion, and then you buy an airline. That applies very well in this business. It applies in so many other businesses.

Take the paper industry as an example. Averaged-out returns on capital for the paper industry are bad for pretty good reasons. You are selling a commodity. It’s an extremely capital-intensive business. There’s a lot of over-capacity. And if you understand microeconomics, you really are a price taker. There’s no pricing power for you. Extreme competition in such an environment is going to cause your returns on capital to be below what you would want to have.

It’s not hard to figure this out (although I took a while to figure it out myself). Look at the track record of paper companies around the world, and the airline companies around the world, or the IPOs around the world, or the textile companies around the world. Sure, there’ll be exceptions. But we need to focus on the average experience and not the exceptional ones. The metaphor I like to use here is that of a pond. You are the fisherman. If you want to catch a lot of fish, then you must go to a pond where there’s a lot of fish. You don’t want to go to fish in a pond where there’s very little fish. You may be a great fisherman, but unless you go to a pond where there’s a lot of fish, you are not going to find a lot of fish.

[…]

So one of the great lessons from studying history is to see what has really worked well and what has turned out to be a disaster – and to learn from both.
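
Bakshi’s “averaged out experience” is just an expected value taken over the whole pond rather than over the memorable catches. A minimal sketch with invented numbers (not market data) shows why the vivid stories and the base rate can point in opposite directions:

```python
# A hypothetical pond of 100 IPO flips: most lose a little, some lose
# a lot, and a handful win big. Returns are invented for illustration.
returns = [-0.10] * 80 + [-0.30] * 15 + [2.00] * 5

base_rate = sum(returns) / len(returns)                   # averaged-out experience
story_rate = sum(r > 0 for r in returns) / len(returns)   # vivid winners

print(f"average return (base rate): {base_rate:+.2%}")  # -2.50%
print(f"share of winning stories:   {story_rate:.0%}")  # 5%
```

Five spectacular winners supply all the anecdotes, yet the averaged-out experience is negative. That gap between the story and the base rate is exactly what studying the whole history of an industry is meant to expose.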

* * *

Bias from Insensitivity To Base Rates is part of the Farnam Street Latticework of Mental Models.

Charlie Munger: How to Teach Business School

From Charlie Munger at the 2011 Berkshire Hathaway Shareholders Meeting:

Costco of course is a business that became the best in the world in its category. And it did it with an extreme meritocracy, and an extreme ethical duty—self-imposed to take all its cost advantages as fast as it could accumulate them and pass them on to the customers. And of course they’ve created ferocious customer loyalty. It’s been a wonderful business to watch—and of course strange things happen when you do that and when you do that long enough. Costco has one store in Korea that will do over $400 million in sales this year. These are figures that can’t exist in retail, but of course they do. So that’s an example of somebody having the right managerial system, the right personnel solution, the right ethics, the right diligence, etcetera, etcetera. And that is quite rare. If once or twice in your lifetime you’re associated with such a business you’re a very lucky person.

The more normal business is a business like, say, General Motors, which became the most successful business of its kind in the world and wiped out its common shareholders… what, last year? That is a very interesting story—and if I were teaching business school I would have Value-Line-type figures that took me through the entire history of General Motors and I would try to relate the changes in the graph and data to what happened in the business. To some extent, they faced a really difficult problem—heavily unionized business, combined with great success, and very tough competitors that came up from Asia and elsewhere in Europe. That is a real problem which of course… to prevent wealth from killing people—your success turning into a disadvantage—is a big problem in business.

And so there are all these wonderful lessons in those graphs. I don’t know why people don’t do it. The graphs don’t even exist that I would use to teach. I can’t imagine anybody being dumb enough not to have the kind of graphs I yearn for. [Laughter] But so far as I know there’s no business school in the country that’s yearning for these graphs. Partly the reason they don’t want it is if you taught a history of business this way, you’d be trampling on the territories of all the professors and sub-disciplines—you’d be stealing some of their best cases. And in bureaucracies, even academic bureaucracies, people protect their own turf. And of course a lot of that happened at General Motors. [Applause]

I really think the world … that’s the way it should be taught. Harvard Business School once taught it much that way—and they stopped. And I’d like to make a case study as to why they stopped. [Laughter] I think I can successfully guess. It’s that the course of history of business trampled on the territory of barons of other disciplines like the baron of marketing, the baron of finance, the baron of whatever.

IBM is an interesting case. There’s just one after another that are just utterly fascinating. I don’t think they’re properly taught at all because nobody wants to do the full sweep.

Charlie Munger: Bad Morals Drive Out The Good

Charlie Munger on applying Gresham’s Law.

The idiotic ideas are all from the social science department and I would put economics in the social sciences department although it has some tinges of reality that remind you of arts and science.

In economics textbooks they teach you Gresham’s Law: Bad money drives out good. But we don’t have any bad money that amounts to anything. We don’t have any coins that are worth a lot, that have precious metals that you can melt down. Nobody cares what the melt-down value of the quarter is in relationship to the dime, so Gresham’s Law is a non-starter in the modern world. Bad money drives out good. But the new form of Gresham’s Law is ungodly important. The new form of Gresham’s Law is brought into play – in economic thought, anyway – in the savings and loans crisis, when it was perfectly obvious that bad lending drives out good. Think of how powerful that model is. Think of the disaster that it creates for everybody. You sit there in your little institution. All of the builders [are not good credits anymore], and you are in the business of lending money to builders. Unless you do the same idiotic thing [as] Joe Blow is doing down the street. Pete Johnson up the street wants to do something a little dumber and the thing just goes to a mighty tide. You’ve got to shrink the business that you love and maybe lay off the employees who have trusted you their careers and so forth or [make] a lot of dumb loans. At Berkshire Hathaway we try and let the place shrink. We never fire anybody, we tell them to go out and play golf. We sure as hell don’t want to make any dumb loans. But that is very hard to do if you sit in a leadership position in society with people you helped recruit, you meet their wives and children and so forth. The bad loans drive out the good.

It isn’t just bad loans. Bad morals drive out the good. If you want to run a check-cashing agency in [a] downtown big city, more than 100 percent of all the profit you could possibly earn can only be earned by flim-flamming people on the finance contracts. So if you aren’t willing to cheat people – basically minorities – more than 100 percent of the profit can’t be earned. Well, if you inherited the business or your idiot son-in-law is in it, you don’t know what else to do. This is what I would call an adult problem and most people solve it in the adult fashion: They learn to tolerate the cheating. But that is not the right answer to people who want to live a larger and better life. But it is a form of Gresham’s Law, the new Gresham’s Law. One that is not taught in economics courses and should be. It is a really serious problem and, of course, it relates deeply to what happened to create the economic crisis. All kinds of people who you would be glad to have marry into your family compared to what you are otherwise going to get did things that were very regrettable under these pressures from the new Gresham’s Law.

Jeff Bezos on Why People Who Are Often Right Change Their Minds Often

Jeff Bezos recently stopped by the office of 37signals. After talking product strategy, he answered some questions.

In his answer to one question he shared some thoughts on people who were “right a lot.”

He said people who were right a lot of the time were people who often changed their minds. He doesn’t think consistency of thought is a particularly positive trait. It’s perfectly healthy — encouraged, even — to have an idea tomorrow that contradicts your idea today.

He’s observed that the smartest people are constantly revising their understanding, reconsidering a problem they thought they’d already solved. They’re open to new points of view, new information, new ideas, contradictions, and challenges to their own way of thinking.

This doesn’t mean you shouldn’t have a well-formed point of view, but it means you should treat that point of view as temporary.

What trait signified someone who was wrong a lot of the time? An obsession with details that support only one point of view. Someone who can’t climb out of the details and see the bigger picture from multiple angles is wrong most of the time.

Bezos isn’t alone. Warren Buffett’s longtime business partner Charlie Munger captures the same idea:

If Berkshire has made modest progress, a good deal of it is because Warren and I are very good at destroying our own best-loved ideas. Any year that you don’t destroy one of your best-loved ideas is probably a wasted year.

John Kenneth Galbraith put it this way:

Faced with the choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy on the proof.

If you liked this, you’ll love:

How to Change How We Think — In the end, changing how we think — that is, our thought patterns — becomes about changing the language we use for internal and external communication.

Multitasking: Giving the World an Advantage it Shouldn’t Have — “I think when you multi-task so much, you don’t have time to think about anything deeply. You’re giving the world an advantage you shouldn’t do. Practically everybody is drifting into that mistake.”