
Using Models to Stay Calm in Charged Situations

When polarizing topics are discussed in meetings, passions can run high and cloud our judgment. Learn how mental models can help you see clearly from this real-life scenario.

***

Mental models can sometimes come across as abstract concepts. They are, however, practical tools you can use to navigate challenging or confusing situations. In this article, we are going to apply our mental models to a common situation: a meeting with conflict.

A recent meeting with the school gave us an opportunity to use our latticework. Anyone with school-age kids has dealt with the bureaucracy of a school system and the other parents who interact with it. Call it what you will, most school environments have some formal interface between parents and the school administration aimed at advancing issues and ideas of importance to the school community.

The particular meeting was an intense one. At issue was the school’s communication around a potentially harmful leak in the heating system. Some parents felt the school had communicated reasonably about the problem and the potential consequences. Others felt their child’s life had been put in danger due to potential exposure to mold and asbestos. Some parents felt the school could have done a better job of soliciting feedback from students about their experiences during the previous week, and others felt the school administration had done a poor job of communicating potential risks to parents.

The first thing you’ll notice if you’re in a meeting like this is that emotions on all sides run high. After some discussion, you might also notice a few recurring patterns in what people say around the table.

These patterns are a good indication that using a few mental models might improve the dynamics of the situation.

The first mental model that is invaluable in situations like this is Hanlon’s Razor: don’t attribute to maliciousness that which is more easily explained by incompetence. (Hanlon’s Razor is one of the 9 general thinking concepts in The Great Mental Models Volume One.) When people feel victimized, they can get angry and lash out in an attempt to fight back against a perceived threat. When people feel accused of serious wrongdoing, they can get defensive and withhold information to protect themselves. Neither of these reactions is useful in a situation like this. Yes, sometimes people intentionally do bad things. But more often than not, bad things are the result of incompetence. In a school meeting situation, it’s safe to assume everyone at the table has the best interests of the students at heart. School staff and administrators usually go into teaching motivated by a deep love of education. They genuinely want their schools to be amazing places of learning, and they devote time and attention to improving the lives of their students.

It makes no sense to assume a school’s administration would deliberately withhold harmful information. Yes, it could happen. But you will obtain more valuable information if you start by assuming poor decisions were the result of incompetence rather than malice.

When we feel people are malicious toward us, we instinctively become a tightly coiled spring, waiting for the right moment to take them down a notch or two. By removing malice from the equation, you give yourself emotional breathing room to work toward better solutions and apply more models.

The next helpful model is relativity, adapted from the laws of physics. This model is about remembering that everyone’s perspective is different from yours. Understanding how others see the same situation can help you move toward a more meaningful dialogue with the people in the meeting. You can do this by looking around the room and asking yourself what is influencing people’s approaches to the situation.

In our school meeting, we see some people are afraid for their child’s health. Others are influenced by past dealings with the school administration. Authorities are worried about closing the school. Teachers are concerned about how missed time might impact their students’ learning. Administrators are trying to balance the needs of parents with their responsibility to follow the necessary procedures. Some parents are stressed because they don’t have childcare when the school closes. There is a lot going on, and relativity gives us a lens for identifying the dynamics affecting communication.

After understanding the different perspectives, it becomes easier to incorporate them into your thinking. You can defuse conflict by acknowledging what you think you hear. Often, just the feeling of being heard will help people start to listen and engage more objectively.

Now you can dive into some of the details. First up is probabilistic thinking. Before we worry about mold levels or sick children, let’s try to identify the base rates. What is the mold content in the air outside? How many children are typically absent due to sickness at this time of year? Reminding people that severity has to be evaluated against a baseline can really help defuse stress and concern. If 10% of the student population is absent on any given day, and in the week leading up to these events 12% to 13% of the population was absent, then we are not actually dealing with a huge statistical anomaly.
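
To make the base-rate check concrete, here is a minimal sketch in Python. The school size, absence rate, and observed count are invented for illustration; the only point is to show how you would test whether an uptick is surprising relative to the base rate.

```python
import random

# A minimal sketch of the base-rate check described above. The numbers
# are hypothetical: a school of 200 students, a typical absence rate of
# 10%, and an observed day on which 25 students (12.5%) were absent.
SCHOOL_SIZE = 200
BASE_RATE = 0.10        # typical share of students absent on a normal day
OBSERVED_ABSENT = 25    # 12.5% of 200

# Simulate many ordinary days and count how often chance alone produces
# an absence count at least as high as the one observed.
TRIALS = 20_000
at_least_as_high = sum(
    sum(random.random() < BASE_RATE for _ in range(SCHOOL_SIZE)) >= OBSERVED_ABSENT
    for _ in range(TRIALS)
)

print(f"P(>= {OBSERVED_ABSENT} absent on a normal day) "
      f"≈ {at_least_as_high / TRIALS:.2f}")
# With these illustrative numbers the probability comes out around 0.14:
# the uptick is well within normal day-to-day variation.
```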

Then you can evaluate the anecdotes with the Law of Large Numbers in mind. Small sample sizes can be misleading. The larger the group you evaluate, the more relevant the conclusions. In a situation such as our school council meeting, small samples only serve to ratchet up the emotion by implying that a handful of isolated cases are the causal outcome of recent events.

In reality, any one-off occurrence can often be explained in multiple ways. One or two children coming home with hives? There are a dozen reasonable explanations for that: allergies, dry skin, a reaction to skin cream, a symptom of an illness unrelated to the school environment, and so on. However, the more children who develop hives, the more statistically likely it is that the cause relates to the one thing all the children have in common: the school environment.

Even then, correlation does not equal causation. It might not be the recent leaky steam pipe; is it exam time? Are there other stressors in the culture? Other contaminants in the environment? The larger your sample size, the more likely you are to obtain relevant information.
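
A quick simulation can make the sample-size point concrete. The 5% background rate of hives and the group sizes below are invented; the takeaway is only that small groups swing wildly by chance while large ones hover near the true rate.

```python
import random

# A minimal sketch of the Law of Large Numbers point, with made-up
# numbers: suppose the true background rate of hives among all
# children is 5% and has nothing to do with the school.
TRUE_RATE = 0.05

def observed_rate(sample_size: int) -> float:
    """Fraction of a random sample showing hives, by chance alone."""
    cases = sum(random.random() < TRUE_RATE for _ in range(sample_size))
    return cases / sample_size

random.seed(1)
# A single classroom of 20 can easily show 0%, 10%, even 15% by luck...
print([round(observed_rate(20), 2) for _ in range(5)])
# ...while a whole school of 500 stays close to the true 5%.
print([round(observed_rate(500), 2) for _ in range(5)])
```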

Finally, you can practice systems thinking and contribute to the discussion by identifying the other components in the system you are all dealing with. After all, a school council is just one part of a much larger system involving governments, school boards, legislators, administrators, teachers, students, parents, and the community. When you put your meeting into the bigger context of the entire system, you can identify the feedback loops: Who is responding to what information, and how quickly does their behavior change? When you do this, you can start to suggest some possible steps and solutions to remedy the situation and improve interactions going forward.

How is the information flowing? How fast does it move? How much time does each recipient have to adjust before receiving more information? Chances are, you aren’t going to know all this at the meeting. So you can ask questions. Does the principal have to get approval from the school board before sending out communications involving risk to students? Can teachers communicate directly with parents? What are the conditions for communicating possible risk? Will speculation increase the speed of a self-reinforcing feedback loop causing panic? What do parents need to know to make an informed decision about the welfare of their child? What does the school need to know to make an informed decision about the welfare of their students?
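
To make the feedback-loop questions concrete, here is a minimal sketch with invented parameters: a reinforcing loop in which each worried parent alarms a few more parents per day, and a balancing loop in which regular, clear updates reassure some of them. The rates and counts are assumptions for illustration, not data.

```python
# A toy model of the "speculation vs. communication" feedback loops.
# All numbers are invented: each worried parent alarms a fraction of
# other parents per day (a reinforcing loop), while official updates
# reassure a fraction of worried parents (a balancing loop).
TOTAL_PARENTS = 400
SPREAD_RATE = 0.30   # newly worried parents per worried parent per day

def simulate(reassure_rate: float, days: int = 10) -> list[int]:
    """Track how many parents are worried each day under a given update policy."""
    worried = 5.0
    history = []
    for _ in range(days):
        newly_worried = SPREAD_RATE * worried * (1 - worried / TOTAL_PARENTS)
        reassured = reassure_rate * worried
        worried = max(0.0, worried + newly_worried - reassured)
        history.append(round(worried))
    return history

print("no updates:   ", simulate(reassure_rate=0.0))
print("daily updates:", simulate(reassure_rate=0.25))
```

Even in this toy model, the faster and clearer the communication (the larger the reassurance rate), the sooner the reinforcing loop levels off.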

In meetings like the one described here, there is no doubt that communication is important. Using the meeting to discuss and debate ways of improving communication so that outcomes are generally better in the future is a valuable use of time.

A school meeting is one practical example of how having a latticework of mental models can be useful. Using mental models can help you defuse some of the emotions that create an unproductive dynamic. They can also help you bring forward valuable, relevant information to assist the different parties in improving their decision-making going forward.

At the very least, you will walk away from the meeting with a much better understanding of how the world works, and you will have gained some strategies you can implement in the future to leverage this knowledge instead of fighting against it.

Finite and Infinite Games: Two Ways to Play the Game of Life

If life is a game, how do you play it? The answer will have a huge impact on your choices, your satisfaction, and how you achieve success.

***

James Carse, the Director of Religious Studies at New York University, wrote a book, Finite and Infinite Games, that explores the difference between approaching life as a game with an end and approaching it as a game that goes on forever. According to Carse, playing to win isn’t nearly as satisfying as playing to keep the game going.

For starters, what do you do after you win a finite game? You have to sign yourself up for another one, and you must find a way to showcase your past winnings. Finite players have to parade around their wealth and status. They need to display the markers of winning they have accumulated so that other players know whom they are dealing with. Carse argues that these players spend their time in the past, because that’s where their winning is.

Infinite players, in contrast, look to the future. Because their goal is to keep the game going, they focus less on what happened, and put more effort into figuring out what’s possible. By playing a single, non-repeatable game, they are unconcerned with the maintenance and display of past status. They are more concerned with positioning themselves to deal effectively with whatever challenges come up.

Thus, how you play the game of life will define the learning you pursue. Finite players need training. Infinite players need education. Why? According to Carse, “to be prepared against surprise is to be trained. To be prepared for surprise is to be educated.” If you play life as a finite game, you train for the rules. If life is instead an infinite game, you focus on being educated to adapt to unknowns.

“What will undo any boundary is the awareness that it is our vision, and not what we are viewing, that is limited.”

Whether you choose the finite or infinite game will also determine how you define success, and what you need to achieve it. Finite players need power. Power gives them the best chance to win in each successive contest. Infinite players need endurance. They need attributes to keep them going. Carse explains, “let us say that where the finite player plays to be powerful, the infinite player plays with strength.”

Ultimately, approaching life as a finite game or infinite game impacts your daily attitude. Carse asserts that “the finite play for life is serious; the infinite play of life joyous.” Considering your life through this frame helps you determine if you are making the right choices to be successful at the kind of game you want to play.

Scarcity: Why Having Too Little Means So Much

“The biggest mistake we make about scarcity is we view it as a physical phenomenon. It’s not.”

We’re busier than ever. The typical inbox is perpetually swelling with messages awaiting attention. Meetings need to be rescheduled because something came up. Our relationships suffer. We don’t spend as much time as we should with those who mean something to us. We have little time for new people; potential friends eventually get the hint and stop proposing ideas for things to do together. Falling behind turns into a vicious cycle.

Does this sound anything like your life?

You have something in common with people who fall behind on their bills, argue Harvard economist Sendhil Mullainathan and Princeton psychologist Eldar Shafir in their book Scarcity: Why Having Too Little Means So Much. The resemblance, they write, is clear.

Missed deadlines are a lot like overdue bills. Double-booked meetings (committing time you do not have) are a lot like bounced checks (spending money you do not have). The busier you are, the greater the need to say no. The more indebted you are, the greater the need to not buy. Plans to escape sound reasonable but prove hard to implement. They require constant vigilance—about what to buy or what to agree to do. When vigilance flags—the slightest temptation in time or in money—you sink deeper.

Some people end up sinking further into debt. Others sink deeper into commitments. The resemblance is striking.

We normally think of time management and money management as distinct problems. The consequences of failing are different: bad time management leads to embarrassment or poor job performance; bad money management leads to fees or eviction. The cultural contexts are different: falling behind and missing a deadline means one thing to a busy professional; falling behind and missing a debt payment means something else to an urban low-wage worker.

What’s common between these situations? Scarcity. “By scarcity,” they write, “we mean having less than you feel you need.”

And what happens when we feel a sense of scarcity? To show us, Mullainathan and Shafir bring us back to the past. Near the end of World War II, the Allies realized they would need to feed a lot of Europeans on the edge of starvation. The question wasn’t where to get the food but, rather, something more technical: what was the best way to start feeding them? Should you begin with normal meals or small quantities that gradually increase? Researchers at the University of Minnesota undertook an experiment with healthy male volunteers in a controlled environment “where their calories were reduced until they were subsisting on just enough food so as not to permanently harm themselves.” The most surprising findings were psychological. The men became completely focused on food in unexpected ways:

Obsessions developed around cookbooks and menus from local restaurants. Some men could spend hours comparing the prices of fruits and vegetables from one newspaper to the next. Some planned now to go into agriculture. They dreamed of new careers as restaurant owners…. When they went to the movies, only the scenes with food held their interest.

“Scarcity captures the mind,” Mullainathan and Shafir write. Starving people have food on their mind to the point of irrationality. But we all act this way when we experience scarcity. “The mind,” they write, “orients automatically, powerfully, toward unfulfilled needs.”

Scarcity is like oxygen. When you don’t need it, you don’t notice it. When you do need it, however, it’s all you notice.

For the hungry, that need is food. For the busy it might be a project that needs to be finished. For the cash-strapped it might be this month’s rent payment; for the lonely, a lack of companionship. Scarcity is more than just the displeasure of having very little. It changes how we think. It imposes itself on our minds.

And when scarcity is taking up your mental cycles and putting your attention on what you lack, you can’t attend to other things. How, for instance, can you learn?

(There was) a school in New Haven that was located next to a noisy railroad line. To measure the impact of this noise on academic performance, two researchers noted that only one side of the school faced the tracks, so the students in classrooms on that side were particularly exposed to the noise but were otherwise similar to their fellow students. They found a striking difference between the two sides of the school. Sixth graders on the train side were a full year behind their counterparts on the quieter side. Further evidence came when the city, prompted by this study, installed noise pads. The researchers found this erased the difference: now students on both sides of the building performed at the same level.

Cognitive load matters. Mullainathan and Shafir believe that scarcity imposes a similar mental tax, impairing our ability to perform well and exercise self-control.

We are all susceptible to “the planning fallacy,” which means that we’re too optimistic about how long it will take to complete a project. Busy people, however, are more vulnerable to this fallacy. Because they are focused on everything they must currently do, they are “more distracted and overwhelmed—a surefire way to misplan.” “The underlying problem,” writes Cass Sunstein in his review for the New York Review of Books, “is that when people tunnel, they focus on their immediate problem; ‘knowing you will be hungry next month does not capture your attention the same way that being hungry today does.’” A behavioral consequence of scarcity is “juggling,” which prevents long-term planning.

When we have abundance, we don’t suffer the same mental depletion. Wealthy people can weather a shock without turning their lives upside down. The mental energy needed to prevail may be substantial, but it will not create a feeling of scarcity.

Imagine a day at work where your calendar is sprinkled with a few meetings and your to-do list is manageable. You spend the unscheduled time by lingering at lunch or at a meeting or calling a colleague to catch up. Now, imagine another day at work where your calendar is chock-full of meetings. What little free time you have must be sunk into a project that is overdue. In both cases time was physically scarce. You had the same number of hours at work and you had more than enough activities to fill them. Yet in one case you were acutely aware of scarcity, of the finiteness of time; in the other it was a distant reality, if you felt it at all. The feeling of scarcity is distinct from its physical reality.

Mullainathan and Shafir sum up their argument:

In a way, our argument in this book is quite simple. Scarcity captures our attention, and this provides a narrow benefit: we do a better job of managing pressing needs. But more broadly, it costs us: we neglect other concerns, and we become less effective in the rest of life. This argument not only helps explain how scarcity shapes our behaviors; it also produces some surprising results and sheds new light on how we might go about managing our scarcity.

In a way, this explains why diets never work.

Scarcity: Why Having Too Little Means So Much goes on to discuss some of the possible ways to mitigate scarcity using defaults and reminders.

Forbes Interview

I was recently interviewed in Forbes.

Shane Parrish is on a mission to make you think, and think better. With over 30,000 subscribers — and that number growing quickly — Shane runs Farnam Street, an intellectual hub of curated “interestingness” that covers topics like human misjudgment, decision making, strategy, and philosophy. He exposes his readers to big ideas from multiple disciplines, adding tools to their problem-solving toolbox that improve decision making.

As an avid reader of FS myself, I recently caught up with Shane to discuss its genesis, how he works through a problem, why he was frustrated with his education, and, of course, Justin Bieber.

Here are two excerpts in particular that I think you’ll enjoy.

The first is about my experience doing an MBA.

Beshore: What was the core problem with your MBA program?

Parrish: There is a big difference between knowing what something is called and understanding. My MBA was all about vocabulary. For me, it was too much memorizing and regurgitating. …

The second one focuses on incorporating mental models into your thinking.

Beshore: How can we use these mental models to improve our decision making?

Parrish: When you come across a difficult decision, you really want to have a double filter that shifts your mind from reactive to rational. The first filter is running through your mental models and determining the factors that govern the situation. If I look at this through the lens of evolution, what do I see? What about supply and demand? What are the incentives?

The second filter is how you might be fooling yourself. What’s happening subconsciously? Am I only looking at a small subset of data? Am I in love with my solution? Am I biased by authority?

One of the added benefits of this approach is that when you make a bad decision, and you will, you now have a mental framework where you can account for your mistake in the future. If you failed to consider something you should have, you can easily identify it and account for it. So you’re always getting incrementally better and, over a long life, those increments will make a huge difference.

Beshore: Can you give a hypothetical example of how this system of thinking might play out?

Parrish: Looking at problems through a single discipline often leads to the wrong conclusion. For example, let’s say you’re the CFO of a textile company. The industry is not profitable and plagued with overcapacity, with only one company posting profits in the last 12 months. Your board is pressing you to demonstrate a credible path to profits.

A salesman shows up at your door one day, offering new processing equipment that is 50 percent more efficient. It will save your company a ton of money, improve margins, and pay for itself within only a few years. You verify his claims and determine that by purchasing this equipment, you’d get a 20 percent pre-tax return on your investment. Should you do it? Most people would say yes, but I’d say no.

If you only look through a financial lens, it seems to make sense. But that’s not going far enough. The second question is, “Where will the savings go?” What will likely happen is you’ll install this machine and either keep prices the same (to improve margins) or lower prices (to gain market share). But the salesman is already on the way to your competitors. Only now, he can say they must install this to stay competitive. The salesman points to your increasing market share or big margins, ensuring your competitor purchases the same equipment. Eventually, everyone has the same equipment and prices plummet. Things go back to the way they were, only now you have more capital invested in the business. But that salesman will be back next year with a newer, improved version.

So from a mental model perspective, we can list some of the ones at play: incentives (the salesman is not your friend), game theory, and the Red Queen Effect (where you have to keep investing more just to hold your position). Of course, I didn’t come up with this myself. Warren Buffett faced a similar dilemma in the early 1980s with Berkshire Hathaway’s unprofitable textile business.
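
To make the game-theory dynamic concrete, here is a minimal sketch with hypothetical numbers (the baseline margin, the savings, and the equipment cost are all invented). It shows why buying the machine looks attractive from each firm’s own seat, yet leaves everyone back where they started with more capital tied up.

```python
# A toy payoff model of the textile-equipment example. The baseline
# margin, the savings, and the equipment cost are invented numbers,
# used only to show the shape of the game.
BASELINE_MARGIN = 10.0   # profit per unit before anyone buys the machine
SAVINGS_PER_UNIT = 5.0   # what the new equipment saves per unit
EQUIPMENT_COST = 100.0   # capital each firm must invest to get it

def margin(you_buy: bool, rival_buys: bool) -> float:
    """Per-unit margin once competition passes any savings through to prices."""
    m = BASELINE_MARGIN + (SAVINGS_PER_UNIT if you_buy else 0.0)
    if rival_buys:
        # The rival's lower costs let them cut prices by the full savings.
        m -= SAVINGS_PER_UNIT
    return m

for you in (False, True):
    for rival in (False, True):
        invested = EQUIPMENT_COST if you else 0.0
        print(f"you buy={you!s:<5} rival buys={rival!s:<5} "
              f"margin={margin(you, rival):5.1f} capital invested={invested:6.1f}")

# When both firms buy, the margin is right back at the baseline 10.0,
# but each has sunk an extra 100.0 of capital: the Red Queen Effect.
```

In this little payoff table, buying looks better no matter what the rival does, which is exactly why every competitor ends up making the same move and the industry as a whole is worse off.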

Avoiding Ignorance

This is a continuation of our discussion of the two types of ignorance.

You can’t deal with ignorance if you can’t recognize its presence. If you’re suffering from primary ignorance, it means you probably failed to consider the possibility of being ignorant or you found ways not to see that you were ignorant.

You’re ignorant and unaware, which is worse than being ignorant and aware.

The best way to avoid this, suggest Joy and Zeckhauser, is to raise self-awareness.

Ask yourself regularly: “Might I be in a state of consequential ignorance here?”

They continue:

If the answer is yes, the next step should be to estimate base rates. That should also be the next step if the starting point is recognized ignorance.

Of all situations such as this, how often has a particular outcome happened? Of course, this is often totally subjective.

and its underpinnings are elusive. It is hard to know what the sample of relevant past experiences has been, how to draw inferences from the experience of others, etc. Nevertheless, it is far better to proceed to an answer, however tenuous, than to simply miss (primary ignorance) or slight (recognized ignorance) the issue. Unfortunately, the assessment of base rates is challenging and substantial biases are likely to enter.

When we don’t recognize ignorance, we drastically underestimate the base rate. When we do recognize ignorance, we face “duelling biases; some will lead to underestimates of base rates and others to overestimates.”

Three biases come into play while estimating base rates: overconfidence, salience, and selection biases.

So we are overconfident in our estimates. We estimate things that are salient – that is, “states with which (we) have some experience or that are otherwise easily brought to mind.” And “there is a strong selection bias to recall or retell events that were surprising or of great consequence.”

Our key lesson is that as individuals proceed through life, they should always be on the lookout for ignorance. When they do recognize it, they should try to assess how likely they are to be surprised—in other words, attempt to compute the base rate. In discussing this assessment, we might also employ the term “catchall” from statistics, to cover the outcomes not specifically addressed.
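
As a toy illustration of computing a base rate with a “catchall” bucket, here is a minimal sketch; the outcome labels and counts are invented for illustration.

```python
from collections import Counter

# A toy illustration of estimating a base rate with a "catchall" bucket.
# The outcome labels and counts below are invented.
past_outcomes = (
    ["as expected"] * 55
    + ["mild surprise"] * 25
    + ["major surprise"] * 12
    + ["disaster"] * 5
    + ["windfall"] * 3
)

counts = Counter(past_outcomes)
total = len(past_outcomes)

# Outcomes we explicitly considered; everything else falls into the catchall.
considered = ["as expected", "mild surprise", "major surprise"]
catchall = total - sum(counts[o] for o in considered)

for outcome in considered:
    print(f"{outcome:>14}: {counts[outcome] / total:.0%}")
print(f"{'catchall':>14}: {catchall / total:.0%}")
```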

It’s incredibly interesting to view literature through the lens of human decision making.

Crime and Punishment is particularly interesting as a study of primary ignorance. Raskolnikov deploys his impressive intelligence to plan the murder, believing, in his ignorance, that he has left nothing to chance. In a series of descriptions not for the squeamish or the faint-hearted, the murderer’s thoughts are laid bare as he plans the deed. We read about his skills in strategic inference and his powers of prediction about where and how he will corner his victim; his tactics at developing complementary skills (What is the precise manner in which he will carry the axe? What strategies will help him avoid detection?) are revealed.

But since Raskolnikov is making decisions under primary ignorance, his determined rationality is tightly “bounded.” He “construct[s] a simplified model of the real situation in order to deal with it; … behaves rationally with respect to this model, [but] such behavior is not even approximately optimal with respect to the real world” (Simon 1957). The second-guessing, fear, and delirium at the heart of Raskolnikov’s thinking as he struggles to gain a foothold in his inner world show the impact of a cascade of Consequential Amazing Developments (CADs), none predicted, none even contemplated. Raskolnikov anticipated an outcome in which he would dispatch the pawnbroker and slip quietly out of her apartment. He could not possibly have predicted that her sister would show up, a characteristic CAD that challenges what Taleb (2012) calls our “illusion of predictability.”

Joy and Zeckhauser argue we can draw two conclusions.

First, we tend to downplay the role of unanticipated events, preferring instead to expect simple causal relationships and linear developments. Second, when we do encounter a CAD, we often counter with knee-jerk, impulsive decisions, the equivalent of Raskolnikov committing a second impetuous murder.

References: Ignorance: Lessons from the Laboratory of Literature (Joy and Zeckhauser).