The OODA Loop: How Fighter Pilots Make Fast and Accurate Decisions

The OODA Loop is a four-step process for making effective decisions in high-stakes situations. It involves collecting relevant information, recognizing potential biases, deciding, and acting, then repeating the process with new information. Read on to learn how to use the OODA Loop.

When we want to learn how to make rational decisions under pressure, it can be helpful to look at the techniques people use in extreme situations. If they work in the most drastic scenarios, they have a good chance of being effective in more typical ones.

Because they’re developed and tested in the relentless laboratory of conflict, military mental models have practical applications far beyond their original context. If they didn’t work, they would be quickly replaced by alternatives. Military leaders and strategists invest a great deal of time and resources into developing decision-making processes.

One such military mental model is the OODA Loop. Developed by strategist and U.S. Air Force Colonel John Boyd, the OODA Loop is a practical concept designed to function as the foundation of rational thinking in confusing or chaotic situations. “OODA” stands for “Observe, Orient, Decide, and Act.”

“What is strategy? A mental tapestry of changing intentions for harmonizing and focusing our efforts as a basis for realizing some aim or purpose in an unfolding and often unforeseen world of many bewildering events and many contending interests.” —John Boyd

***

The four parts of the OODA Loop

Let’s break down the four parts of the OODA Loop and see how they fit together.

Don’t forget the “Loop” part. The process is intended to be repeated again and again until a conflict finishes. Each repetition provides more information to inform the next one, making it a feedback loop.

1: Observe

Step one is to observe the situation with the aim of building the most accurate and comprehensive picture of it possible.

For example, a fighter pilot might consider the following factors in a broad, fluid way:

  • What is immediately affecting me?
  • What is affecting my opponent?
  • What could affect either of us later on?
  • Can I make any predictions?
  • How accurate were my prior predictions?

Information alone is insufficient. The observation stage requires converting information into an overall picture with overarching meaning that places it in context. A particularly vital skill is the capacity to identify which information is just noise and irrelevant for the current decision.

If you want to make good decisions, you need to master the art of observing your environment. For a fighter pilot, that involves factors like the weather conditions and what their opponent is doing. In your workplace, that might include factors like regulations, available resources, relationships with other people, and your current state of mind.

To give an example, consider a doctor meeting with a patient in the emergency room for the first time to identify how to treat them. Their first priority is figuring out what information they need to collect, then collecting it. They might check the patient’s records, ask other staff about the admission, ask the patient questions, check vital signs such as blood pressure, and order particular diagnostic tests. Doctors learn to pick up on subtle cues that can be telling of particular conditions, such as a patient’s speech patterns, body language, what they’ve brought with them to the hospital, and even their smell. In some cases, the absence (rather than presence) of certain cues is also important. At the same time, a doctor needs to discard irrelevant information, then put all the pieces together before they can treat the patient.

2: Orient

“Orientation isn’t just a state you’re in; it’s a process. You’re always orienting.” —John Boyd

The second stage of the OODA Loop, orient, is less intuitive than the other steps. However, it’s worth making the effort to understand it rather than skipping it. Boyd referred to it as the Schwerpunkt, meaning “the main emphasis” in German.

To orient yourself is to recognize any barriers that might interfere with the other parts of the OODA Loop.

Orientation means connecting yourself with reality and seeing the world as it really is, as free as possible from the influence of cognitive biases and shortcuts. You can give yourself an edge over the competition by making sure you always orient before making a decision, instead of just jumping in.

Boyd maintained that properly orienting yourself can be enough to overcome an initial disadvantage, such as fewer resources or less information, to outsmart an opponent. He identified the following four main barriers that impede our view of objective information:

  1. Our cultural traditions – we don’t realize how much of what we consider universal behavior is actually culturally prescribed
  2. Our genetic heritage – we all have certain constraints
  3. Our ability to analyze and synthesize – if we haven’t practiced and developed our thinking skills, we tend to fall back on old habits
  4. The influx of new information – it is hard to make sense of observations when the situation keeps changing

Prior to Charlie Munger’s popularization of the concept of building a toolbox of mental models, Boyd advocated a similar approach for pilots to help them better navigate the orient stage of the OODA Loop. He recommended a process of “deductive destruction”: paying attention to your own assumptions and biases, then finding fundamental mental models to replace them.

Similar to using a decision journal, deductive destruction ensures you always learn from past mistakes and don’t keep on repeating them. In one talk, Boyd employed a brilliant metaphor for developing a latticework of mental models. He compared it to building a snowmobile, a vehicle comprising elements of several different devices, such as the caterpillar treads of a tank, skis, the outboard motor of a boat, and the handlebars of a bike.

Individually, each of these items isn’t enough to move you around. But combined they create a functional vehicle. As Boyd put it:

A loser is someone (individual or group) who cannot build snowmobiles when facing uncertainty and unpredictable change; whereas a winner is someone (individual or group) who can build snowmobiles, and employ them in an appropriate fashion, when facing uncertainty and unpredictable change.

To orient yourself, you have to build a metaphorical snowmobile by combining practical concepts from different disciplines. (For more on mental models, we literally wrote the book on them.) Although Boyd is regarded as a military strategist, he didn’t confine himself to any particular discipline. His theories encompass ideas drawn from various disciplines, including mathematical logic, biology, psychology, thermodynamics, game theory, anthropology, and physics. Boyd described his approach as a “scheme of pulling things apart (analysis) and putting them back together (synthesis) in new combinations to find how apparently unrelated ideas and actions can be related to one another.”

3: Decide

There are no surprises here. The previous two steps provide the groundwork you need to make an informed decision. If there are multiple options at hand, you need to use your observation and orientation to select one.

Boyd cautioned against first-conclusion bias, explaining that we cannot keep making the same decision again and again. This part of the loop needs to be flexible and open to Bayesian updating. In some of his notes, Boyd described this step as the hypothesis stage. The implication is that we should test the decisions we make at this point in the loop, spotting their flaws and including any issues in future observation stages.
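To make “Bayesian updating” concrete, here is a small, hypothetical sketch of revising confidence in a chosen course of action as new evidence arrives. The function name and every probability below are invented purely for illustration; none of this comes from Boyd’s notes.

```python
# A small, illustrative example of Bayesian updating: revising confidence
# in a chosen plan as each new observation comes in.
# All probabilities are made up for illustration only.

def update(prior, likelihood_if_working, likelihood_if_not):
    """Bayes' rule: P(working | evidence) from P(evidence | working) and the prior."""
    numerator = likelihood_if_working * prior
    denominator = numerator + likelihood_if_not * (1 - prior)
    return numerator / denominator

confidence = 0.5  # start undecided about whether the chosen plan is working
# Each tuple: how probable this observation is if the plan is working vs. not.
observations = [(0.8, 0.3), (0.6, 0.5), (0.2, 0.7)]
for likelihood_working, likelihood_not in observations:
    confidence = update(confidence, likelihood_working, likelihood_not)
    print(f"updated confidence the plan is working: {confidence:.2f}")
```

Each pass through the loop plays the same role as a fresh observation stage: the decision stays open to revision instead of being treated as settled.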

4: Act

There’s a difference between making decisions and enacting decisions. Once you make up your mind, it’s time to take action.

By taking action, you test your decision out. The results will hopefully indicate whether it was a good one or not, providing information for when you cycle back to the first part of the OODA Loop and begin observing anew.
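For readers who like to see the structure spelled out, here is a minimal, hypothetical sketch of the loop in code, using a toy situation (closing a gap step by step). None of the names or details come from Boyd; they are placeholders for whatever observing, orienting, deciding, and acting mean in your own context.

```python
# A minimal, illustrative sketch of the OODA Loop as a feedback loop.
# The "situation" here is a toy example; every name is a placeholder.

def observe(situation):
    # 1. Observe: gather the raw information available right now.
    return {"raw": situation["state"]}

def orient(observations, lessons):
    # 2. Orient: filter the observations and place them in context.
    return {"picture": observations["raw"], "lessons_so_far": len(lessons)}

def decide(picture):
    # 3. Decide: choose one option to test (Boyd's "hypothesis" stage).
    return "adjust" if picture["picture"] < 10 else "hold"

def act(decision, situation):
    # 4. Act: carry the decision out, which changes the situation itself.
    if decision == "adjust":
        situation["state"] += 1
    return {"decision": decision, "result": situation["state"]}

situation = {"state": 0}
lessons = []
while situation["state"] < 10:      # the "Loop": repeat until the conflict ends
    observations = observe(situation)
    picture = orient(observations, lessons)
    decision = decide(picture)
    feedback = act(decision, situation)
    lessons.append(feedback)        # each cycle feeds information into the next
print(f"Resolved after {len(lessons)} cycles.")
```

The point of the sketch is the shape, not the contents: each pass produces feedback that becomes part of the next pass’s orientation.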

***

Why the OODA Loop works

“The ability to operate at a faster tempo or rhythm than an adversary enables one to fold the adversary back inside himself so that he can neither appreciate nor keep up with what is going on. He will become disoriented and confused.” —John Boyd

We’ve identified three key benefits of using the OODA Loop.

1: Deliberate speed

As we’ve established, fighter pilots have to make many decisions in fast succession. They don’t have time to list pros and cons or to consider every available avenue. Once the OODA Loop becomes part of their mental toolboxes, they should be able to cycle through it in a matter of seconds.

Speed is a crucial element of military decision-making. Using the OODA Loop in everyday life, we probably have a little more time than a fighter pilot would. But Boyd emphasized the value of being decisive, taking initiative, and staying autonomous. These are universal assets and apply to many situations.

2: Comfort with uncertainty

There’s no such thing as total certainty. If you’re making a decision at all, it’s because something is uncertain. But uncertainty does not always have to equate to risk.

A fighter pilot is in a precarious situation, one in which there will be gaps in their knowledge. They cannot read the mind of the opponent and might have incomplete information about the weather conditions and surrounding environment. They can, however, take into account key factors such as the opponent’s type of airplane and what their maneuvers reveal about their intentions and level of training. If the opponent uses an unexpected strategy, is equipped with a new type of weapon or airplane, or behaves in an irrational way, the pilot must accept the accompanying uncertainty. However, Boyd stressed that uncertainty is irrelevant if we have the right filters in place.

If we can’t cope with uncertainty, we end up stuck in the observation stage. This sometimes happens when we know we need to make a decision, but we’re scared of getting it wrong. So we keep on reading books and articles, asking people for advice, listening to podcasts, and so on.

Acting under uncertainty is unavoidable. If we do have the right filters, we can factor uncertainty into the observation stage. We can leave a margin of error. We can recognize the elements that are within our control and those that are not.

In presentations, Boyd referred to three key principles to support his ideas: Gödel’s theorems, Heisenberg’s Uncertainty Principle, and the Second Law of Thermodynamics. Of course, we’re using these principles in a different way from their initial purpose and in a simplified, non-literal form.

Gödel’s theorems indicate that any mental model we have of reality will omit certain information, and that Bayesian updating must be used to bring it back in line with reality. For fighter pilots, their understanding of what is going on during a battle will always have gaps. Identifying this fundamental uncertainty gives it less power over us.

The second concept Boyd referred to is Heisenberg’s Uncertainty Principle. In its simplest form, this principle describes the limit of the precision with which pairs of physical properties can be known simultaneously. We cannot know both the position and the velocity of a body with perfect precision at the same time: the more precisely we pin down one, the less precisely we can know the other.
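In symbols, the standard statement of the principle bounds the product of the uncertainty in position (Δx) and the uncertainty in momentum (Δp), with ħ the reduced Planck constant. The formula below is standard physics, not something taken from Boyd’s notes:

$$\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}$$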

Boyd moved the concept of the Uncertainty Principle from particles to planes. If a pilot focuses too hard on where an enemy plane is, they will lose track of where it is going and vice versa. Trying harder to track the two variables will actually lead to more inaccuracy!

Finally, Boyd made use of the Second Law of Thermodynamics. In a closed system, entropy always increases and everything moves towards chaos. Energy spreads out and becomes disorganized.

Although Boyd’s notes do not specify the exact applications, his inference appears to be that a fighter pilot must be an open system or they will fail. They must draw “energy” (information) from outside themselves or the situation will become chaotic. They should also aim to cut their opponent off, forcing them to become a closed system.

3: Unpredictability

When you act fast enough, other people view you as unpredictable. They can’t figure out the logic behind your decisions.

Boyd recommended making unpredictable changes in speed and direction, writing, “We should operate at a faster tempo than our adversaries or inside our adversaries[’] time scales.…Such activity will make us appear ambiguous (non predictable) [and] thereby generate confusion and disorder among our adversaries.” He even helped design planes that were better equipped to make those unpredictable changes.

For the same reason that you can’t run the same play seventy times in a football game, rigid military strategies often become useless after a few uses, or even one iteration, as opponents learn to recognize and counter them. The OODA Loop can be endlessly used because it is a formless strategy, unconnected to any particular maneuvers.

We know that Boyd was influenced by Sun Tzu (he owned seven thoroughly annotated copies of The Art of War) and drew many ideas from the ancient strategist. Sun Tzu depicts war as a game of deception where the best strategy is that which an opponent cannot preempt.

***

Forty Second Boyd

“Let your plans be dark and impenetrable as night, and when you move, fall like a thunderbolt.” —Sun Tzu

Boyd was no armchair strategist. He developed his ideas through extensive experience as a fighter pilot. His nickname “Forty Second Boyd” speaks to his expertise: he had a standing bet that he could defeat any opposing pilot in simulated air combat in under forty seconds.

In a tribute written after Boyd’s death, General C.C. Krulak described him as “a towering intellect who made unsurpassed contributions to the American art of war. Indeed, he was one of the central architects of the reform of military thought.…From John Boyd we learned about competitive decision-making on the battlefield—compressing time, using time as an ally.”

Reflecting Robert Greene’s maxim that everything is material, Boyd spent his career observing people and organizations. How do they adapt to changeable environments in conflicts, business, and other situations?

Over time, he deduced that these situations are characterized by uncertainty. Dogmatic, rigid theories are unsuitable for chaotic situations. Rather than trying to rise through the military ranks, Boyd focused on using his position as a colonel to compose a theory of the universal logic of war.

Boyd was known to ask his mentees the poignant question, “Do you want to be someone, or do you want to do something?” In his own life, he certainly focused on the latter path and, as a result, left us ideas with tangible value. The OODA Loop is just one of many.

Boyd developed the OODA Loop with fighter pilots in mind, but like all good mental models, it works in other fields beyond combat. It’s used in intelligence agencies. It’s used by lawyers, doctors, businesspeople, politicians, law enforcement, marketers, athletes, coaches, and more.

If you have to work fast, you might want to learn a thing or two from fighter pilots, for whom a split second of hesitation can be fatal. As anyone who has ever watched Top Gun knows, pilots have a lot of decisions and processes to juggle when they’re in dogfights (close-range aerial battles). Pilots move at high speed and need to avoid enemies while tracking them and maintaining contextual knowledge of objectives, terrain, fuel, and other key variables.

And as any pilot who has been in one will tell you, dogfights are nasty. No one wants them to last longer than necessary because every second increases the risk of something going wrong. Pilots have to rely on their decision-making skills—they can’t just follow a schedule or to-do list to know what to do.

***

Applying the OODA Loop

“We can’t just look at our own personal experiences or use the same mental recipes over and over again; we’ve got to look at other disciplines and activities and relate or connect them to what we know from our experiences and the strategic world we live in.” —John Boyd

In sports, there is an adage that carries over to business quite well: “Speed kills.” If you are able to be nimble, assess the ever-changing environment, and adapt quickly, you’ll always carry the advantage over any opponents.

Start applying the OODA Loop to your day-to-day decisions and watch what happens. You’ll start to notice things that you would have been oblivious to before. Before jumping to your first conclusion, you’ll pause to consider your biases, take in additional information, and be more thoughtful of consequences.

As with anything you practice, if you do it right, the more you do it, the better you’ll get. You’ll start making decisions closer to your full potential. You’ll see more rapid progress. And as John Boyd would prescribe, you’ll start to do something in your life, and not just be somebody.

***

We hope you’ve enjoyed our three-week exploration of perspectives on decision making. We think there is value in juxtaposing different ideas to help us learn. Stay tuned for more topic-specific series in the future.

Avoiding Bad Decisions

Sometimes success is just about avoiding failure.

At FS, we help people make better decisions without needing to rely on getting lucky. One aspect of decision-making that’s rarely talked about is how to avoid making bad decisions.

Here are five of the biggest reasons we make bad decisions.

***

1. We’re unintentionally stupid

We like to think that we can rationally process information like a computer, but we can’t. Cognitive biases explain why we make bad decisions but rarely help us avoid them in the first place. It’s better to focus on these warning signs that signal something is about to go wrong.

Warning signs you’re about to unintentionally do something stupid:

  • You’re tired, emotional, in a rush, or distracted.
  • You’re operating in a group or working with an authority figure.

The rule: Never make important decisions when you’re tired, emotional, distracted, or in a rush.

2. We solve the wrong problem

The first person to state the problem rarely has the best insight into the problem. Once a problem is thrown out on the table, however, our type-A problem-solving nature kicks in, and we forget to first ask whether we’re solving the right problem.

Warning signs you’re solving the wrong problem:

  • You let someone else define the problem for you.
  • You’re far away from the problem.
  • You’re thinking about the problem at only one level or through a narrow lens.

The rule: Never let anyone define the problem for you.

3. We use incorrect or insufficient information

We like to believe that people tell us the truth. We like to believe the people we talk to understand what they are talking about. We like to believe that we have all the information.

Warning signs you have incorrect or insufficient information:

  • You’re speaking to someone who spoke to someone who spoke to someone.
  • Someone will get in trouble when the truth comes out.
  • You’re reading about it in the news.

The rule: Seek out information from someone as close to the source as possible, because they’ve earned their knowledge and have an understanding that you don’t. When information is filtered (and it often is), first consider the incentives involved and then think of the proximity to earned knowledge.

4. We fail to learn

You know the person that sits beside you at work that has twenty years of experience but keeps making the same mistakes over and over? They don’t have twenty years of experience—they have one year of experience repeated twenty times. If you can’t learn, you can’t get better.

Most of us can observe and react accordingly. But to truly learn from our experiences, we must reflect on our reactions. Reflection has to be part of your process, not something you might do if you have time. Don’t use the excuse of being too busy or get too invested in protecting your ego. In short, we can’t learn from experience without reflection. Only reflection allows us to distill experience into something we can learn from to make better decisions in the future.

Warning signs you’re not learning:

  • You’re too busy to reflect.
  • You don’t keep track of your decisions.
  • You can’t calibrate your own decision-making.

The rule: Be less busy. Keep a learning journal. Reflect every day.

5. We focus on optics over outcomes

Our evolutionary programming conditions us to do what’s easy over what’s right. After all, it’s often easier to signal being virtuous than to actually be virtuous.

Warning signs you’re focused on optics:

  • You’re thinking about how you’ll defend your decision.
  • You’re knowingly choosing what’s defendable over what’s right.
  • You’d make a different decision if you owned the company.
  • You catch yourself saying this is what your boss would want.

The rule: Act as you would want an employee to act if you owned the company.

***

Avoiding bad decisions is just as important as making good ones. Knowing the warning signs and having a set of rules for your decision-making process limits the amount of luck you need to get good outcomes.

Your Thinking Rate Is Fixed

You can’t force yourself to think faster. If you try, you’re likely to end up making much worse decisions. Here’s how to improve the actual quality of your decisions instead of chasing hacks to speed them up.

If you’re a knowledge worker, as an ever-growing proportion of people are, the product of your job is decisions.

Much of what you do day to day consists of trying to make the right choices among competing options, meaning you have to process large amounts of information, discern what’s likely to be most effective for moving towards your desired goal, and try to anticipate potential problems further down the line. And all the while, you’re operating in an environment of uncertainty where anything could happen tomorrow.

When the product of your job is your decisions, you might find yourself wanting to be able to make more decisions more quickly so you can be more productive overall.

Chasing speed is a flawed approach. Because decisions—at least good ones—don’t come out of thin air. They’re supported by a lot of thinking.

While experience and education can grant you the pattern-matching abilities to make some kinds of decisions using intuition, you’re still going to run into decisions that require you to sit and consider the problem from multiple angles. You’re still going to need to schedule time to do nothing but think. Otherwise making more decisions will make you less productive overall, not more, because your decisions will suck.

Here’s a secret that might sound obvious but can actually transform the way you work: you can’t force yourself to think faster. Our brains just don’t work that way. The rate at which you make mental discernments is fixed.

Sure, you can develop your ability to do certain kinds of thinking faster over time. You can learn new methods for decision-making. You can develop your mental models. You can build your ability to focus. But if you’re trying to speed up your thinking so you can make an extra few decisions today, forget it.

***

Beyond the “hurry up” culture

Management consultant Tom DeMarco writes in Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency that many knowledge work organizations have a culture where the dominant message at all times is to hurry up.

Everyone is trying to work faster at all times, and they pressure everyone around them to work faster, too. No one wants to be perceived as a slacker. The result is that managers put pressure on their subordinates through a range of methods. DeMarco lists the following examples:

  • “Turning the screws on delivery dates (aggressive scheduling)
  • Loading on extra work
  • Encouraging overtime
  • Getting angry when disappointed
  • Noting one subordinate’s extraordinary effort and praising it in the presence of others
  • Being severe about anything other than superb performance
  • Expecting great things of all your workers
  • Railing against any apparent waste of time
  • Setting an example yourself (with the boss laboring so mightily there is certainly no time for anyone else to goof off)
  • Creating incentives to encourage desired behavior or results.”

All of these things increase pressure in the work environment and repeatedly reinforce the “hurry up!” message. They make managers feel like they’re moving things along faster. That way if work isn’t getting done, it’s not their fault. But, DeMarco writes, they don’t lead to meaningful changes in behavior that make the whole organization more productive. Speeding up often results in poor decisions that create future problems.

The reason more pressure doesn’t mean better productivity is that the rate at which we think is fixed.

We can’t force ourselves to start making faster decisions right now just because we’re faced with an unrealistic deadline. DeMarco writes, “Think rate is fixed. No matter what you do, no matter how hard you try, you can’t pick up the pace of thinking.”

If you’re doing a form of physical labor, you can move your body faster when under pressure. (Of course, if it’s too fast, you’ll get injured or won’t be able to sustain it for long.)

If you’re a knowledge worker, you can’t pick up the pace of mental discriminations just because you’re under pressure. Chances are good that you’re already going as fast as you can. Because guess what? You can’t voluntarily slow down your thinking, either.

***

The limits of pressure

Faced with added stress and unable to accelerate our brains instantaneously, we can do any of three things:

  • “Eliminate wasted time.
  • Defer tasks that are not on the critical path.
  • Stay late.”

While those might seem like positive things, they’re less advantageous than they appear at first glance. Their effects are marginal at best. The smarter and more qualified the knowledge worker, the less time they’re likely to be wasting anyway. Most people don’t enjoy wasting time. What you’re more likely to end up eliminating is valuable slack time for thinking.

Deferring non-critical tasks doesn’t save any time overall; it just pushes work forward—to the point where those tasks do become critical. Then something else gets deferred.

Staying late might work once in a while. Again, though, its effects are limited. If we keep doing it night after night, we run out of energy, our personal lives suffer, and we make worse decisions as a result.

None of the outcomes of increasing pressure result in more or better decisions. None of them speed up the rate at which people think. Even if an occasional, tactical increase in pressure (whether it comes from the outside or we choose to apply it to ourselves) can be effective, ongoing pressure increases are unsustainable in the long run.

***

Think rate is fixed

It’s incredibly important to truly understand the point DeMarco makes in this part of Slack: the rate at which we process information is fixed.

When you’re under pressure, the quality of your decisions plummets. You miss possible angles, you don’t think ahead, you do what makes sense now, you panic, and so on. Often, you make a snap judgment, then grasp for whatever information will justify it to the people you work with. You don’t have breathing room to stress-test your decisions.

The clearer you can think, the better your decisions will be. Trying to think faster can only cloud your judgment. It doesn’t matter how many decisions you make if they’re not good ones. As DeMarco reiterates throughout the book, you can be efficient without being effective.

Try making a list of the worst decisions you’ve made so far in your career. There’s a good chance most of them were made under intense pressure or without taking much time over them.

At Farnam Street, we write a lot about how to make better decisions, and we share a lot of tools for better thinking. We made a whole course on decision-making. But none of these resources are meant to immediately accelerate your thinking. Many of them require you to actually slow down a whole lot and spend more time on your decisions. They improve the rate at which you can do certain kinds of thinking, but it’s not going to be an overnight process.

***

Upgrading your brain

Some people read one of our articles or books about mental models and complain that it’s not an effective approach because it didn’t lead to an immediate improvement in their thinking. That’s unsurprising; our brains don’t work like that. Integrating new, better approaches takes a ton of time and repetition, just like developing any other skill. You have to keep on reflecting and making course corrections.

At the end of the day, your brain is going to go where it wants to go. You’re going to think the way you think. However much you build awareness of how the world works and learn how to reorient, you’re still, to use Jonathan Haidt’s metaphor from The Righteous Mind, a tiny rider atop a gigantic elephant. None of us can reshape how we think overnight.

Making good decisions is hard work. There’s a limit to how many decisions you can make in a day before you need a break. On top of that, many knowledge workers are in fields where the most relevant information has a short half-life. Making good decisions requires constant learning and verifying what you think you know.

If you want to make better decisions, you need to do everything you can to reduce the pressure you’re under. You need to let your brain take whatever time it needs to think through the problem at hand. You need to get out of a reactive mode, recognize when you need to pause, and spend more time looking at problems.

A good metaphor is installing an update to the operating system on your laptop. Would you rather install an update that fixes bugs and improves existing processes, or one that just makes everything run faster? Obviously, you’d prefer the former. The latter would just lead to more crashes. The same is true for updating your mental operating system.

Stop trying to think faster. Start trying to think better.

Solve Problems Before They Happen by Developing an “Inner Sense of Captaincy”

Too often we reward people who solve problems while ignoring those who prevent them in the first place. This incentivizes creating problems. According to poet David Whyte, the key to taking initiative and being proactive is viewing yourself as the captain of your own “voyage of work.”

If we want to get away from glorifying those who run around putting out fires, we need to cultivate an organizational culture that empowers everyone to act responsibly at the first sign of smoke.

How do we make that shift?

We can start by looking at ourselves and how we consider the voyage that is our work. When do we feel fulfillment? Is it when we swoop in to save the day and everyone congratulates us? It’s worth asking why, if we think something is worth saving, we don’t put more effort into protecting it ahead of time.

In Crossing the Unknown Sea, poet David Whyte suggests that we should view our work as a lifelong journey. In particular, he frames it as a sea voyage in which the greatest rewards lie in what we learn through the process, as opposed to the destination.

Like a long sea voyage, the nature of our work is always changing. There are stormy days and sunny ones. There are days involving highs of delight and lows of disaster. All of this happens against the backdrop of events in our personal lives and the wider world with varying levels of influence.

On a voyage, you need to look after your boat. There isn’t always time to solve problems after they happen. You need to learn how to preempt them or risk a much rougher journey—or even the end of it.

Whyte refers to the practice of taking control of your voyage as “developing an inner sense of captaincy,” offering a metaphor we can all apply to our work. Developing an inner sense of captaincy is good for both us and the organizations we work in. We end up with more agency over our own lives, and our organizations waste fewer resources. Whyte’s story of how he learned this lesson highlights why that’s the case.

***

A moment of reckoning

“Any life, and any life’s work, is a hidden journey, a secret code, deciphered in fits and starts. The details only given truth by the whole, and the whole dependent on the detail.” —David Whyte

Shortly after graduating, Whyte landed a dream job working as a naturalist guide on board a ship in the Galapagos Islands. One morning, he awoke and could tell at once that the vessel had drifted from its anchorage during the night. Whyte leaped up to find the captain fast asleep and the boat close to crashing into a cliff. Taking control of it just in time, he managed to steer himself and the other passengers back to safety—right as the captain awoke. Though they were safe, he was profoundly shaken both by the near miss and the realization that their leader had failed.

At first, Whyte’s reaction to the episode was to feel a smug contempt for the captain who had “slept through not only the anchor dragging but our long, long, nighttime drift.” The captain had failed to predict the problem or notice when it started. If Whyte hadn’t awakened, everyone on the ship could have died.

But something soon changed in his perspective. Whyte knew the captain was new and far less familiar with that particular boat than himself and the other crew member. Every boat has its quirks, and experience counts for more than seniority when it comes to knowing them. He’d also felt sure the night before that they needed to put down a second anchor and knew they “should have dropped another anchor without consultation, as crews are wont to do when they do not want to argue with their captain. We should have woken too.” He writes that “this moment of reckoning under the lava cliff speaks to the many dangerous arrivals in a life of work and to the way we must continually forge our identities through our endeavors.”

Whyte’s experience contains lessons with wide applicability for those of us on dry land. The idea of having an inner sense of captaincy means understanding the overarching goals of your work and being willing to make decisions that support them, even if something isn’t strictly your job or you might not get rewarded for it, or sometimes even if you don’t have permission.

When you play the long game, you’re thinking of the whole voyage, not whether you’ll get a pat on the back today.

***

Skin in the game

It’s all too easy to buy into the view that leaders have full responsibility for everything that happens, especially disasters. Sometimes in our work, when we’re not in a leadership position, we see a potential problem or an unnoticed existing one but choose not to take action. Instead, we stick to doing whatever we’ve been told to do because that feels safer. If it’s important, surely the person in charge will deal with it. If not, that’s their problem. Anyway, there’s already more than enough to do.

Leaders give us a convenient scapegoat when things go wrong. However, when we assume all responsibility lies with them, we don’t learn from our mistakes. We don’t have “our own personal compass, a direction, a willingness to meet life unmediated by any cushioning parental presence.”

At some point, things do become our problem. No leader can do everything and see everything. The more you rise within an organization, the more you need to take initiative. If a leader can’t rely on their subordinates to take action when they see a potential problem, everything will collapse.

When we’ve been repeatedly denied agency by poor leadership and seen our efforts fall flat, we may sense we lack control. Taking action no longer feels natural. However, if we view our work as a voyage that helps us change and grow, it’s obvious why we need to overcome learned helplessness. We can’t abdicate all responsibility and blame other people for what we chose to ignore in the first place (as Whyte puts it, “The captain was there in all his inherited and burdened glory and thus convenient for the blame”). By understanding how our work helps us change and grow, we develop skin in the game.

On a ship, everyone is in it together. If something goes wrong, they’re all at risk. And it may not be easy or even possible to patch up a serious problem in the middle of the sea. As a result, everyone needs to pay attention and act on anything that seems amiss. Everyone needs to take responsibility for what happens, as Whyte goes on to detail:

“No matter that the inherited world of the sea told us that the captain is the be-all and end-all of all responsibility, we had all contributed to the lapse, the inexcusable lapse. The edge is no place for apportioning blame. If we had merely touched that cliff, we would have been for the briny deep, crew and passengers alike. The undertow and the huge waves lacerating against that undercut, barnacle-encrusted fortress would have killed us all.”

Having an inner sense of captaincy means viewing ourselves as the ones in charge of our voyage of work. It means not acting as if there are certain areas where we are incapacitated, or ignoring potential problems, just because someone else has a particular title.

***

Space and support to create success

Developing an inner sense of captaincy is not about compensating for an incompetent leader—nor does it mean thinking we always know best. The better someone is at leading people, the more they create the conditions for their team to take initiative and be proactive about preventing problems. They show by example that captaincy is a state you inhabit, not a particular role. A stronger leader can mean a more independent team.

Strong leaders instill autonomy by teaching and supervising processes with the intention of eventually not needing to oversee them. Captaincy is a way of being. It is embodied in the role of captain, but it is available to everyone. For a crew to develop it, the captain needs to step back a little and encourage them to take responsibility for outcomes. They can test themselves bit by bit, building up confidence. When people feel like it’s their responsibility to contribute to overall success, not just perform specific tasks, they can respond to the unexpected without waiting for instructions. They become ever more familiar with what their organization needs to stay healthy and use second-order thinking so potential problems are more noticeable before they happen.

Whyte realized that the near-disaster had a lot to do with their previous captain, Raphael. He was too good at his job, being “preternaturally alert and omnipresent, appearing on deck at the least sign of trouble.” The crew felt comfortable, knowing they could always rely on Raphael to handle any problems. Although this worked well at the time, once he left and they were no longer in such safe hands, they were unused to taking initiative. Whyte explains:

Raphael had so filled his role of captain to capacity that we ourselves had become incapacitated in one crucial area: we had given up our own inner sense of captaincy. Somewhere inside of us, we had come to the decision that ultimate responsibility lay elsewhere.

Being a good leader isn’t about making sure your team doesn’t experience failure. Rather, it’s giving everyone the space and support to create success.

***

The voyage of work

Having an inner sense of captaincy means caring about outcomes, not credit or blame. As Whyte realized, had he dropped a second anchor the night before the near miss, he would have been doing something that, ideally, no one other than the crew, or even just himself, would have known about. The captain and passengers would have enjoyed an untroubled night and woken none the wiser.

If we prioritize getting good outcomes, our focus shifts from solving existing problems to preventing problems from happening in the first place. We put down a second anchor so the boat doesn’t drift, rather than steering it to safety when it’s about to crash. After all, we’re on the boat too.

Another good comparison is picking up litter. The less connected to and responsible for a place we feel, the less likely we might be to pick up trash lying on the ground. In our homes, we’re almost certain to pick it up. If we’re walking along our street or in our neighborhood, it’s a little less likely. In a movie theater or bar when we know it’s someone’s job to pick up trash, we’re less likely to bother. What’s the equivalent to leaving trash on the ground in your job?

Most organizations don’t incentivize prevention because it’s invisible. Who knows what would have happened? How do you measure something that doesn’t exist? After all, problem preventers seem relaxed. They often go home on time. They take lots of time to think. We don’t know how well they would deal with conflict, because they never seem to experience any. The invisibility of the work they do to prevent problems in the first place makes it seem like their job isn’t challenging.

When we promote problem solvers, we incentivize having problems. We fail to unite everyone towards a clear goal. Because most organizations reward problem solvers, it can seem like a better idea to let things go wrong, then fix them after. That’s how you get visibility. You run from one high-level meeting to the next, reacting to one problem after another.

It’s great to have people who can solve those problems, but it is better not to have the problems in the first place. Solving problems generally requires more resources than preventing them, not to mention the toll it takes on our stress levels. As the saying goes, an ounce of prevention is worth a pound of cure.

An inner sense of captaincy on our voyage of work is good for us and for our organizations. It changes how we think about preventing problems. It becomes a part of an overall voyage, an opportunity to build courage and face fears. We become more fully ourselves and more in touch with our nature. Whyte writes that “having the powerful characteristics of captaincy or leadership of any form is almost always an outward sign of a person inhabiting their physical body and the deeper elements of their own nature.”

12 Life Lessons From Mathematician and Philosopher Gian-Carlo Rota

The mathematician and philosopher Gian-Carlo Rota spent much of his career at MIT, where students adored him for his engaging, passionate lectures. In 1996, Rota gave a talk entitled “Ten Lessons I Wish I Had Been Taught,” which contains valuable advice for making people pay attention to your ideas.

Many mathematicians regard Rota as single-handedly responsible for turning combinatorics into a significant field of study. He specialized in functional analysis, probability theory, phenomenology, and combinatorics. His 1996 talk, “Ten Lessons I Wish I Had Been Taught,” was later printed in his book, Indiscrete Thoughts.

Rota began by explaining that the advice we give others is always the advice we need to follow most. Seeing as it was too late for him to follow certain lessons, he decided he would share them with the audience. Here, we summarize twelve insights from Rota’s talk—which are fascinating and practical, even if you’re not a mathematician.

***

Every lecture should make only one point

“Every lecture should state one main point and repeat it over and over, like a theme with variations. An audience is like a herd of cows, moving slowly in the direction they are being driven towards.”

When we wish to communicate with people—in an article, an email to a coworker, a presentation, a text to a partner, and so on—it’s often best to stick to making one point at a time. This matters all the more if we’re trying to get our ideas across to a large audience.

If we make one point well enough, we can be optimistic about people understanding and remembering it. But if we try to fit too much in, “the cows will scatter all over the field. The audience will lose interest and everyone will go back to the thoughts they interrupted in order to come to our lecture.”

***

Never run over time

“After fifty minutes (one microcentury as von Neumann used to say), everybody’s attention will turn elsewhere even if we are trying to prove the Riemann hypothesis. One minute over time can destroy the best of lectures.”

Rota considered running over the allotted time slot to be the worst thing a lecturer could do. Our attention spans are finite. After a certain point, we stop taking in new information.

In your work, it’s important to respect the time and attention of others. Put in the extra work required for brevity and clarity. Don’t expect them to find what you have to say as interesting as you do. Condensing and compressing your ideas both ensures you truly understand them and makes them easier for others to remember.

***

Relate to your audience

“As you enter the lecture hall, try to spot someone in the audience whose work you have some familiarity with. Quickly rearrange your presentation so as to manage to mention some of that person’s work.”

Reciprocity is remarkably persuasive. Sometimes, how people respond to your work has as much to do with how you respond to theirs as it does with the work itself. If you want people to pay attention to your work, always give before you take and pay attention to theirs first. Show that you see them and appreciate them. Rota explains that “everyone in the audience has come to listen to your lecture with the secret hope of hearing their work mentioned.”

The less acknowledgment someone’s work has received, the more of an impact your attention is likely to have. A small act of encouragement can be enough to deter someone from quitting. With characteristic humor, Rota recounts:

“I have always felt miffed after reading a paper in which I felt I was not being given proper credit, and it is safe to conjecture that the same happens to everyone else. One day I tried an experiment. After writing a rather long paper, I began to draft a thorough bibliography. On the spur of the moment I decided to cite a few papers which had nothing whatsoever to do with the content of my paper to see what might happen.

Somewhat to my surprise, I received letters from two of the authors whose papers I believed were irrelevant to my article. Both letters were written in an emotionally charged tone. Each of the authors warmly congratulated me for being the first to acknowledge their contribution to the field.”

***

Give people something to take home

“I often meet, in airports, in the street, and occasionally in embarrassing situations, MIT alumni who have taken one or more courses from me. Most of the time they admit that they have forgotten the subject of the course and all the mathematics I thought I had taught them. However, they will gladly recall some joke, some anecdote, some quirk, some side remark, or some mistake I made.”

When we have a conversation, read a book, or listen to a talk, the sad fact is that we are unlikely to remember much of it even a few hours later, let alone years after the event. Even if we enjoyed and valued it, only a small part will stick in our memory.

So when you’re communicating with people, try to be conscious about giving them something to take home. Choose a memorable line or idea, create a visual image, or use humor in your work.

For example, in The Righteous Mind, Jonathan Haidt repeats many times that the mind is like a tiny rider on a gigantic elephant. The rider represents controlled mental processes, while the elephant represents automatic ones. It’s a distinctive image, one readers are quite likely to take home with them.

***

Make sure the blackboard is spotless

“By starting with a spotless blackboard, you will subtly convey the impression that the lecture they are about to hear is equally spotless.”

Presentation matters. The way our work looks influences how people perceive it. Taking the time to clean our equivalent of a blackboard signals that we care about what we’re doing and consider it important.

In “How To Spot Bad Science,” we noted that one possible sign of bad science is that the research is presented in a thoughtless, messy way. Most researchers who take their work seriously will put in the extra effort to ensure it’s well presented.

***

Make it easy for people to take notes

“What we write on the blackboard should correspond to what we want an attentive listener to take down in his notebook. It is preferable to write slowly and in a large handwriting, with no abbreviations. Those members of the audience who are taking notes are doing us a favor, and it is up to us to help them with their copying.”

If a lecturer is using slides with writing on them instead of a blackboard, Rota adds that they should give people time to take notes. This might mean repeating themselves in a few different ways so each slide takes longer to explain (which ties in with the idea that every lecture should make only one point). Moving too fast with the expectation that people will look at the slides again later is “wishful thinking.”

When we present our work to people, we should make it simple for them to understand our ideas on the spot. We shouldn’t expect them to revisit it later. They might forget. And even if they don’t, we won’t be there to answer questions, take feedback, and clear up any misunderstandings.

***

Share the same work multiple times

Rota learned this lesson when he bought Collected Papers, a volume compiling the publications of mathematician Frederic Riesz. He noted that “the editors had gone out of their way to publish every little scrap Riesz had ever published.” Putting them all in one place revealed that he had published the same ideas multiple times:

Riesz would publish the first rough version of an idea in some obscure Hungarian journal. A few years later, he would send a series of notes to the French Academy’s Comptes Rendus in which the same material was further elaborated. A few more years would pass, and he would publish the definitive paper, either in French or in English.

Riesz would also develop his ideas while lecturing. Explaining the same subject again and again for years allowed him to keep improving it until he was ready to publish. Rota notes, “No wonder the final version was perfect.”

In our work, we might feel as if we need to have fresh ideas all of the time and that anything we share with others needs to be a finished product. But sometimes we can do our best work through an iterative process.

For example, a writer might start by sharing an idea as a tweet. This gets a good response, and the replies help them expand it into a blog post. From there they keep reworking the post over several years, making it longer and more definitive each time. They give a talk on the topic. Eventually, it becomes a book.

Award-winning comedian Chris Rock prepares for global tours by performing dozens of times in small venues for a handful of people. Each performance is an experiment to see which jokes land, which ones don’t, and which need tweaking. By the time he’s performed a routine forty or fifty times, making it better and better, he’s ready to share it with huge audiences.

Another reason to share the same work multiple times is that different people will see it each time and understand it in different ways:

“The mathematical community is split into small groups, each one with its own customs, notation, and terminology. It may soon be indispensable to present the same result in several versions, each one accessible to a specific group; the price one might have to pay otherwise is to have our work rediscovered by someone who uses a different language and notation, and who will rightly claim it as his own.”

Sharing your work multiple times thus has two benefits. The first is that the feedback allows you to improve and refine your work. The second is that you increase the chance of your work being definitively associated with you. If the core ideas are strong enough, they’ll shine through even in the initial incomplete versions.

***

You are more likely to be remembered for your expository work

“Allow me to digress with a personal reminiscence. I sometimes publish in a branch of philosophy called phenomenology. . . . It so happens that the fundamental treatises of phenomenology are written in thick, heavy philosophical German. Tradition demands that no examples ever be given of what one is talking about. One day I decided, not without serious misgivings, to publish a paper that was essentially an updating of some paragraphs from a book by Edmund Husserl, with a few examples added. While I was waiting for the worst at the next meeting of the Society for Phenomenology and Existential Philosophy, a prominent phenomenologist rushed towards me with a smile on his face. He was full of praise for my paper, and he strongly encouraged me to further develop the novel and original ideas presented in it.”

Rota realized that many of the mathematicians he admired the most were known more for their work explaining and building upon existing knowledge, as opposed to their entirely original work. Their extensive knowledge of their domain meant they could expand a little beyond their core specialization and synthesize charted territory.

For example, David Hilbert was best known for a textbook on integral equations which was “in large part expository, leaning on the work of Hellinger and several other mathematicians whose names are now forgotten.” William Feller was known for an influential treatise on probability, with few recalling his original work in convex geometry.

One of our core goals at Farnam Street is to share the best of what other people have already figured out. We all want to make original and creative contributions to the world. But the best ideas that are already out there are quite often much more useful than what we can contribute from scratch.

We should never be afraid to stand on the shoulders of giants.

***

Every mathematician has only a few tricks

“. . . mathematicians, even the very best, also rely on a few tricks which they use over and over.”

Upon reading the complete works of certain influential mathematicians, such as David Hilbert, Rota realized that they always used the same tricks again and again.

We don’t need to be amazing at everything to do high-quality work. The smartest and most successful people are often only good at a few things—or even one thing. Their secret is that they maximize those strengths and don’t get distracted. They define their circle of competence and don’t attempt things they’re not good at if there’s any room to double down further on what’s already going well.

It might seem as if this lesson contradicts the previous one (you are more likely to be remembered for your expository work), but there’s a key difference. If you’ve hit diminishing returns with improvements to what’s already inside your circle of competence, it makes sense to experiment with things you already have an aptitude for (or a strong suspicion you might) but haven’t yet made your focus.

***

Don’t worry about small mistakes

“Once more let me begin with Hilbert. When the Germans were planning to publish Hilbert’s collected papers and to present him with a set on the occasion of one of his later birthdays, they realized that they could not publish the papers in their original versions because they were full of errors, some of them quite serious. Thereupon they hired a young unemployed mathematician, Olga Taussky-Todd, to go over Hilbert’s papers and correct all mistakes. Olga labored for three years; it turned out that all mistakes could be corrected without any major changes in the statement of the theorems. . . . At last, on Hilbert’s birthday, a freshly printed set of Hilbert’s collected papers was presented to the Geheimrat. Hilbert leafed through them carefully and did not notice anything.”

Rota goes on to say: “There are two kinds of mistakes. There are fatal mistakes that destroy a theory; but there are also contingent ones, which are useful in testing the stability of a theory.”

Mistakes are either contingent or fatal. Contingent mistakes don’t completely ruin what you’re working on; fatal ones do. Building in a margin of safety (such as having a bit more time or funding than you expect to need) turns many fatal mistakes into contingent ones.

Contingent mistakes can even be useful. When details change, but the underlying theory is still sound, you know which details not to sweat.

***

Use Feynman’s method for solving problems

“Richard Feynman was fond of giving the following advice on how to be a genius. You have to keep a dozen of your favorite problems constantly present in your mind, although by and large they will lay in a dormant state. Every time you hear or read a new trick or a new result, test it against each of your twelve problems to see whether it helps. Every once in a while there will be a hit, and people will say: ‘How did he do it? He must be a genius!’”

***

Write informative introductions

“Nowadays, reading a mathematics paper from top to bottom is a rare event. If we wish our paper to be read, we had better provide our prospective readers with strong motivation to do so. A lengthy introduction, summarizing the history of the subject, giving everybody his due, and perhaps enticingly outlining the content of the paper in a discursive manner, will go some of the way towards getting us a couple of readers.”

As with the lesson about never running over time, respect that people have limited time and attention. Introductions are all about explaining what a piece of work is going to be about, what its purpose is, and why someone should be interested in it.

A job posting is an introduction to a company. The description on a calendar invite to a meeting is an introduction to that meeting. An about page is an introduction to an author. The subject line on a cold email is an introduction to that message. A course curriculum is an introduction to a class.

Putting extra effort into our introductions will help other people make an accurate assessment of whether they want to engage with the full thing. It will prime their minds for what to expect and answer some of their questions.

***

If you’re interested in learning more, check out Rota’s “10 Lessons of an MIT Education.”

The Best-Case Outcomes Are Statistical Outliers

There’s nothing wrong with hoping for the best. But the best-case scenario is rarely the one that comes to pass. Being realistic about what is likely to happen positions you for a range of possible outcomes and gives you peace of mind.

We dream about achieving the best-case outcomes, but they are rare. We can’t forget to acknowledge all the other possibilities of what may happen if we want to position ourselves for success.

“Hoping for the best, prepared for the worst, and unsurprised by anything in between.” —Maya Angelou

It’s okay to hope for the best—to look at whatever situation you’re in and say, “This time I have it figured out. This time it’s going to work.” First, having some degree of optimism is necessary for trying anything new. If we weren’t overconfident, we’d never have the guts to do something as risky and unlikely to succeed as starting a business, entering a new relationship, or sending that cold email. Anticipating that a new venture will work helps you overcome obstacles and make it work.

Second, sometimes we do have it figured out. Sometimes our solutions do make things better.

Even when the best-case scenario comes to pass, however, it rarely unfolds exactly as planned. Some choices create unanticipated consequences that we have to deal with. We may encounter unexpected roadblocks due to a lack of information. Or the full implementation of all our ideas and aspirations might take a lot longer than we planned for.

When we look back over history, we rarely find best-case outcomes.

Sure, sometimes they happen—maybe more than we think, given not every moment of the past is recorded. But let’s be honest: even historical wins, like developing the polio vaccine and figuring out how to produce clean drinking water, were not all smooth sailing. There are still people who are unable or unwilling to get the polio vaccine. And there are still many people in the world, even in developed countries like Canada, who don’t have access to clean drinking water.

The best-case outcomes in these situations—a world without polio and a world with globally available clean drinking water—have not happened, despite the existence of reliable, proven technology that can make these outcomes a reality.

There are a lot of reasons why, in these situations, we haven’t achieved the best-case outcomes. Furthermore, situations like these are not unusual. We rarely achieve the dream. The more complicated a situation, the more people it involves, and the more variables and dependencies that exist, the less likely it is that everything will work out.

If we narrow our scope and say, for example, the best-case scenario for this Friday night is that we don’t burn the pizza, we can all agree on a movie, and the power doesn’t go out, it’s more likely we’ll achieve it. There are fewer variables, so there’s a greater chance that this specific scenario will come to pass.
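To see why adding variables shrinks the odds so quickly, here is a toy calculation in code. The 90 percent success rate and the variable counts are assumptions chosen purely for illustration, not data from any real project.

```python
# A toy illustration of why best-case outcomes become outliers as complexity grows.
# Assume each independent thing that must go right does so 90% of the time
# (a made-up number, used only to show the compounding effect).

p_single = 0.9

for things_that_must_go_right in (3, 10, 30):
    p_best_case = p_single ** things_that_must_go_right
    print(f"{things_that_must_go_right} variables: "
          f"best case happens about {p_best_case:.0%} of the time")

# 3 variables:  ~73%  (a quiet Friday night is a reasonable bet)
# 10 variables: ~35%
# 30 variables: ~4%   (a complicated project rarely goes exactly to plan)
```

Even with generous odds on each individual piece, the scenario in which everything goes right quickly becomes the exception rather than the rule.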

The problem is that most of us plan as if we live in an easy-to-anticipate Friday night kind of world. We don’t.

There are no magic bullets for the complicated challenges facing society. There is only hard work, planning for the wide spectrum of human behavior, adjusting to changing conditions, and perseverance. There are many possible outcomes for any given endeavor and only one that we consider the best case.

That is why the best-case outcomes are statistical outliers—they are only one possibility in a sea of many. They might come to pass, but you’re much better off preparing for the likelihood that they won’t.

Our expectations matter. Anticipating a range of outcomes can make us feel better. If we expect the best and it happens, we’re merely satisfied. If we expect less and something better happens, we’re delighted.

Knowing that the future is probably not going to be all sunshine and roses allows you to prepare for a variety of more likely outcomes, including some of the bad ones. Sometimes, too, when the worst-case scenario happens, it’s actually a huge relief. We realize it’s not all bad, we didn’t die, and we can manage if it happens again. Preparation and knowing you can handle a wide spectrum of possible challenges is how you get the peace of mind to be unsurprised by anything in between the worst and the best.