Category: Decision Making

The Precautionary Principle: Better Safe than Sorry?

Also known as the Precautionary Approach or Precautionary Action, the Precautionary Principle is a concept best summed up by the proverb “better safe than sorry” or the medical maxim to “first do no harm.”

While there is no single definition, it typically refers to acting to prevent harm by not doing anything that could have negative consequences, even if the possibility of those consequences is uncertain.

In this article, we will explore how the Precautionary Principle works, its strengths and drawbacks, the best way to use it, and how we can apply it in our own lives.

Guilty until proven innocent

Whenever we make even the smallest change within a complex system, we risk dramatic unintended consequences.

The interconnections and dependencies within systems make it almost impossible to predict outcomes—and seeing as they often require a reasonably precise set of conditions to function, our interventions can wreak havoc.

The Precautionary Principle reflects the reality of working with and within complex systems. It shifts the burden of proof from proving something was dangerous after the fact to proving it is safe before taking chances. It emphasizes waiting for more complete information before risking causing damage, especially if some of the possible impacts would be irreversible, hard to contain, or would affect people who didn’t choose to be involved.

The possibility of harm does not need to be specific to that particular circumstance; sometimes we can judge a category of actions as one that always requires precaution because we know it has a high risk of unintended consequences.

For example, invasive species (plants or animals that cause harm after being introduced into a new environment by humans) have repeatedly caused native species to become extinct. So it’s reasonable to exercise precaution and not introduce living things into new places without strong evidence they will be harmless.

Preventing risks and protecting resources

Best known for its use as a regulatory guideline in environmental law and public health, the Precautionary Principle originated in German environmental policy as the “Vorsorgeprinzip,” a term first applied to regulations for preventing air pollution. Konrad von Moltke, director of the Institute for European Environmental Policy, later translated it into English.

Seeing as the natural world is a highly complex system we have repeatedly disrupted in serious, permanent ways, the Precautionary Principle has become a guiding part of environmental policy in many countries.

For example, the Umweltbundesamt (German Environmental Protection Agency) explains that the Precautionary Principle has two core components in German environmental law today: preventing risks and protecting resources.

Preventing risks means legislators shouldn’t take actions where our knowledge of the potential for environmental damage is incomplete or uncertain but there is cause for concern. The burden of proof is on proving lack of harm, not on proving harm. Protecting resources means preserving things like water and soil in a form future generations can use.

To give another example, some countries invoke versions of the Precautionary Principle to justify bans on genetically modified foods—in some cases permanently, in others until evidence of their safety is considered stronger. It is left to legislators to interpret and apply the Precautionary Principle within specific situations.

The flexibility of the Precautionary Principle is both a source of strength and a source of weakness. We live in a fast-moving world where regulation does not always keep up with innovation, meaning guidelines (as opposed to rules) can often prove useful.

Another reason the Precautionary Principle can be a practical addition to legislation is that science doesn’t necessarily move fast enough to protect us from potential risks, especially ones that shift harm elsewhere or take a long time to show up. For example, thousands of human-made substances are present in the food we eat, ranging from medications given to livestock to materials used in packaging. Proving that a new additive has health risks once it’s in the food supply could take decades because it’s incredibly difficult to isolate causative factors. So some regulators, including the Food and Drug Administration in America, require manufacturers to prove something is safe before it goes to market. This approach isn’t perfect, but it’s far safer than waiting to discover harm after we start eating something.

The Precautionary Principle forces us to ask a lot of difficult questions about the nature of risk, uncertainty, probability, the role of government, and ethics. It can also prompt us to question our intuitions surrounding the right decisions to make in certain situations.

When and how to use the Precautionary Principle

When handling risks, it is important to be aware of what we don’t or can’t know for sure. The Precautionary Principle is not intended to be a stifling justification for banning things—it’s a tool for handling particular kinds of uncertainty. Heuristics can guide us in making important decisions, but we still need to be flexible and treat each case as unique.

So how should we use the Precautionary Principle? Sven Ove Hansson suggests two requirements in How Extreme Is the Precautionary Principle? First, if there are competing priorities (beyond avoidance of harm), it should be combined with other decision-making principles. For example, the idea of “explore versus exploit” teaches us that we need to balance doubling down on existing options with trying out new ones. Second, the decision to take precautionary action should be based on the most up-to-date science, and there should be plans in place for how to update that decision if the science changes. That includes planning how often to reevaluate the evidence and how to assess its quality.
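The explore-versus-exploit balance has a standard formalization in computer science: the epsilon-greedy strategy from multi-armed bandit problems. This is a minimal sketch, not Hansson’s method; the reward numbers and the 10% exploration rate are illustrative assumptions:

```python
import random

def epsilon_greedy(estimated_rewards, epsilon=0.1):
    """With probability epsilon, explore a random option;
    otherwise exploit the best-known option so far."""
    if random.random() < epsilon:
        return random.randrange(len(estimated_rewards))
    return max(range(len(estimated_rewards)), key=lambda i: estimated_rewards[i])

# Illustrative: average observed payoff of three options so far
rewards = [0.4, 0.7, 0.5]

# Mostly picks option 1 (the best known), occasionally tries the others
choice = epsilon_greedy(rewards, epsilon=0.1)
```

The single `epsilon` parameter makes the tradeoff explicit: set it to zero and you never learn anything new; set it to one and you never benefit from what you know.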

When is it a good idea to use the Precautionary Principle? There are a few types of situations where it’s better to be safe than sorry if things are uncertain.

When the costs of waiting are low. As we’ve already seen, the Precautionary Principle is intended as a tool for handling uncertainty, rather than a justification for arbitrary bans. This means that if the safety of something is uncertain but the costs of waiting to learn more are low, it’s a good idea to use precaution.

When preserving optionality is a priority. The Precautionary Principle is most often invoked for potential risks that would cause irreversible, far-reaching, uncontainable harm. Seeing as we don’t know what the future holds, keeping our options open by avoiding limiting choices gives us the most flexibility later on. The Precautionary Principle preserves optionality by ensuring we don’t restrict the resources we have available further down the line or leave messes for our future selves to clean up.

When the potential costs of a risk are far greater than the cost of preventative action. If a potential risk would be devastating or even ruinous, and it’s possible to protect against it, precautionary action is key. Sometimes winning is just staying in the game—and sometimes staying in the game boils down to not letting anything wipe you out.

For example, in 1963 the Swiss government pledged to provide bunker spaces to all citizens in the event of a nuclear attack or disaster. The country still maintains a national system of thousands of warning sirens and distributes potassium iodide tablets (used to reduce the effects of radiation) to people living near nuclear plants in case of an accident. Given the potential effects of an incident on Switzerland (regardless of how likely it is), these precautionary actions are considered worthwhile.

When alternatives are available. If there are alternative courses of action we know to be safe, it’s a good idea to wait for more information before adopting a new risky one.

When not to use the Precautionary Principle

As a third criterion for using the Precautionary Principle usefully, Sven Ove Hansson recommends it not be used when the likelihood or scale of a potential risk is too low for precautionary action to have any benefit. For example, if one person per year dies from an allergic reaction to a guinea pig bite, it’s probably not worth banning pet guinea pigs. We can add a few more examples of situations where it’s generally not a good idea to use the Precautionary Principle.

When the tradeoffs are substantial and known. The whole point of the Precautionary Principle is to avoid harm. If we know for sure that not taking an action will cause more damage than taking it possibly could, it’s not a good idea to use precaution.

For example, following the 2011 accident at the Fukushima Daiichi nuclear plant, Japan shut down all of its nuclear power plants. Seeing as nuclear power is cheaper than fossil fuels, this resulted in a sharp increase in electricity prices in parts of the country. According to the authors of the paper Be Cautious with the Precautionary Principle, the resulting increase in mortality from people being unable to spend as much on heating was higher than the death toll from the accident itself.

When the risks are known and priced in. We all have different levels of risk appetite and we make judgments about whether certain activities are worth the risks involved. When a risk is priced in, that means people are aware of it and voluntarily decide it is worthwhile—or even desirable.

For example, riskier investments tend to have higher potential returns. Although they might not make sense for someone who doesn’t want to risk losing any money, they do make sense for those who consider the potential gains worth the potential losses.

When only a zero-risk option would be satisfactory. It’s impossible to completely avoid risks, so it doesn’t make much sense to exercise precaution with the expectation that a 100% safe option will appear.

When taking risks could strengthen us. As individuals, we can sometimes be overly risk averse and too cautious—to the point where it makes us fragile. Our ancestors had the best chance of surviving if they overreacted, rather than underreacted, to risks. But for many of us today, the biggest risk we face can be the stress caused by worrying too much about improbable dangers. We can end up fearing the kinds of risks, like social rejection, that are unavoidable and that tend to make us stronger if we embrace them as inevitable. Never taking any risks is generally a far worse idea than taking sensible ones.

***

We all face decisions every day that involve balancing risk. The Precautionary Principle is a tool that helps us determine when a particular choice is worth taking a gamble on, or when we need to sit tight and collect more information.

The Availability Bias: How to Overcome a Common Cognitive Distortion

“The attention which we lend to an experience is proportional to its vivid or interesting character, and it is a notorious fact that what interests us most vividly at the time is, other things equal, what we remember best.” —William James

The availability heuristic explains why winning an award makes you more likely to win another award. It explains why we sometimes avoid one thing out of fear and end up doing something else that’s objectively riskier. It explains why governments spend enormous amounts of money mitigating risks we’ve already faced. It explains why the five people closest to you have a big impact on your worldview. It explains why mountains of data indicating something is harmful don’t necessarily convince everyone to avoid it. It explains why it can seem as if everything is going well when the stock market is up. And it explains why bad publicity can still be beneficial in the long run.

Here’s how the availability heuristic works, how to overcome it, and how to use it to your advantage.

***

How the availability heuristic works

Before we explain the availability heuristic, let’s quickly recap the field it comes from.

Behavioral economics is a field of study bringing together knowledge from psychology and economics to reveal how real people behave in the real world. This is in contrast to the traditional economic view of human behavior, which assumed people always behave in accordance with rational, stable interests. The field largely began in the 1960s and 1970s with the work of psychologists Amos Tversky and Daniel Kahneman.

Behavioral economics posits that people often make decisions and judgments under uncertainty using imperfect heuristics, rather than by weighing up all of the relevant factors. Quick heuristics enable us to make rapid decisions without taking the time and mental energy to think through all the details.

Most of the time, they lead to satisfactory outcomes. However, they can bias us towards certain consistently irrational decisions that contradict what economics would tell us is the best choice. We usually don’t realize we’re using heuristics, and they’re hard to change even if we’re actively trying to be more rational.

One such cognitive shortcut is the availability heuristic, first studied by Tversky and Kahneman in 1973. We tend to judge the likelihood and significance of things based on how easily they come to mind. The more “available” a piece of information is to us, the more important it seems. The result is that we give greater weight to information we learned recently: a news article we read last night comes to mind more easily than a science class we took years ago. It’s too much work to comb through every piece of information that might be in our heads.

We also give greater weight to information that is shocking or unusual. Shark attacks and plane crashes strike us more than accidental drownings or car accidents, so we overestimate their odds.

If we’re presented with a set of similar things where one differs from the rest, we’ll find the odd one out easier to remember. For example, in the sequence of characters “RTASDT9RTGS,” the character most commonly remembered is the “9” because it stands out from the letters.

In Behavioural Law and Economics, Timur Kuran and Cass Sunstein write:

“Additional examples from recent years include mass outcries over Agent Orange, asbestos in schools, breast implants, and automobile airbags that endanger children. Their common thread is that people tended to form their risk judgments largely, if not entirely, on the basis of information produced through a social process, rather than personal experience or investigation. In each case, a public upheaval occurred as vast numbers of players reacted to each other’s actions and statements. In each, moreover, the demand for swift, extensive, and costly government action came to be considered morally necessary and socially desirable—even though, in most or all cases, the resulting regulations may well have produced little good, and perhaps even relatively more harm.”

Narratives are more memorable than disjointed facts. There’s a reason why cultures around the world teach important life lessons and values through fables, fairy tales, myths, proverbs, and stories.

Personal experience can also make information more salient. If you’ve recently been in a car accident, you may well view car accidents as more common in general than you did before. The base rates haven’t changed; you just have an unpleasant, vivid memory coming to mind whenever you get in a car. We too easily assume that our recollections are representative and true and discount events that are outside of our immediate memory. To give another example, you may be more likely to buy insurance against a natural disaster if you’ve just been impacted by one than you are before it happens.

Anything that makes something easier to remember increases its impact on us. In an early study, Tversky and Kahneman asked subjects whether a random English word is more likely to begin with “K” or have “K” as the third letter. Seeing as it’s typically easier to recall words beginning with a particular letter, people tended to assume the former was more common. The opposite is true.

In Judgment Under Uncertainty: Heuristics and Biases, Tversky and Kahneman write:

“…one may estimate probability by assessing availability, or associative distance. Lifelong experience has taught us that instances of large classes are recalled better and faster than instances of less frequent classes, that likely occurrences are easier to imagine than unlikely ones, and that associative connections are strengthened when two events frequently co-occur.

…For example, one may assess the divorce rate in a given community by recalling divorces among one’s acquaintances; one may evaluate the probability that a politician will lose an election by considering various ways in which he may lose support; and one may estimate the probability that a violent person will ‘see’ beasts of prey in a Rorschach card by assessing the strength of association between violence and beasts of prey. In all of these cases, the assessment of the frequency of a class or the probability of an event is mediated by an assessment of availability.”

They go on to write:

“That associative bonds are strengthened by repetition is perhaps the oldest law of memory known to man. The availability heuristic exploits the inverse form of this law, that is, it uses strength of association as a basis for the judgment of frequency. In this theory, availability is a mediating variable, rather than a dependent variable as is typically the case in the study of memory.”

***

How the availability heuristic misleads us

“People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media.” —Daniel Kahneman, Thinking, Fast and Slow

To go back to the points made in the introduction of this post, winning an award can make you more likely to win another award because it gives you visibility, making your name come to mind more easily in connection to that kind of accolade. We sometimes avoid one thing in favor of something objectively riskier, like driving instead of taking a plane, because the dangers of the latter are more memorable. The five people closest to you can have a big impact on your worldview because you frequently encounter their attitudes and opinions, bringing them to mind when you make your own judgments. Mountains of data indicating something is harmful don’t always convince people to avoid it if those dangers aren’t salient, such as if they haven’t personally experienced them. It can seem as if things are going well when the stock market is up because it’s a simple, visible, and therefore memorable indicator. Bad publicity can be beneficial in the long run if it means something, such as a controversial book, gets mentioned often and is more likely to be recalled.

These aren’t empirical rules, but they’re logical consequences of the availability heuristic, in the absence of mitigating factors.

We are what we remember, and our memories have a significant impact on our perception of the world. What we end up remembering is influenced by factors such as the following:

  • Our foundational beliefs about the world
  • Our expectations
  • The emotions a piece of information inspires in us
  • How many times we’re exposed to a piece of information
  • The source of a piece of information

There is no real link between how memorable something is and how likely it is to happen. In fact, the opposite is often true. Unusual events stand out more and receive more attention than commonplace ones. As a result, the availability heuristic skews our perception of risks in two key ways:

We overestimate the likelihood of unlikely events. And we underestimate the likelihood of likely events.

Overestimating the risk of unlikely events leads us to stay awake at night, turning our hair grey, worrying about things that have almost no chance of happening. We can end up wasting enormous amounts of time, money, and other resources trying to mitigate things that have, on balance, a small impact. Sometimes those mitigation efforts end up backfiring, and sometimes they make us feel safer than they should.

On the flipside, we can overestimate the chance of unusually good things happening to us. Looking at everyone’s highlights on social media, we can end up expecting our own lives to also be a procession of grand achievements and joys. But most people’s lives are mundane most of the time, and the highlights we see tend to be exceptional ones, not routine ones.

Underestimating the risk of likely events leads us to fail to prepare for predictable problems and occurrences. We’re so worn out from worrying about unlikely events, we don’t have the energy to think about what’s in front of us. If you’re stressed and anxious much of the time, you’ll have a hard time paying attention to those signals when they really matter.

All of this is not to say that you shouldn’t prepare for the worst. Or that unlikely things never happen (as Littlewood’s Law states, you can expect a one-in-a-million event at least once per month). Rather, we should be careful about only preparing for the extremes because those extremes are more memorable.
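Littlewood’s Law is just arithmetic. Its standard assumption is that an alert person registers roughly one discrete “event” per second for about eight hours a day; at that rate, a million events pass in a little over a month:

```python
# Littlewood's assumptions: one "event" per second, eight alert hours a day
events_per_second = 1
alert_hours_per_day = 8

events_per_day = events_per_second * alert_hours_per_day * 3600
days_to_a_million = 1_000_000 / events_per_day

print(events_per_day)                # 28800 events per day
print(round(days_to_a_million, 1))   # 34.7 days, about a month
```

So a one-in-a-million experience isn’t a miracle; at ordinary rates of experience, it’s roughly a monthly occurrence.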

***

How to overcome the availability heuristic

Knowing about a cognitive bias isn’t usually enough to overcome it. Even people like Kahneman who have studied behavioral economics for many years sometimes struggle with the same irrational patterns. But being aware of the availability heuristic is helpful for the times when you need to make an important decision and can step back to make sure it isn’t distorting your view. Here are five ways of mitigating the availability heuristic.

#1. Always consider base rates when making judgments about probability.
The base rate of something is the average prevalence of it within a particular population. For example, around 10% of the population are left-handed. If you had to guess the likelihood of a random person being left-handed, you would be correct to say 1 in 10 in the absence of other relevant information. When judging the probability of something, look at the base rate whenever possible.
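Bayes’ rule makes precise how a base rate should be combined with new, more vivid evidence. A minimal sketch, using illustrative numbers (a condition with a 1% base rate and an imperfect test):

```python
def posterior(base_rate, true_positive_rate, false_positive_rate):
    """Bayes' rule: probability the hypothesis is true given a positive signal."""
    p_signal = (base_rate * true_positive_rate
                + (1 - base_rate) * false_positive_rate)
    return base_rate * true_positive_rate / p_signal

# Illustrative: 1% base rate, a test that is 90% sensitive
# but also gives false positives 9% of the time
p = posterior(0.01, 0.90, 0.09)
print(round(p, 3))  # 0.092
```

Even after a positive result from a fairly accurate test, the probability is still only about 9%, because the low base rate dominates. Judging by the vivid evidence alone, most people would guess far higher.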

#2. Focus on trends and patterns.
The mental model of regression to the mean teaches us that extreme events tend to be followed by more moderate ones. Outlier events are often the result of luck and randomness. They’re not necessarily instructive. Whenever possible, base your judgments on trends and patterns—the longer term, the better. Track record is everything, even if outlier events are more memorable.
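Regression to the mean falls out of any process that mixes stable skill with random luck. A small simulation, with illustrative parameters (skill fixed at 50, luck as noise with a standard deviation of 10):

```python
import random

random.seed(42)

SKILL = 50  # stable underlying ability

def performance():
    # each period's result is skill plus random luck
    return SKILL + random.gauss(0, 10)

scores = [performance() for _ in range(10_000)]

# Look at what happens in the period right after an extreme result (> 70)
followers = [scores[i + 1] for i in range(len(scores) - 1) if scores[i] > 70]
avg_follower = sum(followers) / len(followers)

# Outlier scores are mostly luck, so the following period
# tends back toward the true skill level of 50
print(round(avg_follower, 1))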

#3. Take the time to think before making a judgment.
The whole point of heuristics is that they save the time and effort needed to parse a ton of information and make a judgment. But, as we always say, you can’t make a good decision without taking time to think. There’s no shortcut for that. If you’re making an important decision, the only way to get around the availability heuristic is to stop and go through the relevant information, rather than assuming whatever comes to mind first is correct.

#4. Keep track of information you might need to use in a judgment far off in the future.
Don’t rely on memory. In Judgment in Managerial Decision-Making, Max Bazerman and Don Moore present the example of workplace annual performance appraisals. Managers tend to base their evaluations more on the prior three months than the nine months before that. It’s much easier than remembering what happened over the course of an entire year. Managers also tend to give substantial weight to unusual one-off behavior, such as a serious mistake or notable success, without considering the overall trend. In this case, noting down observations on someone’s performance throughout the entire year would lead to a more accurate appraisal.

#5. Go back and revisit old information.
Even if you think you can recall everything important, it’s a good idea to go back and refresh your memory of relevant information before making a decision.

The availability heuristic is part of Farnam Street’s latticework of mental models.

Efficiency is the Enemy

There’s a good chance most of the problems in your life and work come down to insufficient slack. Here’s how slack works and why you need more of it.

Imagine if you, as a budding productivity enthusiast, one day gained access to a time machine and decided to take a trip back several decades to the office of one of your old-timey business heroes. Let’s call him Tony.

You disguise yourself as a janitor and figure a few days of observation should be enough to reveal the secret of that CEO’s incredible productivity and shrewd decision-making. You want to learn the habits and methods that enabled him to transform an entire industry for good.

Arriving at the (no doubt smoke-filled) office, you’re a little surprised to find it’s far from a hive of activity. In fact, the people you can see around seem to be doing next to nothing. Outside your hero’s office, his secretary lounges at her desk (and let’s face it, the genders wouldn’t have been the other way around). Let’s call her Gloria. She doesn’t appear busy at all. You observe for half an hour as she reads, tidies her desk, and chats with other secretaries who pass by. They don’t seem busy either. Confused as to why Tony would squander money on idle staff, you stick around for a few more hours.

With a bit more observation, you realize your initial impression was entirely wrong. Gloria does indeed do nothing much of the time. But every so often, a request, instruction, or alert comes from Tony and she leaps into action. Within minutes, she answers the call, sends the letter, reschedules the appointment, or finds the right document. Any time he has a problem, she solves it right away. There’s no to-do list, no submitting a ticket, no waiting for a reply to an email for either Tony or Gloria.

As a result, Tony’s day goes smoothly and efficiently. Every minute of his time goes on the most important part of his work—making decisions—and not on dealing with trivial inconveniences like waiting in line at the post office.

All that time Gloria spends doing nothing isn’t wasted time. It’s slack: excess capacity allowing for responsiveness and flexibility. The slack time is important because it means she never has a backlog of tasks to complete. She can always deal with anything new straight away. Gloria’s job is to ensure Tony is as busy as he needs to be. It’s not to be as busy as possible.

If you ever find yourself stressed, overwhelmed, sinking into stasis despite wanting to change, or frustrated when you can’t respond to new opportunities, you need more slack in your life.

In Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency, Tom DeMarco explains that most people and organizations fail to recognize the value of slack. Although the book is now around twenty years old, its primary message is timeless and worth revisiting.

***

The enemy of efficiency

“You’re efficient when you do something with minimum waste. And you’re effective when you’re doing the right something.”

Many organizations are obsessed with efficiency. They want to be sure every resource is utilized to its fullest capacity and everyone is sprinting around every minute of the day doing something. They hire expert consultants to sniff out the faintest whiff of waste.

As individuals, many of us are also obsessed with the mirage of total efficiency. We schedule every minute of our day, pride ourselves on forgoing breaks, and berate ourselves for the slightest moment of distraction. We view sleep, sickness, and burnout as unwelcome weaknesses and idolize those who never seem to succumb to them. This view, however, fails to recognize that efficiency and effectiveness are not the same thing.

Total efficiency is a myth. Let’s return to Gloria and Tony. Imagine if Tony decided to assign her more work to ensure she spends a full eight hours a day busy. Would that be more efficient? Not really. Slack time enables her to respond to his requests right away, thus being effective at her job. If Gloria is already occupied, Tony will have to wait and whatever he’s doing will get held up. Both of them would be less effective as a result.

Any time we eliminate slack, we create a build-up of work. DeMarco writes, “As a practical matter, it is impossible to keep everyone in the organization 100 percent busy unless we allow for some buffering at each employee’s desk. That means there is an inbox where work stacks up.”
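DeMarco’s point about buffering is queueing theory in miniature. In the simplest textbook model (an M/M/1 queue with random arrivals and random service times), the average number of items stacked up in the inbox is ρ²/(1 − ρ), where ρ is utilization—and it explodes as busyness approaches 100%. A quick sketch with illustrative utilization levels:

```python
def avg_items_waiting(utilization):
    """Average queue length in an M/M/1 queue: rho^2 / (1 - rho)."""
    return utilization ** 2 / (1 - utilization)

for rho in (0.5, 0.8, 0.9, 0.99):
    print(f"{rho:.0%} busy -> {avg_items_waiting(rho):.1f} items stacked up")
```

Going from 50% to 99% busy doesn’t double the backlog; it multiplies it nearly two-hundredfold. The cost of squeezing out the last bits of idle time is wildly disproportionate to the capacity gained.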

Many of us have come to expect work to involve no slack time because of the negative way we perceive it. In a world of manic efficiency, slack often comes across as laziness or a lack of initiative. Without slack time, however, we can’t get through new tasks straight away, and if someone insists we do, we have to drop whatever we were previously doing. One way or another, something gets delayed. The increase in busyness may well be futile:

“It’s possible to make an organization more efficient without making it better. That’s what happens when you drive out slack. It’s also possible to make an organization a little less efficient and improve it enormously. In order to do that, you need to reintroduce enough slack to allow the organization to breathe, reinvent itself, and make necessary change.”

***

Defining slack

DeMarco defines slack as “the degree of freedom required to effect change. Slack is the natural enemy of efficiency and efficiency is the natural enemy of slack.” Elsewhere, he writes: “Slack represents operational capacity sacrificed in the interests of long-term health.”

To illustrate the concept, DeMarco asks the reader to imagine one of those puzzle games consisting of eight numbered tiles in a box, with one empty space so you can slide them around one at a time. The objective is to shuffle the tiles into numerical order. That empty space is the equivalent of slack. If you remove it, the game is technically more efficient, but “something else is lost. Without the open space, there is no further possibility of moving tiles at all. The layout is optimal as it is, but if time proves otherwise, there is no way to change it.”

Having a little bit of wiggle room allows us to respond to changing circumstances, to experiment, and to do things that might not work.

Slack consists of excess resources. It might be time, money, people on a job, or even expectations. Slack is vital because it prevents us from getting locked into our current state, unable to respond or adapt because we just don’t have the capacity.

Not having slack is taxing. Scarcity weighs on our minds and uses up energy that could go toward doing the task at hand better. It amplifies the impact of failures and unintended consequences.

Too much slack is bad because resources get wasted and people get bored. But, on the whole, an absence of slack is a problem far more often than an excess of it. If you give yourself too much slack time when scheduling a project that goes smoother than expected, you probably won’t spend the spare time sitting like a lemon. Maybe you’ll recuperate from an earlier project that took more effort than anticipated. Maybe you’ll tinker with some on-hold projects. Maybe you’ll be able to review why this one went well and derive lessons for the future. And maybe slack time is just your reward for doing a good job already! You deserve breathing room.

Slack also allows us to handle the inevitable shocks and surprises of life. If every hour in our schedules is accounted for, we can’t slow down to recover from a minor cold, shift a bit of focus to learning a new skill for a while, or absorb a couple of hours of technical difficulties.

In general, you need more slack than you expect. Unless you have a lot of practice, your estimations of how long things will take or how difficult they are will almost always be on the low end. Most of us treat best-case scenarios as if they are the most likely scenarios and will inevitably come to pass, but they rarely do.

You also need to keep a vigilant eye on how fast you use up your slack so you can replenish it in time. For example, you might want to review your calendar once per week to check it still has white space each day and you haven’t allowed meetings to fill up your slack time. Think of the forms of slack that are more important to you, then check up on them regularly. If you find you’re running out of slack, take action.

Once in a while, you might need to forgo slack to reap the benefits of constraints. Lacking slack in the short term or in a particular area can force you to be more inventive. If you find yourself struggling to come up with a creative solution, try consciously reducing your slack. For example, give yourself five minutes to brainstorm ideas or ask yourself what you might do if your budget were slashed by 90%.

Most of the time, though, it’s critical to guard your slack with care. It’s best to assume you’ll always tend toward using it up—or other people will try to steal it from you. Set clear boundaries in your work and keep an eye on tasks that might inflate.

***

Slack and change

In the past, people and organizations could sometimes get by without much slack—at least for a while. Now, even as slack keeps becoming more and more vital for survival, we’re keener than ever to eliminate it in the name of efficiency. Survival requires constant change and reinvention, which “require a commodity that is absent in our time as it has never been before. That commodity—the catalytic ingredient of change—is slack.” DeMarco goes on to write:

“Slack is the time when reinvention happens. It is time when you are not 100 percent busy doing the operational business of your firm. Slack is the time when you are 0 percent busy. Slack at all levels is necessary to make the organization work effectively and to grow. It is the lubricant of change. Good companies excel in creative use of slack. And bad ones only obsess about removing it.”

Only when we are 0 percent busy can we step back and look at the bigger picture of what we’re doing. Slack allows us to think ahead. To consider whether we’re on the right trajectory. To contemplate unseen problems. To mull over information. To decide if we’re making the right trade-offs. To do things that aren’t scalable or that might not have a chance to prove profitable for a while. To walk away from bad deals.

***

Slack and productivity

The irony is that we achieve far more in the long run when we have slack. We are more productive when we don’t try to be productive all the time.

DeMarco explains that the amount of work each person in an organization has is never static: “Things change on a day-to-day basis. This results in new unevenness of the tasks, with some people incurring additional work (their buffers build up), while others become less loaded, since someone ahead of them in the work chain is slower to generate their particular kind of work to pass along.” An absence of slack is unsustainable. Inevitably, we end up needing additional resources, which have to come from somewhere.

Being comfortable with sometimes being 0 percent busy means we think about whether we’re doing the right thing. This is in contrast to grabbing the first task we see so no one thinks we’re lazy. The expectation of “constant busyness means efficiency” creates pressure to always look occupied and keep a buffer of work on hand. If we see our buffer shrinking and we want to keep busy, the only possible solution is to work slower.

Trying to eliminate slack causes work to expand. There’s never any free time because we always fill it.

Amos Tversky said the secret to doing good research is to always be a little underemployed; you waste years by not being able to waste hours. Those wasted hours are necessary to figure out if you’re headed in the right direction.

The OODA Loop: How Fighter Pilots Make Fast and Accurate Decisions

The OODA Loop is a four-step process for making effective decisions in high-stakes situations. It involves collecting relevant information, recognizing potential biases, deciding, and acting, then repeating the process with new information. Read on to learn how to use the OODA Loop.

When we want to learn how to make rational decisions under pressure, it can be helpful to look at the techniques people use in extreme situations. If they work in the most drastic scenarios, they have a good chance of being effective in more typical ones.

Because they’re developed and tested in the relentless laboratory of conflict, military mental models have practical applications far beyond their original context. If they didn’t work, they would be quickly replaced by alternatives. Military leaders and strategists invest a great deal of time and resources into developing decision-making processes.

One such military mental model is the OODA Loop. Developed by strategist and U.S. Air Force Colonel John Boyd, the OODA Loop is a practical concept designed to function as the foundation of rational thinking in confusing or chaotic situations. “OODA” stands for “Observe, Orient, Decide, and Act.”

“What is strategy? A mental tapestry of changing intentions for harmonizing and focusing our efforts as a basis for realizing some aim or purpose in an unfolding and often unforeseen world of many bewildering events and many contending interests.” —John Boyd

***

The four parts of the OODA Loop

Let’s break down the four parts of the OODA Loop and see how they fit together.

Don’t forget the “Loop” part. The process is intended to be repeated again and again until a conflict finishes. Each repetition provides more information to inform the next one, making it a feedback loop.

1: Observe

Step one is to observe the situation with the aim of building the most accurate and comprehensive picture of it possible.

For example, a fighter pilot might consider the following factors in a broad, fluid way:

  • What is immediately affecting me?
  • What is affecting my opponent?
  • What could affect either of us later on?
  • Can I make any predictions?
  • How accurate were my prior predictions?

Information alone is insufficient. The observation stage requires converting information into an overall picture with overarching meaning that places it in context. A particularly vital skill is the capacity to identify which information is just noise and irrelevant for the current decision.

If you want to make good decisions, you need to master the art of observing your environment. For a fighter pilot, that involves factors like the weather conditions and what their opponent is doing. In your workplace, that might include factors like regulations, available resources, relationships with other people, and your current state of mind.

To give an example, consider a doctor meeting with a patient in the emergency room for the first time to identify how to treat them. Their first priority is figuring out what information they need to collect, then collecting it. They might check the patient’s records, ask other staff about the admission, ask the patient questions, check vital signs such as blood pressure, and order particular diagnostic tests. Doctors learn to pick up on subtle cues that can be telling of particular conditions, such as a patient’s speech patterns, body language, what they’ve brought with them to the hospital, and even their smell. In some cases, the absence (rather than presence) of certain cues is also important. At the same time, a doctor needs to discard irrelevant information, then put all the pieces together before they can treat the patient.

2: Orient

“Orientation isn’t just a state you’re in; it’s a process. You’re always orienting.” —John Boyd

The second stage of the OODA Loop, orient, is less intuitive than the other steps. However, it’s worth making the effort to understand it rather than skipping it. Boyd referred to it as the Schwerpunkt, meaning “the main emphasis” in German.

To orient yourself is to recognize any barriers that might interfere with the other parts of the OODA Loop.

Orientation means connecting yourself with reality and seeing the world as it really is, as free as possible from the influence of cognitive biases and shortcuts. You can give yourself an edge over the competition by making sure you always orient before making a decision, instead of just jumping in.

Boyd maintained that properly orienting yourself can be enough to overcome an initial disadvantage, such as fewer resources or less information, to outsmart an opponent. He identified the following four main barriers that impede our view of objective information:

  1. Our cultural traditions – we don’t realize how much of what we consider universal behavior is actually culturally prescribed
  2. Our genetic heritage – we all have certain constraints
  3. Our ability to analyze and synthesize – if we haven’t practiced and developed our thinking skills, we tend to fall back on old habits
  4. The influx of new information – it is hard to make sense of observations when the situation keeps changing

Prior to Charlie Munger’s popularization of the concept of building a toolbox of mental models, Boyd advocated a similar approach for pilots to help them better navigate the orient stage of the OODA Loop. He recommended a process of “deductive destruction”: paying attention to your own assumptions and biases, then finding fundamental mental models to replace them.

Similar to using a decision journal, deductive destruction ensures you always learn from past mistakes and don’t keep on repeating them. In one talk, Boyd employed a brilliant metaphor for developing a latticework of mental models. He compared it to building a snowmobile, a vehicle comprising elements of several different devices, such as the caterpillar treads of a tank, skis, the outboard motor of a boat, and the handlebars of a bike.

Individually, each of these items isn’t enough to move you around. But combined they create a functional vehicle. As Boyd put it:

“A loser is someone (individual or group) who cannot build snowmobiles when facing uncertainty and unpredictable change; whereas a winner is someone (individual or group) who can build snowmobiles, and employ them in an appropriate fashion, when facing uncertainty and unpredictable change.”

To orient yourself, you have to build a metaphorical snowmobile by combining practical concepts from different disciplines. (For more on mental models, we literally wrote the book on them.) Although Boyd is regarded as a military strategist, he didn’t confine himself to any particular discipline. His theories encompass ideas drawn from various disciplines, including mathematical logic, biology, psychology, thermodynamics, game theory, anthropology, and physics. Boyd described his approach as a “scheme of pulling things apart (analysis) and putting them back together (synthesis) in new combinations to find how apparently unrelated ideas and actions can be related to one another.”

3: Decide

There are no surprises here. The previous two steps provide the groundwork you need to make an informed decision. If there are multiple options at hand, you need to use your observation and orientation to select one.

Boyd cautioned against first-conclusion bias, explaining that we cannot keep making the same decision again and again. This part of the loop needs to be flexible and open to Bayesian updating. In some of his notes, Boyd described this step as the hypothesis stage. The implication is that we should test the decisions we make at this point in the loop, spotting their flaws and including any issues in future observation stages.

4: Act

There’s a difference between making decisions and enacting decisions. Once you make up your mind, it’s time to take action.

By taking action, you test your decision out. The results will hopefully indicate whether it was a good one or not, providing information for when you cycle back to the first part of the OODA Loop and begin observing anew.
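Taken together, the four steps form a feedback loop. A toy sketch can show the shape of it; everything here (the hidden-number “environment” and the bisection tactic) is a made-up stand-in, not part of Boyd’s formulation, but each cycle’s action produces a result that feeds the next observation:

```python
def run_ooda(target, low=0, high=100):
    """Find a hidden target by cycling Observe -> Orient -> Decide -> Act."""
    decisions = []
    while True:
        # 1. Observe: gather the available signal from the environment.
        guess = (low + high) // 2
        signal = ("hit" if guess == target
                  else "low" if guess < target else "high")
        decisions.append(guess)
        # 2. Orient: interpret the signal in light of everything learned so
        #    far (every earlier cycle has already narrowed [low, high]).
        if signal == "hit":
            return guess, decisions
        # 3. Decide: choose the hypothesis to test on the next cycle.
        if signal == "low":
            low = guess + 1
        else:
            high = guess - 1
        # 4. Act: commit, then loop back to Observe with better information.

found, attempts = run_ooda(target=37)
print(found, attempts)
```

The point of the sketch is the looping, not the arithmetic: no single pass through the four steps solves the problem, but each pass leaves the agent better oriented than the last.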

***

Why the OODA Loop works

“The ability to operate at a faster tempo or rhythm than an adversary enables one to fold the adversary back inside himself so that he can neither appreciate nor keep up with what is going on. He will become disoriented and confused.” —John Boyd

We’ve identified three key benefits of using the OODA Loop.

1: Deliberate speed

As we’ve established, fighter pilots have to make many decisions in fast succession. They don’t have time to list pros and cons or to consider every available avenue. Once the OODA Loop becomes part of their mental toolboxes, they should be able to cycle through it in a matter of seconds.

Speed is a crucial element of military decision-making. Using the OODA Loop in everyday life, we probably have a little more time than a fighter pilot would. But Boyd emphasized the value of being decisive, taking initiative, and staying autonomous. These are universal assets and apply to many situations.

2: Comfort with uncertainty

There’s no such thing as total certainty. If you’re making a decision at all, it’s because something is uncertain. But uncertainty does not always have to equate to risk.

A fighter pilot is in a precarious situation, one in which there will be gaps in their knowledge. They cannot read the mind of the opponent and might have incomplete information about the weather conditions and surrounding environment. They can, however, take into account key factors such as the opponent’s type of airplane and what their maneuvers reveal about their intentions and level of training. If the opponent uses an unexpected strategy, is equipped with a new type of weapon or airplane, or behaves in an irrational way, the pilot must accept the accompanying uncertainty. However, Boyd stressed that uncertainty is irrelevant if we have the right filters in place.

If we can’t cope with uncertainty, we end up stuck in the observation stage. This sometimes happens when we know we need to make a decision, but we’re scared of getting it wrong. So we keep on reading books and articles, asking people for advice, listening to podcasts, and so on.

Acting under uncertainty is unavoidable. If we do have the right filters, we can factor uncertainty into the observation stage. We can leave a margin of error. We can recognize the elements that are within our control and those that are not.

In presentations, Boyd referred to three key principles to support his ideas: Gödel’s theorems, Heisenberg’s Uncertainty Principle, and the Second Law of Thermodynamics. Of course, we’re using these principles in a different way from their initial purpose and in a simplified, non-literal form.

Gödel’s theorems indicate that any mental model we have of reality will omit certain information and that Bayesian updating must be used to bring it in line with reality. For fighter pilots, their understanding of what is going on during a battle will always have gaps. Identifying this fundamental uncertainty gives it less power over us.
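“Bayesian updating” here just means revising a belief in proportion to new evidence. A minimal illustration (the prior and likelihood numbers below are invented for the example):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * likelihood_if_true
    evidence = numerator + (1 - prior) * likelihood_if_false
    return numerator / evidence

# Start 50/50 on "the opponent is well trained," then observe a maneuver
# that well-trained pilots execute 90% of the time and novices only 20%.
belief = 0.5
belief = bayes_update(belief, 0.9, 0.2)   # rises sharply after one observation
belief = bayes_update(belief, 0.9, 0.2)   # and sharpens further on each loop
print(round(belief, 3))
```

Each pass through the loop treats the previous posterior as the new prior, which is exactly the kind of incremental correction the orient stage calls for.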

The second concept Boyd referred to is Heisenberg’s Uncertainty Principle. In its simplest form, this principle describes a limit on the precision with which certain pairs of physical properties, such as position and velocity, can be known at the same time. The more precisely we pin down a body’s position, the less precisely we can know its velocity, and vice versa.
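In symbols (stated in terms of momentum, mass times velocity): the product of the two uncertainties can never fall below a fixed bound set by the reduced Planck constant,

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

so squeezing the position uncertainty $\Delta x$ toward zero necessarily inflates the momentum uncertainty $\Delta p$.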

Boyd moved the concept of the Uncertainty Principle from particles to planes. If a pilot focuses too hard on where an enemy plane is, they will lose track of where it is going and vice versa. Trying harder to track the two variables will actually lead to more inaccuracy!

Finally, Boyd made use of the Second Law of Thermodynamics. In a closed system, entropy always increases and everything moves towards chaos. Energy spreads out and becomes disorganized.
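In symbols: for a system that exchanges nothing with its surroundings, the entropy $S$ can stay constant or grow over time, but never fall,

```latex
\frac{dS}{dt} \;\ge\; 0
```

which is why such a system inevitably drifts toward disorder unless something from outside intervenes.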

Although Boyd’s notes do not specify the exact applications, his inference appears to be that a fighter pilot must be an open system or they will fail. They must draw “energy” (information) from outside themselves or the situation will become chaotic. They should also aim to cut their opponent off, forcing them to become a closed system.

3: Unpredictability

When you act fast enough, other people view you as unpredictable. They can’t figure out the logic behind your decisions.

Boyd recommended making unpredictable changes in speed and direction, writing, “We should operate at a faster tempo than our adversaries or inside our adversaries[’] time scales.…Such activity will make us appear ambiguous (non predictable) [and] thereby generate confusion and disorder among our adversaries.” He even helped design planes that were better equipped to make those unpredictable changes.

For the same reason that you can’t run the same play seventy times in a football game, rigid military strategies often become useless after a few uses, or even one iteration, as opponents learn to recognize and counter them. The OODA Loop can be endlessly used because it is a formless strategy, unconnected to any particular maneuvers.

We know that Boyd was influenced by Sun Tzu (he owned seven thoroughly annotated copies of The Art of War) and drew many ideas from the ancient strategist. Sun Tzu depicts war as a game of deception where the best strategy is that which an opponent cannot preempt.

***

Forty Second Boyd

“Let your plans be dark and impenetrable as night, and when you move, fall like a thunderbolt.” —Sun Tzu

Boyd was no armchair strategist. He developed his ideas through extensive experience as a fighter pilot. His nickname “Forty Second Boyd” speaks to his expertise: Boyd could win any aerial battle in less than forty seconds.

In a tribute written after Boyd’s death, General C.C. Krulak described him as “a towering intellect who made unsurpassed contributions to the American art of war. Indeed, he was one of the central architects of the reform of military thought.…From John Boyd we learned about competitive decision-making on the battlefield—compressing time, using time as an ally.”

Reflecting Robert Greene’s maxim that everything is material, Boyd spent his career observing people and organizations. How do they adapt to changeable environments in conflicts, business, and other situations?

Over time, he deduced that these situations are characterized by uncertainty. Dogmatic, rigid theories are unsuitable for chaotic situations. Rather than trying to rise through the military ranks, Boyd focused on using his position as a colonel to compose a theory of the universal logic of war.

Boyd was known to ask his mentees the poignant question, “Do you want to be someone, or do you want to do something?” In his own life, he certainly focused on the latter path and, as a result, left us ideas with tangible value. The OODA Loop is just one of many.

Boyd developed the OODA Loop with fighter pilots in mind, but like all good mental models, it works in other fields beyond combat. It’s used in intelligence agencies. It’s used by lawyers, doctors, businesspeople, politicians, law enforcement, marketers, athletes, coaches, and more.

If you have to work fast, you might want to learn a thing or two from fighter pilots. For them, a split-second of hesitation can cost them their lives. As anyone who has ever watched Top Gun knows, pilots have a lot of decisions and processes to juggle when they’re in dogfights (close-range aerial battles). Pilots move at high speeds and need to avoid enemies while tracking them and keeping a contextual knowledge of objectives, terrains, fuel, and other key variables.

And as any pilot who has been in one will tell you, dogfights are nasty. No one wants them to last longer than necessary because every second increases the risk of something going wrong. Pilots have to rely on their decision-making skills—they can’t just follow a schedule or to-do list to know what to do.

***

Applying the OODA Loop

“We can’t just look at our own personal experiences or use the same mental recipes over and over again; we’ve got to look at other disciplines and activities and relate or connect them to what we know from our experiences and the strategic world we live in.” —John Boyd

In sports, there is an adage that carries over to business quite well: “Speed kills.” If you are able to be nimble, assess the ever-changing environment, and adapt quickly, you’ll always carry the advantage over any opponents.

Start applying the OODA Loop to your day-to-day decisions and watch what happens. You’ll start to notice things that you would have been oblivious to before. Before jumping to your first conclusion, you’ll pause to consider your biases, take in additional information, and be more thoughtful of consequences.

As with anything you practice, if you do it right, the more you do it, the better you’ll get. You’ll make better decisions and come closer to your full potential. You’ll see more rapid progress. And as John Boyd would prescribe, you’ll start to do something in your life, and not just be somebody.

***

We hope you’ve enjoyed our three-week exploration of perspectives on decision making. We think there is value in juxtaposing different ideas to help us learn. Stay tuned for more topic-specific series in the future.

Avoiding Bad Decisions

Sometimes success is just about avoiding failure.

At FS, we help people make better decisions without needing to rely on getting lucky. One aspect of decision-making that’s rarely talked about is how to avoid making bad decisions.

Here are five of the biggest reasons we make bad decisions.

***

1. We’re unintentionally stupid

We like to think that we can rationally process information like a computer, but we can’t. Cognitive biases explain why we made bad decisions but rarely help us avoid making them in the first place. It’s better to focus on the warning signs that signal something is about to go wrong.

Warning signs you’re about to unintentionally do something stupid:

  • You’re tired, emotional, in a rush, or distracted.
  • You’re operating in a group or working with an authority figure.

The rule: Never make important decisions when you’re tired, emotional, distracted, or in a rush.

2. We solve the wrong problem

The first person to state the problem rarely has the best insight into it. Once a problem is thrown out on the table, however, our type-A problem-solving nature kicks in, and we forget to first ask whether we’re solving the right problem.

Warning signs you’re solving the wrong problem:

  • You let someone else define the problem for you.
  • You’re far away from the problem.
  • You’re thinking about the problem at only one level or through a narrow lens.

The rule: Never let anyone define the problem for you.

3. We use incorrect or insufficient information

We like to believe that people tell us the truth. We like to believe the people we talk to understand what they are talking about. We like to believe that we have all the information.

Warning signs you have incorrect or insufficient information:

  • You’re speaking to someone who spoke to someone who spoke to someone. Someone will get in trouble when the truth comes out.
  • You’re reading about it in the news.

The rule: Seek out information from someone as close to the source as possible, because they’ve earned their knowledge and have an understanding that you don’t. When information is filtered (and it often is), first consider the incentives involved and then think of the proximity to earned knowledge.

4. We fail to learn

You know the person that sits beside you at work that has twenty years of experience but keeps making the same mistakes over and over? They don’t have twenty years of experience—they have one year of experience repeated twenty times. If you can’t learn, you can’t get better.

Most of us can observe and react accordingly. But to truly learn from our experiences, we must reflect on our reactions. Reflection has to be part of your process, not something you might do if you have time. Don’t use the excuse of being too busy or get too invested in protecting your ego. In short, we can’t learn from experience without reflection. Only reflection allows us to distill experience into something we can learn from to make better decisions in the future.

Warning signs you’re not learning:

  • You’re too busy to reflect.
  • You don’t keep track of your decisions.
  • You can’t calibrate your own decision-making.

The rule: Be less busy. Keep a learning journal. Reflect every day.

5. We focus on optics over outcomes

Our evolutionary programming conditions us to do what’s easy over what’s right. After all, it’s often easier to signal being virtuous than to actually be virtuous.

Warning signs you’re focused on optics:

  • You’re thinking about how you’ll defend your decision.
  • You’re knowingly choosing what’s defendable over what’s right.
  • You’d make a different decision if you owned the company.
  • You catch yourself saying this is what your boss would want.

The rule: Act as you would want an employee to act if you owned the company.

***

Avoiding bad decisions is just as important as making good ones. Knowing the warning signs and having a set of rules for your decision-making process limits the amount of luck you need to get good outcomes.

Your Thinking Rate Is Fixed

You can’t force yourself to think faster. If you try, you’re likely to end up making much worse decisions. Here’s how to improve the actual quality of your decisions instead of chasing hacks to speed them up.

If you’re a knowledge worker, as an ever-growing proportion of people are, the product of your job is decisions.

Much of what you do day to day consists of trying to make the right choices among competing options, meaning you have to process large amounts of information, discern what’s likely to be most effective for moving towards your desired goal, and try to anticipate potential problems further down the line. And all the while, you’re operating in an environment of uncertainty where anything could happen tomorrow.

When the product of your job is your decisions, you might find yourself wanting to be able to make more decisions more quickly so you can be more productive overall.

Chasing speed is a flawed approach. Because decisions—at least good ones—don’t come out of thin air. They’re supported by a lot of thinking.

While experience and education can grant you the pattern-matching abilities to make some kinds of decisions using intuition, you’re still going to run into decisions that require you to sit and consider the problem from multiple angles. You’re still going to need to schedule time to do nothing but think. Otherwise making more decisions will make you less productive overall, not more, because your decisions will suck.

Here’s a secret that might sound obvious but can actually transform the way you work: you can’t force yourself to think faster. Our brains just don’t work that way. The rate at which you make mental discernments is fixed.

Sure, you can develop your ability to do certain kinds of thinking faster over time. You can learn new methods for decision-making. You can develop your mental models. You can build your ability to focus. But if you’re trying to speed up your thinking so you can make an extra few decisions today, forget it.

***

Beyond the “hurry up” culture

Management consultant Tom DeMarco writes in Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency that many knowledge work organizations have a culture where the dominant message at all times is to hurry up.

Everyone is trying to work faster at all times, and they pressure everyone around them to work faster, too. No one wants to be perceived as a slacker. The result is that managers put pressure on their subordinates through a range of methods. DeMarco lists the following examples:

  • “Turning the screws on delivery dates (aggressive scheduling)
  • Loading on extra work
  • Encouraging overtime
  • Getting angry when disappointed
  • Noting one subordinate’s extraordinary effort and praising it in the presence of others
  • Being severe about anything other than superb performance
  • Expecting great things of all your workers
  • Railing against any apparent waste of time
  • Setting an example yourself (with the boss laboring so mightily there is certainly no time for anyone else to goof off)
  • Creating incentives to encourage desired behavior or results.”

All of these things increase pressure in the work environment and repeatedly reinforce the “hurry up!” message. They make managers feel like they’re moving things along faster. That way if work isn’t getting done, it’s not their fault. But, DeMarco writes, they don’t lead to meaningful changes in behavior that make the whole organization more productive. Speeding up often results in poor decisions that create future problems.

The reason more pressure doesn’t mean better productivity is that the rate at which we think is fixed.

We can’t force ourselves to start making faster decisions right now just because we’re faced with an unrealistic deadline. DeMarco writes, “Think rate is fixed. No matter what you do, no matter how hard you try, you can’t pick up the pace of thinking.”

If you’re doing a form of physical labor, you can move your body faster when under pressure. (Of course, if it’s too fast, you’ll get injured or won’t be able to sustain it for long.)

If you’re a knowledge worker, you can’t pick up the pace of mental discriminations just because you’re under pressure. Chances are good that you’re already going as fast as you can. Because guess what? You can’t voluntarily slow down your thinking, either.

***

The limits of pressure

Faced with added stress and unable to accelerate our brains instantaneously, we can do one of three things:

  • “Eliminate wasted time.
  • Defer tasks that are not on the critical path.
  • Stay late.”

Even if those might seem like positive things, they’re less advantageous than they appear at first glance. Their effects are marginal at best. The smarter and more qualified the knowledge worker, the less time they’re likely to be wasting anyway. Most people don’t enjoy wasting time. What you’re more likely to end up eliminating is valuable slack time for thinking.

Deferring non-critical tasks doesn’t save any time overall; it just pushes work forward—to the point where those tasks do become critical. Then something else gets deferred.

Staying late might work once in a while. Again, though, its effects are limited. If we keep doing it night after night, we run out of energy, our personal lives suffer, and we make worse decisions as a result.

None of the outcomes of increasing pressure result in more or better decisions. None of them speed up the rate at which people think. Even if an occasional, tactical increase in pressure (whether it comes from the outside or we choose to apply it to ourselves) can be effective, ongoing pressure increases are unsustainable in the long run.

***

Think rate is fixed

It’s incredibly important to truly understand the point DeMarco makes in this part of Slack: the rate at which we process information is fixed.

When you’re under pressure, the quality of your decisions plummets. You miss possible angles, you don’t think ahead, you do what makes sense now, you panic, and so on. Often, you make a snap judgment then grasp for whatever information will support it for the people you work with. You don’t have breathing room to stress-test your decisions.

The clearer you can think, the better your decisions will be. Trying to think faster can only cloud your judgment. It doesn’t matter how many decisions you make if they’re not good ones. As DeMarco reiterates throughout the book, you can be efficient without being effective.

Try making a list of the worst decisions you’ve made so far in your career. There’s a good chance most of them were made under intense pressure or without taking much time over them.

At Farnam Street, we write a lot about how to make better decisions, and we share a lot of tools for better thinking. We made a whole course on decision-making. But none of these resources are meant to immediately accelerate your thinking. Many of them require you to actually slow down a whole lot and spend more time on your decisions. They improve the rate at which you can do certain kinds of thinking, but it’s not going to be an overnight process.

***

Upgrading your brain

Some people read one of our articles or books about mental models and complain that it’s not an effective approach because it didn’t lead to an immediate improvement in their thinking. That’s unsurprising; our brains don’t work like that. Integrating new, better approaches takes a ton of time and repetition, just like developing any other skill. You have to keep on reflecting and making course corrections.

At the end of the day, your brain is going to go where it wants to go. You’re going to think the way you think. However much you build awareness of how the world works and learn how to reorient, you’re still, to use Jonathan Haidt’s metaphor from The Righteous Mind, a tiny rider atop a gigantic elephant. None of us can reshape how we think overnight.

Making good decisions is hard work. There’s a limit to how many decisions you can make in a day before you need a break. On top of that, many knowledge workers are in fields where the most relevant information has a short half-life. Making good decisions requires constant learning and verifying what you think you know.

If you want to make better decisions, you need to do everything you can to reduce the pressure you’re under. You need to let your brain take whatever time it needs to think through the problem at hand. You need to get out of a reactive mode, recognize when you need to pause, and spend more time looking at problems.

A good metaphor is installing an update to the operating system on your laptop. Would you rather install an update that fixes bugs and improves existing processes, or one that just makes everything run faster? Obviously, you’d prefer the former. The latter would just lead to more crashes. The same is true for updating your mental operating system.

Stop trying to think faster. Start trying to think better.