Tag: Decision Making

The OODA Loop: How Fighter Pilots Make Fast and Accurate Decisions

The OODA Loop is a four-step process for making effective decisions in high-stakes situations. It involves collecting relevant information, recognizing potential biases, deciding, and acting, then repeating the process with new information. Read on to learn how to use the OODA Loop.

When we want to learn how to make rational decisions under pressure, it can be helpful to look at the techniques people use in extreme situations. If they work in the most drastic scenarios, they have a good chance of being effective in more typical ones.

Because they’re developed and tested in the relentless laboratory of conflict, military mental models have practical applications far beyond their original context. If they didn’t work, they would be quickly replaced by alternatives. Military leaders and strategists invest a great deal of time and resources into developing decision-making processes.

One such military mental model is the OODA Loop. Developed by strategist and U.S. Air Force Colonel John Boyd, the OODA Loop is a practical concept designed to function as the foundation of rational thinking in confusing or chaotic situations. “OODA” stands for “Observe, Orient, Decide, and Act.”

“What is strategy? A mental tapestry of changing intentions for harmonizing and focusing our efforts as a basis for realizing some aim or purpose in an unfolding and often unforeseen world of many bewildering events and many contending interests.” —John Boyd

***

The four parts of the OODA Loop

Let’s break down the four parts of the OODA Loop and see how they fit together.

Don’t forget the “Loop” part. The process is intended to be repeated again and again until a conflict finishes. Each repetition provides more information to inform the next one, making it a feedback loop.

1: Observe

Step one is to observe the situation with the aim of building the most accurate and comprehensive picture of it possible.

For example, a fighter pilot might consider the following factors in a broad, fluid way:

  • What is immediately affecting me?
  • What is affecting my opponent?
  • What could affect either of us later on?
  • Can I make any predictions?
  • How accurate were my prior predictions?

Information alone is insufficient. The observation stage requires converting information into an overall picture with overarching meaning that places it in context. A particularly vital skill is the capacity to identify which information is just noise and irrelevant for the current decision.

If you want to make good decisions, you need to master the art of observing your environment. For a fighter pilot, that involves factors like the weather conditions and what their opponent is doing. In your workplace, that might include factors like regulations, available resources, relationships with other people, and your current state of mind.

To give an example, consider a doctor meeting with a patient in the emergency room for the first time to identify how to treat them. Their first priority is figuring out what information they need to collect, then collecting it. They might check the patient’s records, ask other staff about the admission, ask the patient questions, check vital signs such as blood pressure, and order particular diagnostic tests. Doctors learn to pick up on subtle cues that can be telling of particular conditions, such as a patient’s speech patterns, body language, what they’ve brought with them to the hospital, and even their smell. In some cases, the absence (rather than presence) of certain cues is also important. At the same time, a doctor needs to discard irrelevant information, then put all the pieces together before they can treat the patient.

2: Orient

“Orientation isn’t just a state you’re in; it’s a process. You’re always orienting.” —John Boyd

The second stage of the OODA Loop, orient, is less intuitive than the other steps. However, it’s worth making the effort to understand it rather than skipping it. Boyd referred to it as the Schwerpunkt, German for “the main emphasis.”

To orient yourself is to recognize any barriers that might interfere with the other parts of the OODA Loop.

Orientation means connecting yourself with reality and seeing the world as it really is, as free as possible from the influence of cognitive biases and shortcuts. You can give yourself an edge over the competition by making sure you always orient before making a decision, instead of just jumping in.

Boyd maintained that properly orienting yourself can be enough to overcome an initial disadvantage, such as fewer resources or less information, to outsmart an opponent. He identified the following four main barriers that impede our view of objective information:

  1. Our cultural traditions – we don’t realize how much of what we consider universal behavior is actually culturally prescribed
  2. Our genetic heritage – we all have certain constraints
  3. Our ability to analyze and synthesize – if we haven’t practiced and developed our thinking skills, we tend to fall back on old habits
  4. The influx of new information – it is hard to make sense of observations when the situation keeps changing

Prior to Charlie Munger’s popularization of the concept of building a toolbox of mental models, Boyd advocated a similar approach for pilots to help them better navigate the orient stage of the OODA Loop. He recommended a process of “deductive destruction”: paying attention to your own assumptions and biases, then finding fundamental mental models to replace them.

Similar to using a decision journal, deductive destruction ensures you always learn from past mistakes and don’t keep on repeating them. In one talk, Boyd employed a brilliant metaphor for developing a latticework of mental models. He compared it to building a snowmobile, a vehicle comprising elements of several different devices, such as the caterpillar treads of a tank, skis, the outboard motor of a boat, and the handlebars of a bike.

Individually, each of these items isn’t enough to move you around. But combined they create a functional vehicle. As Boyd put it:

A loser is someone (individual or group) who cannot build snowmobiles when facing uncertainty and unpredictable change; whereas a winner is someone (individual or group) who can build snowmobiles, and employ them in an appropriate fashion, when facing uncertainty and unpredictable change.

To orient yourself, you have to build a metaphorical snowmobile by combining practical concepts from different disciplines. (For more on mental models, we literally wrote the book on them.) Although Boyd is regarded as a military strategist, he didn’t confine himself to any particular discipline. His theories encompass ideas drawn from various disciplines, including mathematical logic, biology, psychology, thermodynamics, game theory, anthropology, and physics. Boyd described his approach as a “scheme of pulling things apart (analysis) and putting them back together (synthesis) in new combinations to find how apparently unrelated ideas and actions can be related to one another.”

3: Decide

There are no surprises here. The previous two steps provide the groundwork you need to make an informed decision. If there are multiple options at hand, you need to use your observation and orientation to select one.

Boyd cautioned against first-conclusion bias, explaining that we cannot keep making the same decision again and again. This part of the loop needs to be flexible and open to Bayesian updating. In some of his notes, Boyd described this step as the hypothesis stage. The implication is that we should test the decisions we make at this point in the loop, spotting their flaws and feeding any issues into future observation stages.

4: Act

There’s a difference between making decisions and enacting decisions. Once you make up your mind, it’s time to take action.

By taking action, you test your decision out. The results will hopefully indicate whether it was a good one or not, providing information for when you cycle back to the first part of the OODA Loop and begin observing anew.
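As a programmer might sketch it, the loop is just iteration with feedback: each cycle acts on the world and feeds the result into the next round of observation. The toy below (a number-guessing search, entirely our illustration, not anything Boyd wrote) shows that shape:

```python
# A toy, runnable sketch of the OODA Loop as a feedback loop: each cycle
# narrows in on a hidden target, and what the last action revealed
# (the updated bounds) informs the next pass. Purely illustrative.

def run_ooda(target, low=0, high=100):
    """Repeatedly observe, orient, decide, and act until the target is found."""
    cycles = 0
    while True:
        # Observe: gather what previous cycles have revealed (the bounds).
        # Orient: narrow the picture to the most promising region.
        guess = (low + high) // 2
        # Decide: commit to testing this hypothesis.
        # Act: test it; the outcome feeds the next cycle's observation.
        if guess == target:
            return guess, cycles
        if guess < target:
            low = guess + 1
        else:
            high = guess - 1
        cycles += 1
```

The point of the sketch is the structure, not the search: no single pass is expected to be right, because the repetition is what does the work.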

***

Why the OODA Loop works

“The ability to operate at a faster tempo or rhythm than an adversary enables one to fold the adversary back inside himself so that he can neither appreciate nor keep up with what is going on. He will become disoriented and confused.” —John Boyd

We’ve identified three key benefits of using the OODA Loop.

1: Deliberate speed

As we’ve established, fighter pilots have to make many decisions in fast succession. They don’t have time to list pros and cons or to consider every available avenue. Once the OODA Loop becomes part of their mental toolboxes, they should be able to cycle through it in a matter of seconds.

Speed is a crucial element of military decision-making. Using the OODA Loop in everyday life, we probably have a little more time than a fighter pilot would. But Boyd emphasized the value of being decisive, taking initiative, and staying autonomous. These are universal assets and apply to many situations.

2: Comfort with uncertainty

There’s no such thing as total certainty. If you’re making a decision at all, it’s because something is uncertain. But uncertainty does not always have to equate to risk.

A fighter pilot is in a precarious situation, one in which there will be gaps in their knowledge. They cannot read the mind of the opponent and might have incomplete information about the weather conditions and surrounding environment. They can, however, take into account key factors such as the opponent’s type of airplane and what their maneuvers reveal about their intentions and level of training. If the opponent uses an unexpected strategy, is equipped with a new type of weapon or airplane, or behaves in an irrational way, the pilot must accept the accompanying uncertainty. However, Boyd stressed that uncertainty is irrelevant if we have the right filters in place.

If we can’t cope with uncertainty, we end up stuck in the observation stage. This sometimes happens when we know we need to make a decision, but we’re scared of getting it wrong. So we keep on reading books and articles, asking people for advice, listening to podcasts, and so on.

Acting under uncertainty is unavoidable. If we do have the right filters, we can factor uncertainty into the observation stage. We can leave a margin of error. We can recognize the elements that are within our control and those that are not.

In presentations, Boyd referred to three key principles to support his ideas: Gödel’s theorems, Heisenberg’s Uncertainty Principle, and the Second Law of Thermodynamics. Of course, we’re using these principles in a different way from their initial purpose and in a simplified, non-literal form.

Gödel’s theorems indicate that any mental model we have of reality will omit certain information and that Bayesian updating must be used to bring it back in line with reality. For fighter pilots, their understanding of what is going on during a battle will always have gaps. Recognizing this fundamental uncertainty gives it less power over us.
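As an illustration of what Bayesian updating looks like in its simplest form, here is a toy Beta-Bernoulli sketch (our example, not Boyd’s): a belief about whether a tactic works is revised with every new observation rather than fixed after the first.

```python
# A minimal Beta-Bernoulli example of Bayesian updating: a belief (here,
# the chance a chosen tactic works) is revised as each observation
# arrives and is never treated as final. Purely illustrative.

def update_belief(successes, failures, observation):
    """Return updated (successes, failures) counts after one observation."""
    if observation:
        return successes + 1, failures
    return successes, failures + 1

def estimate(successes, failures):
    """Posterior mean of a Beta(1 + successes, 1 + failures) belief."""
    return (1 + successes) / (2 + successes + failures)

s, f = 0, 0
print(estimate(s, f))            # 0.5 -- no evidence yet, so stay open
for obs in [True, True, False, True]:
    s, f = update_belief(s, f, obs)
print(estimate(s, f))            # the belief shifts toward the evidence
```

The estimate never reaches 0 or 1 on finite evidence, which is the mathematical version of admitting the model always has gaps.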

The second concept Boyd referred to is Heisenberg’s Uncertainty Principle. In its simplest form, this principle describes the limit of the precision with which pairs of physical properties can be known. We cannot know both the exact position and the exact velocity of a body at the same time: the more precisely we measure one, the less precisely we can know the other.

Boyd moved the concept of the Uncertainty Principle from particles to planes. If a pilot focuses too hard on where an enemy plane is, they will lose track of where it is going and vice versa. Trying harder to track the two variables will actually lead to more inaccuracy!

Finally, Boyd made use of the Second Law of Thermodynamics. In a closed system, entropy always increases and everything moves towards chaos. Energy spreads out and becomes disorganized.

Although Boyd’s notes do not specify the exact applications, his inference appears to be that a fighter pilot must be an open system or they will fail. They must draw “energy” (information) from outside themselves or the situation will become chaotic. They should also aim to cut their opponent off, forcing them to become a closed system.

3: Unpredictability

When you act fast enough, other people view you as unpredictable. They can’t figure out the logic behind your decisions.

Boyd recommended making unpredictable changes in speed and direction, writing, “We should operate at a faster tempo than our adversaries or inside our adversaries[’] time scales.…Such activity will make us appear ambiguous (non predictable) [and] thereby generate confusion and disorder among our adversaries.” He even helped design planes that were better equipped to make those unpredictable changes.

For the same reason that you can’t run the same play seventy times in a football game, rigid military strategies often become useless after a few uses, or even one iteration, as opponents learn to recognize and counter them. The OODA Loop can be endlessly used because it is a formless strategy, unconnected to any particular maneuvers.

We know that Boyd was influenced by Sun Tzu (he owned seven thoroughly annotated copies of The Art of War) and drew many ideas from the ancient strategist. Sun Tzu depicts war as a game of deception where the best strategy is that which an opponent cannot preempt.

***

Forty Second Boyd

“Let your plans be dark and impenetrable as night, and when you move, fall like a thunderbolt.” —Sun Tzu

Boyd was no armchair strategist. He developed his ideas through extensive experience as a fighter pilot. His nickname “Forty Second Boyd” speaks to his expertise: he had a standing bet that he could defeat any opponent in simulated air-to-air combat in under forty seconds, and he is said never to have lost it.

In a tribute written after Boyd’s death, General C.C. Krulak described him as “a towering intellect who made unsurpassed contributions to the American art of war. Indeed, he was one of the central architects of the reform of military thought.…From John Boyd we learned about competitive decision-making on the battlefield—compressing time, using time as an ally.”

Reflecting Robert Greene’s maxim that everything is material, Boyd spent his career observing people and organizations. How do they adapt to changeable environments in conflicts, business, and other situations?

Over time, he deduced that these situations are characterized by uncertainty. Dogmatic, rigid theories are unsuitable for chaotic situations. Rather than trying to rise through the military ranks, Boyd focused on using his position as a colonel to compose a theory of the universal logic of war.

Boyd was known to ask his mentees the poignant question, “Do you want to be someone, or do you want to do something?” In his own life, he certainly focused on the latter path and, as a result, left us ideas with tangible value. The OODA Loop is just one of many.

Boyd developed the OODA Loop with fighter pilots in mind, but like all good mental models, it works in other fields beyond combat. It’s used in intelligence agencies. It’s used by lawyers, doctors, businesspeople, politicians, law enforcement, marketers, athletes, coaches, and more.

If you have to work fast, you might want to learn a thing or two from fighter pilots. For them, a split-second of hesitation can cost them their lives. As anyone who has ever watched Top Gun knows, pilots have a lot of decisions and processes to juggle when they’re in dogfights (close-range aerial battles). Pilots move at high speeds and need to avoid enemies while tracking them and keeping a contextual knowledge of objectives, terrains, fuel, and other key variables.

And as any pilot who has been in one will tell you, dogfights are nasty. No one wants them to last longer than necessary because every second increases the risk of something going wrong. Pilots have to rely on their decision-making skills—they can’t just follow a schedule or to-do list to know what to do.

***

Applying the OODA Loop

“We can’t just look at our own personal experiences or use the same mental recipes over and over again; we’ve got to look at other disciplines and activities and relate or connect them to what we know from our experiences and the strategic world we live in.” —John Boyd

In sports, there is an adage that carries over to business quite well: “Speed kills.” If you are able to be nimble, assess the ever-changing environment, and adapt quickly, you’ll always carry the advantage over any opponents.

Start applying the OODA Loop to your day-to-day decisions and watch what happens. You’ll start to notice things that you would have been oblivious to before. Before jumping to your first conclusion, you’ll pause to consider your biases, take in additional information, and be more thoughtful of consequences.

As with anything you practice, if you do it right, the more you do it, the better you’ll get. You’ll start making better decisions and getting closer to your full potential. You’ll see more rapid progress. And as John Boyd would prescribe, you’ll start to do something in your life, and not just be somebody.

***

We hope you’ve enjoyed our three-week exploration of perspectives on decision making. We think there is value in juxtaposing different ideas to help us learn. Stay tuned for more topic-specific series in the future.

Avoiding Bad Decisions

Sometimes success is just about avoiding failure.

At FS, we help people make better decisions without needing to rely on getting lucky. One aspect of decision-making that’s rarely talked about is how to avoid making bad decisions.

Here are five of the biggest reasons we make bad decisions.

***

1. We’re unintentionally stupid

We like to think that we can rationally process information like a computer, but we can’t. Cognitive biases explain why we make bad decisions but rarely help us avoid them in the first place. It’s better to focus on these warning signs that signal something is about to go wrong.

Warning signs you’re about to unintentionally do something stupid:

  • You’re tired, emotional, in a rush, or distracted.
  • You’re operating in a group or working with an authority figure.

The rule: Never make important decisions when you’re tired, emotional, distracted, or in a rush.

2. We solve the wrong problem

The first person to state the problem rarely has the best insight into it. Once a problem is thrown out on the table, however, our type-A problem-solving nature kicks in, and we forget to first ask whether we’re solving the right problem.

Warning signs you’re solving the wrong problem:

  • You let someone else define the problem for you.
  • You’re far away from the problem.
  • You’re thinking about the problem at only one level or through a narrow lens.

The rule: Never let anyone define the problem for you.

3. We use incorrect or insufficient information

We like to believe that people tell us the truth. We like to believe the people we talk to understand what they are talking about. We like to believe that we have all the information.

Warning signs you have incorrect or insufficient information:

  • You’re speaking to someone who spoke to someone who spoke to someone. Someone will get in trouble when the truth comes out.
  • You’re reading about it in the news.

The rule: Seek out information from someone as close to the source as possible, because they’ve earned their knowledge and have an understanding that you don’t. When information is filtered (and it often is), first consider the incentives involved and then think of the proximity to earned knowledge.

4. We fail to learn

You know the person who sits beside you at work, the one with twenty years of experience who keeps making the same mistakes over and over? They don’t have twenty years of experience—they have one year of experience repeated twenty times. If you can’t learn, you can’t get better.

Most of us can observe and react accordingly. But to truly learn from our experiences, we must reflect on our reactions. Reflection has to be part of your process, not something you might do if you have time. Don’t use the excuse of being too busy or get too invested in protecting your ego. In short, we can’t learn from experience without reflection. Only reflection allows us to distill experience into something we can learn from to make better decisions in the future.

Warning signs you’re not learning:

  • You’re too busy to reflect.
  • You don’t keep track of your decisions.
  • You can’t calibrate your own decision-making.

The rule: Be less busy. Keep a learning journal. Reflect every day.

5. We focus on optics over outcomes

Our evolutionary programming conditions us to do what’s easy over what’s right. After all, it’s often easier to signal being virtuous than to actually be virtuous.

Warning signs you’re focused on optics:

  • You’re thinking about how you’ll defend your decision.
  • You’re knowingly choosing what’s defendable over what’s right.
  • You’d make a different decision if you owned the company.
  • You catch yourself saying this is what your boss would want.

The rule: Act as you would want an employee to act if you owned the company.

***

Avoiding bad decisions is just as important as making good ones. Knowing the warning signs and having a set of rules for your decision-making process limits the amount of luck you need to get good outcomes.

Explore Or Exploit? How To Choose New Opportunities

One big challenge we all face in life is knowing when to explore new opportunities, and when to double down on existing ones. Explore vs exploit algorithms – and poetry – teach us that it’s vital to consider how much time we have, how we can best avoid regrets, and what we can learn from failures.

***

“Had we but world enough, and time,
This coyness, Lady, were no crime.
We would sit down and think which way
To walk and pass our long love’s day . . .

Let us roll all our strength and all
Our sweetness up into one ball,
And tear our pleasures with rough strife
Through the iron gates of life:
Thus, though we cannot make our sun
Stand still, yet we will make him run.”
—Andrew Marvell, To His Coy Mistress

Of all the questions life demands we answer, “To explore or to exploit?” is one we have to confront almost every day. Do we keep trying new restaurants? Do we keep learning new ideas? Do we keep making new friends? Or do we enjoy what we’ve come to find and love?

There is no doubt that humans are great at exploring, as most generalist species are. Not content to stay in that cave, hunt that animal, or keep doing things the way our grandmother taught us, humans owe at least part of our success to our willingness to explore.

But when is what you’ve already explored enough? When can you finally settle down to enjoy the fruits of your exploration? When can you be content to exploit the knowledge you already have?

Turns out that there are algorithms for that.

In Algorithms to Live By, authors Brian Christian and Tom Griffiths devote an entire chapter to how computer algorithms deal with the explore/exploit conundrum and how you can apply those lessons to the same tension in your life.
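One of the simplest of those algorithms is epsilon-greedy: most of the time, exploit the best option found so far; occasionally, explore one at random. A minimal sketch, with restaurant payoffs invented purely for illustration (this is our generic version of the technique, not code from the book):

```python
import random

# Epsilon-greedy, one of the simplest explore/exploit algorithms: with
# probability epsilon, try a random option (explore); otherwise pick the
# best average payoff seen so far (exploit). Payoffs here are invented.

def epsilon_greedy(arms, rounds=1000, epsilon=0.1, seed=42):
    rng = random.Random(seed)
    totals = [0.0] * len(arms)
    counts = [0] * len(arms)
    for _ in range(rounds):
        if rng.random() < epsilon or 0 in counts:
            choice = rng.randrange(len(arms))           # explore
        else:
            averages = [t / c for t, c in zip(totals, counts)]
            choice = averages.index(max(averages))      # exploit
        reward = arms[choice]()                         # "eat at" that restaurant
        totals[choice] += reward
        counts[choice] += 1
    return counts

# Two hypothetical restaurants: a reliable 7/10 and an excellent 9/10.
visits = epsilon_greedy([lambda: 7, lambda: 9])
print(visits)   # the better option ends up visited far more often
```

Even this crude rule captures the tension the chapter describes: a little ongoing exploration guards against settling on the wrong favorite, while the bulk of the visits go to enjoying the best option found.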

***

How much time do you have?

One of the most important factors in determining whether to continue exploring or to exploit what you’ve got is time. Christian and Griffiths explain that “seizing a day and seizing a lifetime are two entirely different endeavors. . . . When balancing favorite experiences and new ones, nothing matters as much as the interval over which we plan to enjoy them.”

Time intervals can be a construct of your immediate circumstances, like the boundaries provided by a two-week vacation. For a lot of us, the last night in a lovely foreign place will see us eating at the best restaurant we have found so far. Time intervals can also be considered over the arc of your life in general. Children are consummate explorers, but as we grow up, the choice to exploit becomes more of a daily decision. How would your choices today be impacted if you knew you were going to live another five years? Twenty years? Forty years? Christian and Griffiths advise, “Explore when you will have time to use the resulting knowledge, exploit when you’re ready to cash in.”
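That advice can be sketched as a schedule in which the willingness to explore shrinks as the interval runs out, since exploration only pays off if there is time left to use what you learn. The linear schedule below is our illustrative assumption, not a formula from the book:

```python
# A sketch of interval-aware exploration: the fewer chances remain, the
# less it pays to try something new. The linear decay is an illustrative
# choice; the point is only that the probability falls with time left.

def explore_probability(remaining, total):
    """Chance of trying something new, given how much of the interval is left."""
    return max(0.0, remaining / total)

# A 14-night trip: explore freely early, cash in on the final nights.
schedule = [explore_probability(14 - night, 14) for night in range(14)]
print(round(schedule[0], 2), round(schedule[-1], 2))   # high early, low late
```

The last-night-at-the-favorite-restaurant pattern falls straight out of this: with no future left to benefit from new knowledge, exploitation is all that remains.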

“I have known days like that, of warm winds drowsing in the heat
of noon and all of summer spinning slowly on its reel,
days briefly lived, that leave long music in the mind
more sweet than truth: I play them and rewind.”
—Russell Hoban, Summer Recorded

Sometimes we are too quick to stop exploring. We have these amazing days and magical experiences, and we want to keep repeating them forever. However, changes in ourselves and the world around us are inevitable, and so committing to a path of exploitation too early leaves us unable to adapt. As much as it can be hard to walk away from that perfect day, Christian and Griffiths explain that “exploration in itself has value, since trying new things increases our chances of finding the best. So taking the future into account, rather than focusing just on the present, drives us toward novelty.”

“Like as the waves make towards the pebbled shore,
So do our minutes hasten to their end;
Each changing place with that which goes before,
In sequent toil all forwards do contend.”
—William Shakespeare, Sonnet 60

There is no doubt that for many of us time is our most precious resource. We never seem to have enough, and we want to maximize the value we get from how we choose to use it. So when deciding between whether to enjoy what you have or search for something better, adding time to your decision-making process can help point the way.

***

Minimizing the pain of regret

The threat of regret looms over many explore/exploit considerations. We can regret both not searching for something better and not taking the time to enjoy what we already have. The problem with regret is that it doesn’t arrive in advance of a poor decision. Second-order thinking can sometimes serve as a preventative tool, but regret usually surfaces only when we look back on a decision. Christian and Griffiths define regret as “the result of comparing what we actually did with what would have been best in hindsight.”
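That definition translates directly into a small computation: the gap between what we actually earned and what the single best choice in hindsight would have earned. The payoff numbers below are invented for illustration.

```python
# Regret as Christian and Griffiths define it: compare what we actually
# did with what the best single choice in hindsight would have yielded.
# The payoff table is made up for illustration.

def regret(choices, payoff_table):
    """choices[t] is the option picked in round t; payoff_table[option][t] its payoff."""
    actual = sum(payoff_table[c][t] for t, c in enumerate(choices))
    best_in_hindsight = max(sum(row[:len(choices)]) for row in payoff_table)
    return best_in_hindsight - actual

payoffs = [[1, 1, 1],    # the safe option: steady but modest
           [0, 3, 3]]    # the new option: a rough start, then better
print(regret([0, 0, 0], payoffs))   # never explored: regret is 6 - 3 = 3
```

Note that the comparison is only available after the fact, which is exactly why regret can’t steer a decision in advance; algorithms aim to keep its long-run total small rather than avoid it outright.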

“Does the road wind uphill all the way?
Yes, to the very end.
Will the day’s journey take the whole long day?
From morn to night, my friend.
Shall I find comfort, travel-sore and weak?
Of labour you shall find the sum.
Will there be beds for me and all who seek?
Yea, beds for all who come.”
—Christina Rossetti, Up-Hill

If we want to minimize regret, especially in exploration, we can try to learn from those who have come before. As we choose to wander forth into new territory, however, it’s natural to wonder if we’ll regret our decision to try something new. According to Christian and Griffiths, the mathematics that underlie explore/exploit algorithms show that “you should assume the best about [new people and new things], in the absence of evidence to the contrary. In the long run, optimism is the best prevention for regret.” Why? Because by being optimistic about the possibilities that are out there, you’ll explore enough that the one thing you won’t regret is missed opportunity.
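In the explore/exploit literature, that optimism is made concrete by the UCB1 algorithm, which adds a confidence bonus to each option’s observed average so that untried or little-tried options look attractive until the evidence says otherwise. What follows is a generic sketch of that algorithm, not code from the book:

```python
import math

# "Assume the best in the absence of evidence": UCB1 picks the option
# with the highest optimistic estimate -- its average payoff plus a bonus
# that grows for options we know little about. A generic sketch.

def ucb1_choice(totals, counts, t):
    """Pick the option with the highest optimistic estimate in round t."""
    for i, c in enumerate(counts):
        if c == 0:
            return i   # never tried: be maximally optimistic about it
    scores = [totals[i] / counts[i] + math.sqrt(2 * math.log(t) / counts[i])
              for i in range(len(counts))]
    return scores.index(max(scores))

print(ucb1_choice([0.0, 8.0], [0, 10], 11))   # 0: the untried option wins
```

The bonus term is what mathematically guarantees you keep sampling the unfamiliar, so the one regret you avoid is missed opportunity.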

(This is similar to one of the most effective strategies in game theory: tit for tat. Start out by being nice, then reciprocate whatever behavior you receive. It often works better paired with the occasional bout of forgiveness.)
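A minimal sketch of that strategy as a single move rule (the 5% forgiveness rate is our illustrative choice, not a canonical value):

```python
import random

# Tit for tat with occasional forgiveness: cooperate first, then mirror
# the opponent's last move, except that a defection is sometimes
# forgiven. The forgiveness probability is illustrative.

def tit_for_tat(opponent_last_move, forgiveness=0.05, rng=random.random):
    if opponent_last_move is None:                 # first round: start out nice
        return "cooperate"
    if opponent_last_move == "defect" and rng() < forgiveness:
        return "cooperate"                         # occasional forgiveness
    return opponent_last_move                      # otherwise, mirror

print(tit_for_tat(None))                         # cooperate
print(tit_for_tat("defect", forgiveness=0.0))    # defect
```

The forgiveness clause keeps two tit-for-tat players from locking into endless mutual retaliation after a single defection.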

“Tell me, tell me, smiling child,
What the past is like to thee?
‘An Autumn evening soft and mild
With a wind that sighs mournfully.’

Tell me, what is the present hour?
‘A green and flowery spray
Where a young bird sits gathering its power
To mount and fly away.’

And what is the future, happy one?
‘A sea beneath a cloudless sun;
A mighty, glorious, dazzling sea
Stretching into infinity.’”
—Emily Bronte, Past, Present, Future

***

The accumulation of knowledge

Christian and Griffiths write that “it’s rare that we make an isolated decision, where the outcome doesn’t provide us with any information that we’ll use to make other decisions in the future.” Not all of our explorations are going to lead us to something better, but many of them are. Not all of our exploitations are going to be satisfying, but with enough exploration behind us, many of them will. Failures are, after all, just information we can use to make better explore or exploit decisions in the future.

“You know—at least you ought to know,
For I have often told you so—
That children are never allowed
To leave their nurses in a crowd.
Now this was Jim’s especial foible,
He ran away when he was able,
And on this inauspicious day
He slipped his hand and ran away!
He hadn’t gone a yard when—Bang!
With open jaws, a lion sprang,
And hungrily began to eat
The boy: beginning at his feet.”
—Hilaire Belloc, Jim Who Ran Away from His Nurse, and Was Eaten by a Lion

Most importantly, we shouldn’t let our early exploration mishaps prevent us from continuing to push our boundaries as we grow up. Exploration is necessary in order to exploit and enjoy the knowledge hard won along the way.

Mental Models for Career Changes

Career changes are some of the biggest moves we will ever make, but they don’t have to be daunting. By using mental models to make decisions, we can determine where we want to go and how to get there. The result is a change that aligns with the person we are, as well as the person we want to be.

We’ve all been there: you’re at a job, and you know it’s not for you anymore. You come in drained, you’re not excited on a Monday morning, and you feel like you could be using your time so much better. It’s not the people, and it’s not the organization. It’s the work. It’s become boring, unfulfilling, or redundant, and you know you want to do something different. But what?

Just deciding to change careers doesn’t get you very far because there are more areas to work in than you know about. A big change often involves some retraining. A career shift will impact your personal life. At the end of it all, you want to be happier but know there are no guarantees. How do you find a clear path forward?

No matter how ready you think you are to make a move, career changes are daunting. The stress of leaving what you’re comfortable with to venture into foreign territory stops many people from taking the first step toward something new.

It doesn’t have to be this way.

Using mental models can help you clarify the direction you want to go and plan for how to get there. They are tools that will give you more control over your career and more confidence in your decisions. When you do the work up front by examining your situation through the lens of a few mental models, you set yourself up for fewer regrets and more satisfaction down the road.

***

Get in touch with yourself

Before you can decide which change to make, you need to get in touch with yourself. No change will be the right one if it doesn’t align with what you want to get out of life.

First, do you know where you want to go? Are you moving with direction or just moving? As a mental model, velocity reminds us there is a difference between speed and direction. It’s easy to move fast without getting anywhere. We can stay busy all day without achieving our goals. Without considering our velocity, we run a huge risk of getting sidetracked by things that make us move faster (more money, a title on a business card) without that movement actually leading us where we want to end up.

As the old saying goes, we want to run to something, not from something. When you start articulating your desired direction, you give yourself clear purpose in your career. It will be easier to play the long game because you know that everything you are doing is leading somewhere you want to be.

When it comes to changing careers, there are a lot of options. Using the mental model of velocity will help you focus on and identify the best opportunities.

Once you know where you want to end up, it’s often useful to work backward to where you are now. This is known as inversion. Start at the end and carefully consider the events that get you there in reverse order.

For example, it could be something as simple as waking up happy and excited to work every day. What needs to be true in order for that to be a reality? Are you working from home, having a quiet cup of coffee as you prepare to do some creative work? Are you working on projects aligned with your values? Are you contributing to making the world a better place? Are you in an intense, collaborative team environment?

Doing an inversion exercise helps you identify the elements needed for you to achieve success. Once you identify your requirements, you can use that list to evaluate opportunities that come up.

Inversion will help you recognize critical factors, like finances or the support of your family, that will be necessary to get to where you want to go. If your dream direction requires you to learn a new skill or work at a junior level while you ramp up on the knowledge you’ll need, you might need to live off some savings in the short term. Inversion, combined with velocity, will help you create the foundation you need now to take action when the right time comes.

Finally, the last step before you start evaluating the career environment is taking stock of the skills you already have. Why do you need to do this? So you know what you can repurpose. Here, you’re using the concept of exaptation, which is part of the broader adaptation model in biology. Exaptation refers to traits that evolved for one purpose and then, through natural selection, were used for completely unrelated capabilities. For instance, feathers probably evolved for insulation. It was only much later that they turned out to be useful for flying.

History is littered with examples of technologies or tools invented for one purpose that later became the foundation for something completely different. Did you know that Play-Doh was originally created to clean coal soot off walls? And bubble wrap was originally envisioned as material for shower curtains.

Using this model is partly about getting out of the “functional fixedness” mindset. You want to look at your skills, talents, and knowledge and ask of each one: what else could this be used for?

Too often we fail to realize just how versatile the experience we’ve built up over the years is. We’re great at using forks to eat, but they can also be used to brush hair, dig in a garden, and pin things to walls. Being great at presenting the monthly status update doesn’t just mean you’re good at presenting monthly status updates. It means you can articulate yourself well, parse information for a diverse audience, and build networks to get the right information. Now, what else can those skills be used for?

***

Evaluate the environment

Looking at different careers, we’re usually in a situation where the “map is not the territory.” It’s hard to know how great (or terrible) a job is until you actually do it. We often have two types of maps for the careers we wish we had: maps of the highlights, success stories, and opinions of people who love the work and maps based on how much we love the field or discipline ourselves.

The territory of the day-to-day work of these careers, however, is very different from what those two maps tell us.

In order to determine if a particular career will work for us, we need better maps. For example, the reality of being an actor isn’t just the movies and programs you see them in. It’s audition after audition, with more rejections than roles. It’s intense competition and job insecurity. Being a research scientist at a university isn’t just immersing yourself in a subject you love. It’s grant applications and teaching and navigating the bureaucracy of academia.

In order to build a more comprehensive map of your dream job, do your research on as large a sample size as possible. Talk to people doing the job you want. Talk to people who work in the organization. Talk to the ones who enjoy it. Talk to the ones who quit. Try to get an accurate picture of what the day-to-day is like.

Very few jobs are one-dimensional. They involve things like administrative tasks, networking, project management, and accountability. How much of your day will be spent doing paperwork or updating your coworkers? How much of a connection do you need to maintain with people outside the organization? How many people will you be dependent on? What are they like? And who will you be working for?

It’s not a good idea to become a writer just because you want to tell stories, open a restaurant just because you like to cook, or become a landscape designer just because you enjoy being outside. Those motivations are good places to start—because it’s equally terrible to become a lawyer just because your parents wanted you to. But you can’t stop with what you like. There isn’t a job in the world that’s pleasurable and fulfilling 100% of the time.

You give yourself a much higher chance of being satisfied with your career change if you take the time to learn as much as you can about the territory beforehand.

***

Elements of planning

You know which direction you’re heading in, and you’ve identified a great new career possibility. Now what?

Planning for change is a crucial component of switching careers. Two models, global and local maxima and activation energy, can help us identify what we need to plan.

Global and local maxima refer to the peaks of a mathematical function. On a graph, it’s a wavy curve with peaks and valleys. The highest peak in a section is a local maximum. The highest peak across the entire graph is the global maximum. Activation energy comes from chemistry, and is the minimum amount of energy needed to start a chemical reaction.
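To make the terms concrete, here is a minimal sketch with entirely made-up numbers that finds the local and global maxima of a sampled curve:

```python
# Find local and global maxima of a sampled "career satisfaction" curve.
# The numbers are invented purely to illustrate the model.
values = [3, 5, 4, 2, 6, 9, 7]  # satisfaction at successive positions

# A local maximum is higher than its immediate neighbors.
local_maxima = [
    (i, v) for i, v in enumerate(values)
    if (i == 0 or values[i - 1] < v) and (i == len(values) - 1 or values[i + 1] < v)
]

global_maximum = max(values)

print(local_maxima)    # [(1, 5), (5, 9)]
print(global_maximum)  # 9
```

Notice that moving from the local maximum at index 1 to the global maximum at index 5 means passing through the valley at index 3: exactly the step down that the model says may be necessary in order to go up.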

One of the things global and local maxima teaches us is that sometimes you have to go down a hill in order to climb up a new one. To move from a local maximum to a higher peak you have to go through a local minimum, a valley. Too often we just want to go higher right away, or at the very least we want to make a lateral move. We perceive going down as taking a step backward.

A common problem is when we tie our self-worth to our salary and therefore reject any opportunities that won’t pay us as much as we’re currently making. The same goes for job titles; no one wants to be a junior anything in their mid-forties. But it’s impossible to get to the next peak if we won’t walk through the valley.

If you look at your career change through the lens of global and local maxima, you will see that steps down can also be steps forward.

Activation energy is another great model to use in the planning phase because it requires you to think about the real effort required for sustained change. You need to plan not just for making a change but also for seeing it through until the new thing has time to take hold.

Do you have enough in the bank to support yourself if you need to retrain or take a pay cut? Do you have the emotional support to help you through the challenges of taking on a brand-new career?

A fire doesn’t start with one match held to a giant log; you need kindling in between. What do you need to keep that reaction going so the flame from the match leads to the log catching fire? The same kind of thinking needs to inform your planning for what you need between now and your desired result. After you’ve taken the first step, what will you need to keep you moving in the direction you want to go?

***

After you’ve done all the work

After getting in touch with yourself, doing all your research, identifying possible paths, and planning for what you need to do to walk them to the end, it can still be hard to make a decision. You’ve uncovered so many nuances and encountered so many ideas that you feel overwhelmed. The reality is, when it comes to career change, there often is no perfect decision. You likely have more than one option, and whatever you choose, there’s going to be a lot of work involved.

One final model you can use is probabilistic thinking. In this particular situation, it can be helpful to use a Bayesian casino.

A Bayesian casino is a thought experiment where you imagine walking up to a casino game, like roulette, and quantifying how much you would bet on any particular outcome.

Let’s say when investigating your career change, you’ve narrowed it down to two options. Which one would you bet on for being the better choice one year later? And how much would you part with? If you’d bet ten dollars on black, then you probably need to take a fresh look at the research you’ve done. Maybe go talk to more people, or broaden your thinking. If you’re willing to put down thousands of dollars on red, that’s very likely the right decision for you.

It’s important in this thought experiment to fully imagine yourself making the bet. Imagine the money in your bank account. Imagine withdrawing it and physically putting it down on the table. How much you’re willing to part with regarding a particular career choice says a lot about how good that choice is likely to be for you.

Probabilistic thinking isn’t a predictor of the future. With any big career move, there are inevitably a lot of unknowns. There are no guarantees that any choice is going to be the right one. The Bayesian casino just helps you quantify your thinking based on the knowledge you have at this moment in time.

As new information comes in, return to the casino and see if your bets change.
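Returning to the casino as new information comes in is, at its core, Bayesian updating. A minimal sketch of the arithmetic, with entirely hypothetical numbers for the prior and likelihoods:

```python
# Update confidence in a career choice as new information arrives (Bayes' rule).
# All numbers are made up, purely to illustrate the updating process.

prior = 0.60  # initial confidence that option A beats option B

# New evidence: an informational interview about option A goes well.
# Likelihoods: how probable is that evidence if A is (or isn't) the better choice?
p_evidence_if_right = 0.80
p_evidence_if_wrong = 0.40

posterior = (p_evidence_if_right * prior) / (
    p_evidence_if_right * prior + p_evidence_if_wrong * (1 - prior)
)

print(round(posterior, 2))  # 0.75
```

A modest prior becomes a stronger one, so the imagined bet grows; evidence that pointed the other way would shrink it instead.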

***

Conclusion

Career changes are some of the biggest moves we will ever make, but they don’t have to be daunting. Using mental models helps us find both the direction we want to go and a path we can take to get there. The result is a change that aligns with the person we are, as well as the person we want to be.

A Primer on Algorithms and Bias

The growing influence of algorithms on our lives means we owe it to ourselves to better understand what they are and how they work. Understanding how the data we use to inform algorithms influences the results they give can help us avoid biases and make better decisions.

***

Algorithms are everywhere: driving our cars, designing our social media feeds, dictating which mixer we end up buying on Amazon, diagnosing diseases, and much more.

Two recent books explore algorithms and the data behind them. In Hello World: Being Human in the Age of Algorithms, mathematician Hannah Fry shows us the potential and the limitations of algorithms. And Invisible Women: Data Bias in a World Designed for Men by writer, broadcaster, and feminist activist Caroline Criado Perez demonstrates how we need to be much more conscientious of the quality of the data we feed into them.

Humans or algorithms?

First, what is an algorithm? Explanations of algorithms can be complex. Fry explains that at their core, they are defined as step-by-step procedures for solving a problem or achieving a particular end. We tend to use the term to refer to mathematical operations that crunch data to make decisions.

When it comes to decision-making, we don’t necessarily have to choose between doing it ourselves and relying wholly on algorithms. The best outcome may be a thoughtful combination of the two.

We all know that in certain contexts, humans are not the best decision-makers. For example, when we are tired, or when we already have a desired outcome in mind, we may ignore relevant information. In Thinking, Fast and Slow, Daniel Kahneman gave multiple examples from his research with Amos Tversky that demonstrated we are heavily influenced by cognitive biases such as availability and anchoring when making certain types of decisions. It’s natural, then, that we would want to employ algorithms that aren’t vulnerable to the same tendencies. In fact, their main appeal for use in decision-making is that they can override our irrationalities.

Algorithms, however, aren’t without their flaws. One of the obvious ones is that because algorithms are written by humans, we often code our biases right into them. Criado Perez offers many examples of algorithmic bias.

For example, an online platform designed to help companies find computer programmers looks through activity such as sharing and developing code in online communities, as well as visiting Japanese manga (comics) sites. People who visited certain sites frequently received higher scores, thus making them more visible to recruiters.
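A toy sketch can show how a seemingly neutral scoring rule encodes this kind of bias. The features and weights below are invented for illustration and are not taken from any real recruiting platform:

```python
# A hypothetical candidate-scoring rule, loosely modeled on the example above.
# Features and weights are made up for illustration only.

def score(candidate: dict) -> float:
    return (
        2.0 * candidate["code_shared"]          # contributions to open code
        + 1.5 * candidate["manga_site_visits"]  # proxy for "passion" -- and for free time
    )

# Two equally skilled candidates, one with less spare leisure time.
a = {"code_shared": 10, "manga_site_visits": 8}
b = {"code_shared": 10, "manga_site_visits": 0}

print(score(a))  # 32.0 -- surfaced to recruiters first
print(score(b))  # 20.0 -- penalized by a feature unrelated to skill
```

Nothing in the rule mentions gender, yet the second feature quietly rewards whoever has the leisure time to browse, which is exactly the mechanism O’Neil describes.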

However, Criado Perez presents the analysis of this recruiting algorithm by Cathy O’Neil, scientist and author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who points out that “women, who do 75% of the world’s unpaid care work, may not have the spare leisure time to spend hours chatting about manga online . . . and if, like most of techdom, that manga site is dominated by males and has a sexist tone, a good number of women in the industry will probably avoid it.”

Criado Perez postulates that the authors of the recruiting algorithm didn’t intend to encode a bias that discriminates against women. But, she says, “if you aren’t aware of how those biases operate, if you aren’t collecting data and taking a little time to produce evidence-based processes, you will continue to blindly perpetuate old injustices.”

Fry also covers algorithmic bias and asserts that “wherever you look, in whatever sphere you examine, if you delve deep enough into any system at all, you’ll find some kind of bias.” We aren’t perfect—and we shouldn’t expect our algorithms to be perfect, either.

In order to have a conversation about the value of an algorithm versus a human in any decision-making context, we need to understand, as Fry explains, that “algorithms require a clear, unambiguous idea of exactly what we want them to achieve and a solid understanding of the human failings they are replacing.”

Garbage in, garbage out

No algorithm is going to be successful if the data it uses is junk. And there’s a lot of junk data in the world. Far from being a new problem, Criado Perez argues that “most of recorded human history is one big data gap.” And that has a serious negative impact on the value we are getting from our algorithms.

Criado Perez explains the situation this way: We live in “a world [that is] increasingly reliant on and in thrall to data. Big data. Which in turn is panned for Big Truths by Big Algorithms, using Big Computers. But when your data is corrupted by big silences, the truths you get are half-truths, at best.”

A common human bias is one regarding the universality of our own experience. We tend to assume that what is true for us is generally true across the population. We have a hard enough time considering how things may be different for our neighbors, let alone for other genders or races. It becomes a serious problem when we gather data about one subset of the population and mistakenly assume that it represents all of the population.

For example, Criado Perez examines the data gap in relation to incorrect information being used to inform decisions about safety and women’s bodies. From personal protective equipment like bulletproof vests that don’t fit properly and thus increase the chances of the women wearing them getting killed to levels of exposure to toxins that are unsafe for women’s bodies, she makes the case that without representative data, we can’t get good outputs from our algorithms. She writes that “we continue to rely on data from studies done on men as if they apply to women. Specifically, Caucasian men aged twenty-five to thirty, who weigh 70 kg. This is ‘Reference Man’ and his superpower is being able to represent humanity as a whole. Of course, he does not.” Her book covers a wide variety of disciplines and situations where the gender gap in data leads to increased negative outcomes for women.

The limits of what we can do

Although there is a lot we can do better when it comes to designing algorithms and collecting the data sets that feed them, it’s also important to consider their limits.

We need to accept that algorithms can’t solve all problems, and there are limits to their functionality. In Hello World, Fry devotes a chapter to the use of algorithms in justice, specifically algorithms designed to provide judges with information about the likelihood of a defendant committing further crimes. Our first impulse is to say, “Let’s not rely on bias here. Let’s not have someone’s skin color or gender be a key factor for the algorithm.” After all, we can employ that kind of bias just fine ourselves. But simply writing bias out of an algorithm is not as easy as wishing it so. Fry explains that “unless the fraction of people who commit crimes is the same in every group of defendants, it is mathematically impossible to create a test which is equally accurate at predicting across the board and makes false positive and false negative mistakes at the same rate for every group of defendants.”
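Fry’s point can be shown with a bit of arithmetic. The sketch below uses hypothetical numbers, not figures from the book: a test with identical error rates is applied to two groups with different base rates, and the reliability of a positive flag diverges, which is why equal error rates and equal accuracy cannot both hold:

```python
# Why equal error rates and equal predictive accuracy can't coexist when
# base rates differ between groups. All numbers are hypothetical.

def positive_predictive_value(base_rate: float, tpr: float, fpr: float) -> float:
    """Fraction of flagged people who truly reoffend, given the group's base rate."""
    true_pos = tpr * base_rate
    false_pos = fpr * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

# The same test for both groups: catches 80% of reoffenders,
# wrongly flags 20% of everyone else.
tpr, fpr = 0.8, 0.2

ppv_group_a = positive_predictive_value(base_rate=0.5, tpr=tpr, fpr=fpr)
ppv_group_b = positive_predictive_value(base_rate=0.1, tpr=tpr, fpr=fpr)

print(round(ppv_group_a, 2))  # 0.8  -- a flag is right 80% of the time
print(round(ppv_group_b, 2))  # 0.31 -- the same flag is right only ~31% of the time
```

Equalizing the flag’s reliability across the two groups would force their false positive or false negative rates apart, so some notion of fairness has to give.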

Fry comes back to such limits frequently throughout her book, exploring them in various disciplines. She demonstrates to the reader that “there are boundaries to the reach of algorithms. Limits to what can be quantified.” Perhaps a better understanding of those limits is needed to inform our discussions of where we want to use algorithms.

There are, however, other limits that we can do something about. Both authors make the case for more education about algorithms and their input data. Lack of understanding shouldn’t hold us back. In particular, algorithms that have a significant impact on our lives need to be open to scrutiny and analysis. If an algorithm is going to put you in jail or impact your ability to get a mortgage, then you ought to be able to have access to it.

Most algorithm writers and the companies they work for wave the “proprietary” flag and refuse to open themselves up to public scrutiny. Many algorithms are a black box—we don’t actually know how they reach the conclusions they do. But Fry says that shouldn’t deter us. Pursuing laws (such as the data access and protection rights being instituted in the European Union) and structures (such as an algorithm-evaluating body playing a role similar to the one the U.S. Food and Drug Administration plays in evaluating whether pharmaceuticals can be made available to the U.S. market) will help us decide as a society what we want and need our algorithms to do.

Where do we go from here?

Algorithms aren’t going away, so it’s best to acquire the knowledge needed to figure out how they can help us create the world we want.

Fry suggests that one way to approach algorithms is to “imagine that we designed them to support humans in their decisions, rather than instruct them.” She envisions a world where “the algorithm and the human work together in partnership, exploiting each other’s strengths and embracing each other’s flaws.”

Part of getting to a world where algorithms provide great benefit is to remember how diverse our world really is and make sure we get data that reflects the realities of that diversity. We can either actively change the algorithm or change the data set. And if we do the latter, we need to make sure we aren’t feeding our algorithms data that, for example, excludes half the population. As Criado Perez writes, “when we exclude half of humanity from the production of knowledge, we lose out on potentially transformative insights.”

Given how complex the world of algorithms is, we need all the amazing insights we can get. Algorithms themselves perhaps offer the best hope, because they have the inherent flexibility to improve as we do.

Fry gives this explanation: “There’s nothing inherent in [these] algorithms that means they have to repeat the biases of the past. It all comes down to the data you give them. We can choose to be ‘crass empiricists’ (as Richard Berk put it) and follow the numbers that are already there, or we can decide that the status quo is unfair and tweak the numbers accordingly.”

We can get excited about the possibilities that algorithms offer us and use them to create a world that is better for everyone.

Why We Focus on Trivial Things: The Bikeshed Effect

Bikeshedding is a metaphor to illustrate the strange tendency we have to spend excessive time on trivial matters, often glossing over important ones. Here’s why we do it, and how to stop.

***

How can we stop wasting time on unimportant details? From meetings at work that drag on forever without achieving anything to weeks-long email chains that don’t solve the problem at hand, we seem to spend an inordinate amount of time on the inconsequential. Then, when an important decision needs to be made, we hardly have any time to devote to it.

To answer this question, we first have to recognize why we get bogged down in the trivial. Then we must look at strategies for changing our dynamics towards generating both useful input and time to consider it.

The Law of Triviality

You’ve likely heard of Parkinson’s Law, which states that tasks expand to fill the amount of time allocated to them. But you might not have heard of the lesser-known Parkinson’s Law of Triviality, also coined by British naval historian and author Cyril Northcote Parkinson in the 1950s.

The Law of Triviality states that the amount of time spent discussing an issue in an organization is inversely correlated to its actual importance in the scheme of things. Major, complex issues get the least discussion while simple, minor ones get the most discussion.

Parkinson’s Law of Triviality is also known as “bike-shedding,” after the story Parkinson uses to illustrate it. He asks readers to imagine a financial committee meeting to discuss a three-point agenda. The points are as follows:

  1. A proposal for a £10 million nuclear power plant
  2. A proposal for a £350 bike shed
  3. A proposal for a £21 annual coffee budget

What happens? The committee ends up running through the nuclear power plant proposal in little time. It’s too advanced for anyone to really dig into the details, and most of the members don’t know much about the topic in the first place. One member who does know is unsure how to explain it to the others. Another member proposes a redesign, but it seems like such a huge task that the rest of the committee declines to consider it.

The discussion soon moves to the bike shed. Here, the committee members feel much more comfortable voicing their opinions. They all know what a bike shed is and what it looks like. Several members begin an animated debate over the best possible material for the roof, weighing out options that might enable modest savings. They discuss the bike shed for far longer than the power plant.

At last, the committee moves on to item three: the coffee budget. Suddenly, everyone’s an expert. They all know about coffee and have a strong sense of its cost and value. Before anyone realizes what is happening, they spend longer discussing the £21 coffee budget than the power plant and the bike shed combined! In the end, the committee runs out of time and decides to meet again to complete their analysis. Everyone walks away feeling satisfied, having contributed to the conversation.

Why this happens

Bike-shedding happens because the simpler a topic is, the more people will have an opinion on it and thus more to say about it. When something is outside of our circle of competence, like a nuclear power plant, we don’t even try to articulate an opinion.

But when something is just about comprehensible to us, even if we don’t have anything of genuine value to add, we feel compelled to say something, lest we look stupid. What idiot doesn’t have anything to say about a bike shed? Everyone wants to show that they know about the topic at hand and have something to contribute.

With any issue, we shouldn’t be according equal importance to every opinion anyone adds. We should emphasize the inputs from those who have done the work to have an opinion. And when we decide to contribute, we should be putting our energy into the areas where we have something valuable to add that will improve the outcome of the decision.

Strategies for avoiding bike-shedding

The main thing you can do to avoid bike-shedding is for your meeting to have a clear purpose. In The Art of Gathering: How We Meet and Why It Matters, Priya Parker, who has decades of experience designing high-stakes gatherings, says that any successful gathering (including a business meeting) needs to have a focused and particular purpose. “Specificity,” she says, “is a crucial ingredient.”

Why is having a clear purpose so critical? Because you use it as the lens to filter all other decisions about your meeting, including who to have in the room.

With that in mind, we can see that it’s probably not a great idea to discuss building a nuclear power plant and a bike shed in the same meeting. There’s not enough specificity there.

The key is to recognize that not all the available input on an issue needs to be considered. The most informed opinions are the most relevant. This is one reason why big meetings with lots of people present, most of whom don’t need to be there, are such a waste of time in organizations. Everyone wants to participate, but not everyone has anything meaningful to contribute.

When it comes to choosing your list of invitees, Parker writes, “if the purpose of your meeting is to make a decision, you may want to consider having fewer cooks in the kitchen.” If you don’t want bike-shedding to occur, avoid inviting contributions from those who are unlikely to have relevant knowledge and experience. Getting the result you want—a thoughtful, educated discussion about that power plant—depends on having the right people in the room.

It also helps to have a designated individual in charge of making the final judgment. When we make decisions by committee with no one in charge, reaching a consensus can be almost impossible. The discussion drags on and on. The individual can decide in advance how much importance to accord to the issue (for instance, by estimating how much its success or failure could help or harm the company’s bottom line). They can set a time limit for the discussion to create urgency. And they can end the meeting by verifying that it has indeed achieved its purpose.

Any issue that invites a lot of discussion from different people might not be the most important one at hand. Avoid descending into unproductive triviality by having clear goals for your meeting and getting the best people to the table to have a productive, constructive discussion.