
The Pygmalion Effect: Proving Them Right

The Pygmalion Effect is a powerful secret weapon. Without even realizing it, we can nudge others towards success. In this article, discover how expectations can influence performance for better or worse.

How Expectations Influence Performance

Many people believe that their pets or children are of unusual intelligence or can understand everything they say. Some people have stories of abnormal feats. In the late 19th century, one man made such a claim about his horse, and he appeared to have the evidence to back it up. William Von Osten was a teacher and horse trainer. He believed that animals could learn to read or count. Von Osten’s initial attempts with dogs and a bear were unsuccessful, but when he began working with an unusual horse, he changed our understanding of psychology. Known as Clever Hans, the animal could answer questions with 90% accuracy by tapping his hoof. He could add, subtract, multiply, divide, and tell the time and the date.

Clever Hans could also read and understand questions written or asked in German. Crowds flocked to see the horse, and the scientific community soon grew interested. Researchers studied the horse, looking for signs of trickery. Yet they found none. The horse could answer questions asked by anyone, even if Von Osten was absent. This indicated that no signaling was at play. For a while, the world believed the horse was truly clever.

Then psychologist Oskar Pfungst turned his attention to Clever Hans. Assisted by a team of researchers, he uncovered two anomalies. When blinkered or behind a screen, the horse could not answer questions. Likewise, he could respond only if the questioner knew the answer. From these observations, Pfungst deduced that Clever Hans was not making any mental calculations. Nor did he understand numbers or language in the human sense. Although Von Osten had intended no trickery, the act was false.

Instead, Clever Hans had learned to detect subtle, yet consistent nonverbal cues. When someone asked a question, Clever Hans responded to their body language with a degree of accuracy many poker players would envy. For example, when someone asked Clever Hans to make a calculation, he would begin tapping his hoof. Once he reached the correct answer, the questioner would show involuntary signs. Pfungst found that many people tilted their head at this point. Clever Hans would recognize this behavior and stop. When blinkered or when the questioner did not know the answer, the horse didn’t have a clue. When he couldn’t see the cues, he had no answer.

The Pygmalion Effect

Von Osten died in 1909 and Clever Hans disappeared from record. But his legacy lives on in a particular branch of psychology.

The case of Clever Hans is of less interest than the research it went on to provoke. Psychologists working in the following decades began to study how the expectations of others affect us. If a questioner’s expectation that Clever Hans knew the answer was enough to produce that answer, could the same thing occur elsewhere?

Could we be, at times, responding to subtle cues? Decades of research have provided consistent, robust evidence that the answer is yes. It comes down to the concepts of the self-fulfilling prophecy and the Pygmalion effect.

The Pygmalion effect is a psychological phenomenon wherein high expectations lead to improved performance in a given area. Its name comes from the story of Pygmalion, a mythical Greek sculptor. Pygmalion carved a statue of a woman and then became enamored with it. Unable to love a human, Pygmalion appealed to Aphrodite, the goddess of love. She took pity and brought the statue to life. The couple married and went on to have a daughter, Paphos.

False Beliefs Come True Over Time

In the same way Pygmalion’s fixation on the statue brought it to life, our focus on a belief or assumption can do the same. The flipside is the Golem effect, wherein low expectations lead to decreased performance. Both effects come under the category of self-fulfilling prophecies. Whether the expectation comes from us or others, the effect manifests in the same way.

The Pygmalion effect has profound ramifications in schools and organizations and with regard to social class and stereotypes. By some estimations, it is the result of our brains’ poorly distinguishing between perception and expectation. Although many people purport to want to prove their critics wrong, we often merely end up proving our supporters right.

Understanding the Pygmalion effect is a powerful way to positively affect those around us, from our children and friends to employees and leaders. If we don’t take into account the ramifications of our expectations, we may miss out on the dramatic benefits of holding high standards.

The concept of a self-fulfilling prophecy is attributed to sociologist Robert K. Merton. In 1948, Merton published the first paper on the topic. In it, he described the phenomenon as a false belief that becomes true over time. Once this occurs, it creates a feedback loop. We assume we were always correct because it seems so in hindsight. Merton described a self-fulfilling prophecy as self-hypnosis through our own propaganda.

As with many psychological concepts, people had a vague awareness of its existence long before research confirmed anything. Renowned orator and theologian Jacques Benigne Bossuet declared in the 17th century that “The greatest weakness of all weaknesses is to fear too much to appear weak.”

Even Sigmund Freud was aware of self-fulfilling prophecies. In A Childhood Memory of Goethe, Freud wrote: “If a man has been his mother’s undisputed darling he retains throughout life the triumphant feeling, the confidence in success, which not seldom brings actual success with it.”

The IQ of Students

Research by Robert Rosenthal and Lenore Jacobson examined the influence of teachers’ expectations on students’ performance. Their subsequent paper is one of the most cited and discussed psychological studies ever conducted.

Rosenthal and Jacobson began by testing the IQ of elementary school students. Teachers were told that the IQ test showed around one-fifth of their students to be unusually intelligent. For ethical reasons, they did not label an alternate group as unintelligent and instead used unlabeled classmates as the control group. It will doubtless come as no surprise that the “gifted” students were chosen at random. They should not have had a significant statistical advantage over their peers. As the study period ended, all students had their IQs retested. Both groups showed an improvement. Yet those who were described as intelligent experienced much greater gains in their IQ points. Rosenthal and Jacobson attributed this result to the Pygmalion effect. Teachers paid more attention to “gifted” students, offering more support and encouragement than they would otherwise. Picked at random, those children ended up excelling. Sadly, no follow-up studies were ever conducted, so we do not know the long-term impact on the children involved.

Prior to studying the effect on children, Rosenthal performed preliminary research on animals. Students were given rats from two groups, one described as “maze dull” and the other as “maze bright.” Researchers claimed that the former group could not learn to properly negotiate a maze, but the latter could with ease. As you might expect, the groups of rats were the same. Like the gifted and nongifted children, they were chosen at random. Yet by the time the study finished, the “maze-bright” rats appeared to have learned faster. The students considered them tamer and more pleasant to work with than the “maze-dull” rats.

In general, authority figures have the power to influence how the people subordinate to them behave by holding high expectations. Whether consciously or not, leaders facilitate changes in behavior, such as by giving people more responsibility or setting stretch goals. Like the subtle cues that allowed Clever Hans to make calculations, these small changes in treatment can promote learning and growth. If a leader thinks an employee is competent, they will treat them as such. The employee then gets more opportunities to develop their competence, and their performance improves in a positive feedback loop. This works both ways. When we expect an authority figure to be competent or successful, we tend to be attentive and supportive. In the process, we bolster their performance, too. Students who act interested in lectures create interesting lecturers.

In Pygmalion in Management, J. Sterling Livingston writes,

Some managers always treat their subordinates in a way that leads to superior performance. But most … unintentionally treat their subordinates in a way that leads to lower performance than they are capable of achieving. The way managers treat their subordinates is subtly influenced by what they expect of them. If managers’ expectations are high, productivity is likely to be excellent. If their expectations are low, productivity is likely to be poor. It is as though there were a law that caused subordinates’ performance to rise or fall to meet managers’ expectations.

The Pygmalion effect shows us that our reality is negotiable and can be manipulated by others — on purpose or by accident. What we achieve, how we think, how we act, and how we perceive our capabilities can be influenced by the expectations of those around us. Those expectations may be the result of biased or irrational thinking, but they have the power to affect us and change what happens. While cognitive biases distort only what we perceive, self-fulfilling prophecies alter what happens.

Of course, the Pygmalion effect works only when we are physically capable of achieving what is expected of us. After Rosenthal and Jacobson published their initial research, many people were entranced by the implication that we are all capable of more than we think. Although that can be true, we have no indication that any of us can do anything if someone believes we can. Instead, the Pygmalion effect seems to involve us leveraging our full capabilities and avoiding the obstacles created by low expectations.

Clever Hans truly was an intelligent horse, but he was smart because he could read almost imperceptible nonverbal cues, not because he could do math. So, he did have unusual capabilities, as shown by the fact that few other animals have done what he did.

We can’t do anything just because someone expects us to. Overly high expectations can also be stressful. When someone sets the bar too high, we can get discouraged and not even bother trying. Stretch goals and high expectations are beneficial, up to the point of diminishing returns. Research by McClelland and Atkinson indicates that the Pygmalion effect drops off if we see our chance of success as being less than 50%. If an endeavor seems either certain or completely uncertain, the Pygmalion effect does not hold. When we are stretched but confident, high expectations can help us achieve more.
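
The 50% threshold is often formalized with Atkinson’s achievement-motivation model, in which the tendency to take on a task is the product of motive strength, the perceived probability of success, and incentive value, with incentive modeled as one minus that probability. A minimal sketch of how the peak at 50% falls out of the arithmetic:

```python
# Atkinson's achievement-motivation model, a standard formalization of the
# McClelland/Atkinson result: the tendency to approach a task is
#   T = M * P * (1 - P)
# where M is motive strength and P is the perceived probability of success.
# The P * (1 - P) term peaks at P = 0.5, matching the claim that stretch
# goals motivate most when success looks like roughly a coin flip.

def approach_tendency(motive: float, p_success: float) -> float:
    """Tendency to pursue a task, given motive strength and perceived P(success)."""
    incentive = 1.0 - p_success  # harder tasks carry more incentive value
    return motive * p_success * incentive

# Sweep P(success) from 0 to 1 and find where motivation peaks.
probs = [i / 100 for i in range(101)]
peak = max(probs, key=lambda p: approach_tendency(1.0, p))
print(peak)  # 0.5: neither sure things nor hopeless ones motivate us
```

Because the incentive term is modeled as 1 − P, a sure thing carries no incentive and a hopeless task no expectancy, so motivation is highest in between.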

Check Your Assumptions

In Self-Fulfilling Prophecy: A Practical Guide to Its Use in Education, Robert T. Tauber describes an exercise in which people are asked to list their assumptions about people with certain descriptions. These included a cheerleader, “a minority woman with four kids at the market using food stamps,” and a “person standing outside smoking on a cold February day.” An anonymous survey of undergraduate students revealed mostly negative assumptions. Tauber asks the reader to consider how being exposed to these types of assumptions might affect someone’s day-to-day life.

The expectations people have of us affect us in countless subtle ways each day. Although we rarely notice it (unless we are on the receiving end of overt racism, sexism, and other forms of bias), those expectations dictate the opportunities we are offered, how we are spoken to, and the praise and criticism we receive. Individually, these knocks and nudges have minimal impact. In the long run, they might dictate whether we succeed, fail, or land somewhere in between.

The important point to note about the Pygmalion effect is that it creates a literal change in what occurs. There is nothing mystical about the effect. When we expect someone to perform well in any capacity, we treat them in a different way. Teachers tend to show more positive body language towards students they expect to be gifted. They may teach them more challenging material, offer more chances to ask questions, and provide personalized feedback. As Carl Sagan declared, “The visions we offer our children shape the future. It matters what those visions are. Often they become self-fulfilling prophecies. Dreams are maps.”

A perfect illustration is the case of James Sweeney and George Johnson, as described in Pygmalion in Management. Sweeney was a teacher at Tulane University, where Johnson worked as a porter. Aware of the Pygmalion effect, Sweeney had a hunch that he could teach anyone to be a competent computer operator. He began his experiment, offering Johnson lessons each afternoon. Other university staff were dubious, especially as Johnson appeared to have a low IQ. But the Pygmalion effect won out, and the former porter eventually became responsible for training new computer operators.

The Pygmalion effect is a powerful secret weapon. Who wouldn’t want to help their children get smarter, help employees and leaders be more competent, and generally push others to do well? That’s possible if we raise our standards and see others in the best possible light. It is not necessary to actively attempt to intervene. Without even realizing it, we can nudge others towards success. If that sounds too good to be true, remember that the effect holds up for everything from rats to CEOs.


First Principles: The Building Blocks of True Knowledge


First-principles thinking is one of the best ways to reverse-engineer complicated problems and unleash creative possibility. Sometimes called “reasoning from first principles,” the idea is to break down complicated problems into basic elements and then reassemble them from the ground up. It’s one of the best ways to learn to think for yourself, unlock your creative potential, and move from linear to non-linear results.

This approach was used by the philosopher Aristotle and is used now by Elon Musk and Charlie Munger. It allows them to cut through the fog of shoddy reasoning and inadequate analogies to see opportunities that others miss.

“I don’t know what’s the matter with people: they don’t learn by understanding; they learn by some other way—by rote or something. Their knowledge is so fragile!”

— Richard Feynman

The Basics

A first principle is a foundational proposition or assumption that stands alone. We cannot deduce first principles from any other proposition or assumption.

Aristotle, writing[1] on first principles, said:

In every systematic inquiry (methodos) where there are first principles, or causes, or elements, knowledge and science result from acquiring knowledge of these; for we think we know something just in case we acquire knowledge of the primary causes, the primary first principles, all the way to the elements.

Later he connected the idea to knowledge, defining first principles as “the first basis from which a thing is known.”[2]

The search for first principles is not unique to philosophy. All great thinkers do it.

Reasoning by first principles removes the impurity of assumptions and conventions. What remains is the essentials. It’s one of the best mental models you can use to improve your thinking because the essentials allow you to see where reasoning by analogy might lead you astray.

The Coach and the Play Stealer

My friend Mike Lombardi (a former NFL executive) and I were having dinner in L.A. one night, and he said, “Not everyone that’s a coach is really a coach. Some of them are just play stealers.”

Every play we see in the NFL was at some point created by someone who thought, “What would happen if the players did this?” and went out and tested the idea. Since then, thousands, if not millions, of plays have been created. That’s part of what coaches do. They assess what’s physically possible, along with the weaknesses of the other teams and the capabilities of their own players, and create plays that are designed to give their teams an advantage.

The coach reasons from first principles. The rules of football are the first principles: they govern what you can and can’t do. Everything is possible as long as it’s not against the rules.

The play stealer works off what’s already been done. Sure, maybe he adds a tweak here or there, but by and large he’s just copying something that someone else created.

While both the coach and the play stealer start from something that already exists, they generally have different results. These two people look the same to most of us on the sidelines or watching the game on the TV. Indeed, they look the same most of the time, but when something goes wrong, the difference shows. Both the coach and the play stealer call successful plays and unsuccessful plays. Only the coach, however, can determine why a play was successful or unsuccessful and figure out how to adjust it. The coach, unlike the play stealer, understands what the play was designed to accomplish and where it went wrong, so he can easily course-correct. The play stealer has no idea what’s going on. He doesn’t understand the difference between something that didn’t work and something that played into the other team’s strengths.

Musk would identify the play stealer as the person who reasons by analogy, and the coach as someone who reasons by first principles. When you run a team, you want a coach in charge and not a play stealer. (If you’re a sports fan, you need only look at the difference between the Cleveland Browns and the New England Patriots.)

We’re all somewhere on the spectrum between coach and play stealer. We reason by first principles, by analogy, or a blend of the two.

Another way to think about this distinction comes from another friend, Tim Urban. He says[3] it’s like the difference between the cook and the chef. While these terms are often used interchangeably, there is an important nuance. The chef is a trailblazer, the person who invents recipes. He knows the raw ingredients and how to combine them. The cook, who reasons by analogy, uses a recipe. He creates something, perhaps with slight variations, that’s already been created.

The difference between reasoning by first principles and reasoning by analogy is like the difference between being a chef and being a cook. If the cook lost the recipe, he’d be screwed. The chef, on the other hand, understands the flavor profiles and combinations at such a fundamental level that he doesn’t even use a recipe. He has real knowledge as opposed to know-how.

Authority

So much of what we believe is based on some authority figure telling us that something is true. As children, we learn to stop questioning when we’re told “Because I said so.” (More on this later.) As adults, we learn to stop questioning when people say “Because that’s how it works.” The implicit message is “understanding be damned — shut up and stop bothering me.” It’s not intentional or personal. OK, sometimes it’s personal, but most of the time, it’s not.

If you outright reject dogma, you often become a problem: a student who is always pestering the teacher. A kid who is always asking questions and never allowing you to cook dinner in peace. An employee who is always slowing things down by asking why.

When you can’t change your mind, though, you die. Sears was once thought indestructible before Wal-Mart took over. Sears failed to see the world change. Adapting to change is an incredibly hard thing to do when it comes into conflict with the very thing that caused so much success. As Upton Sinclair aptly pointed out, “It is difficult to get a man to understand something, when his salary depends on his not understanding it.” Wal-Mart failed to see the world change and is now under assault from Amazon.

If we never learn to take something apart, test the assumptions, and reconstruct it, we end up trapped in what other people tell us — trapped in the way things have always been done. When the environment changes, we just continue as if things were the same.

First-principles reasoning cuts through dogma and removes the blinders. We can see the world as it is and see what is possible.

When it comes down to it, everything that is not a law of nature is just a shared belief. Money is a shared belief. So is a border. So are bitcoins. The list goes on.

Some of us are naturally skeptical of what we’re told. Maybe it doesn’t match up to our experiences. Maybe it’s something that used to be true but isn’t true anymore. And maybe we just think very differently about something.

“To understand is to know what to do.”

— Wittgenstein

Techniques for Establishing First Principles

There are many ways to establish first principles. Let’s take a look at a few of them.

Socratic Questioning

Socratic questioning can be used to establish first principles through stringent analysis. It is a disciplined questioning process used to establish truths, reveal underlying assumptions, and separate knowledge from ignorance. The key distinction between Socratic questioning and normal discussion is that the former seeks to draw out first principles in a systematic manner. Socratic questioning generally follows this process:

  1. Clarifying your thinking and explaining the origins of your ideas (Why do I think this? What exactly do I think?)
  2. Challenging assumptions (How do I know this is true? What if I thought the opposite?)
  3. Looking for evidence (How can I back this up? What are the sources?)
  4. Considering alternative perspectives (What might others think? How do I know I am correct?)
  5. Examining consequences and implications (What if I am wrong? What are the consequences if I am?)
  6. Questioning the original questions (Why did I think that? Was I correct? What conclusions can I draw from the reasoning process?)
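
As a how-to, the six steps above can be kept at hand as a simple checklist. A minimal sketch, where the stage names and prompts paraphrase the list above:

```python
# A minimal sketch of the Socratic questioning process as a reusable
# checklist. The stages and prompts paraphrase the six steps above;
# socratic_worksheet() walks a claim through each stage in order.

SOCRATIC_STAGES = [
    ("Clarify thinking", "Why do I think this? What exactly do I think?"),
    ("Challenge assumptions", "How do I know this is true? What if I thought the opposite?"),
    ("Look for evidence", "How can I back this up? What are the sources?"),
    ("Consider alternatives", "What might others think? How do I know I am correct?"),
    ("Examine consequences", "What if I am wrong? What are the consequences if I am?"),
    ("Question the questions", "Why did I think that? Was I correct?"),
]

def socratic_worksheet(claim: str) -> list[str]:
    """Return the ordered prompts to work through for a given claim."""
    return [f"{stage}: {prompt}  [claim: {claim}]" for stage, prompt in SOCRATIC_STAGES]

for line in socratic_worksheet("Battery packs will always be expensive"):
    print(line)
```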

This process stops you from relying on your gut and limits strong emotional responses, helping you build knowledge that lasts.

“Because I Said So” or “The Five Whys”

Children instinctively think in first principles. Just like us, they want to understand what’s happening in the world. To do so, they intuitively break through the fog with a game some parents have come to hate.

“Why?”

“Why?”

“Why?”

Here’s an example that has played out numerous times at my house:

“It’s time to brush our teeth and get ready for bed.”

“Why?”

“Because we need to take care of our bodies, and that means we need sleep.”

“Why do we need sleep?”

“Because we’d die if we never slept.”

“Why would that make us die?”

“I don’t know; let’s go look it up.”

Kids are just trying to understand why adults are saying something or why they want them to do something.

The first time your kid plays this game, it’s cute, but for most teachers and parents, it eventually becomes annoying. Then the answer becomes what my mom used to tell me: “Because I said so!” (Love you, Mom.)

Of course, I’m not always that patient with the kids. For example, I get testy when we’re late for school, or we’ve been travelling for 12 hours, or I’m trying to fit too much into the time we have. Still, I try never to say “Because I said so.”

People hate the “because I said so” response for two reasons, both of which play out in the corporate world as well. The first is that we feel like it slows us down. We know what we want to accomplish, and the questioning creates unnecessary drag. The second is that after one or two questions, we are often lost. We actually don’t know why. Confronted with our own ignorance, we resort to self-defense.

I remember being in meetings and asking people why we were doing something this way or why they thought something was true. At first, there was a mild tolerance for this approach. After three “whys,” though, you often find yourself on the other end of some version of “we can take this offline.”

Can you imagine how that would play out with Elon Musk? Richard Feynman? Charlie Munger? Musk would build a billion-dollar business to prove you wrong, Feynman would think you’re an idiot, and Munger would profit based on your inability to think through a problem.

“Science is a way of thinking much more than it is a body of knowledge.”

— Carl Sagan

Examples of First Principles in Action

So we can better understand how first-principles reasoning works, let’s look at three examples.

Elon Musk and SpaceX

Perhaps no one embodies first-principles thinking more than Elon Musk. He is one of the most audacious entrepreneurs the world has ever seen. My kids (grades 3 and 2) refer to him as a real-life Tony Stark, thereby conveniently providing a good time for me to remind them that by fourth grade, Musk was reading the Encyclopedia Britannica and not Pokemon.

What’s most interesting about Musk is not what he thinks but how he thinks:

I think people’s thinking process is too bound by convention or analogy to prior experiences. It’s rare that people try to think of something on a first principles basis. They’ll say, “We’ll do that because it’s always been done that way.” Or they’ll not do it because “Well, nobody’s ever done that, so it must not be good.” But that’s just a ridiculous way to think. You have to build up the reasoning from the ground up—“from the first principles” is the phrase that’s used in physics. You look at the fundamentals and construct your reasoning from that, and then you see if you have a conclusion that works or doesn’t work, and it may or may not be different from what people have done in the past.[4]

His approach to understanding reality is to start with what is true — not with his intuition. The problem is that we don’t know as much as we think we do, so our intuition isn’t very good. We trick ourselves into thinking we know what’s possible and what’s not. The way Musk thinks is much different.

Musk starts out with something he wants to achieve, like building a rocket. Then he starts with the first principles of the problem. Running through how Musk would think, Larry Page said in an interview, “What are the physics of it? How much time will it take? How much will it cost? How much cheaper can I make it? There’s this level of engineering and physics that you need to make judgments about what’s possible and interesting. Elon is unusual in that he knows that, and he also knows business and organization and leadership and governmental issues.”[5]

Rockets are absurdly expensive, which is a problem because Musk wants to send people to Mars. And to send people to Mars, you need cheaper rockets. So he asked himself, “What is a rocket made of? Aerospace-grade aluminum alloys, plus some titanium, copper, and carbon fiber. And … what is the value of those materials on the commodity market? It turned out that the materials cost of a rocket was around two percent of the typical price.”[6]
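
Musk’s two-percent observation is just arithmetic: sum the commodity cost of the airframe materials and divide by the sticker price. The figures below are illustrative assumptions for the sake of the calculation, not actual SpaceX or market numbers:

```python
# Back-of-the-envelope check of Musk's point that raw materials are a tiny
# fraction of a rocket's sticker price. All figures are illustrative
# assumptions for the arithmetic, not actual SpaceX or commodity prices.

rocket_price = 65_000_000  # assumed sticker price of a launch vehicle, USD

# Assumed commodity-market cost of the raw airframe materials, USD:
materials = {
    "aerospace-grade aluminum": 900_000,
    "titanium": 250_000,
    "copper": 60_000,
    "carbon fiber": 90_000,
}

materials_cost = sum(materials.values())
fraction = materials_cost / rocket_price
print(f"Materials are {fraction:.1%} of the price")  # roughly the ~2% Musk cites
```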

Why, then, is it so expensive to get a rocket into space? Musk, a notorious self-learner with degrees in both economics and physics, literally taught himself rocket science. He figured that the only reason getting a rocket into space is so expensive is that people are stuck in a mindset that doesn’t hold up to first principles. With that, Musk decided to create SpaceX and see if he could build rockets himself from the ground up.

In an interview with Kevin Rose, Musk summarized his approach:

I think it’s important to reason from first principles rather than by analogy. So the normal way we conduct our lives is, we reason by analogy. We are doing this because it’s like something else that was done, or it is like what other people are doing… with slight iterations on a theme. And it’s … mentally easier to reason by analogy rather than from first principles. First principles is kind of a physics way of looking at the world, and what that really means is, you … boil things down to the most fundamental truths and say, “okay, what are we sure is true?” … and then reason up from there. That takes a lot more mental energy.[7]

Musk then gave an example of how SpaceX uses first principles to innovate at low prices:

Somebody could say — and in fact people do — that battery packs are really expensive and that’s just the way they will always be because that’s the way they have been in the past. … Well, no, that’s pretty dumb… Because if you applied that reasoning to anything new, then you wouldn’t be able to ever get to that new thing…. you can’t say, … “oh, nobody wants a car because horses are great, and we’re used to them and they can eat grass and there’s lots of grass all over the place and … there’s no gasoline that people can buy….”

He then gives a fascinating example about battery packs:

… they would say, “historically, it costs $600 per kilowatt-hour. And so it’s not going to be much better than that in the future.” … So the first principles would be, … what are the material constituents of the batteries? What is the spot market value of the material constituents? … It’s got cobalt, nickel, aluminum, carbon, and some polymers for separation, and a steel can. So break that down on a material basis; if we bought that on a London Metal Exchange, what would each of these things cost? Oh, jeez, it’s … $80 per kilowatt-hour. So, clearly, you just need to think of clever ways to take those materials and combine them into the shape of a battery cell, and you can have batteries that are much, much cheaper than anyone realizes.
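
The same arithmetic applies here: price each constituent per kilowatt-hour of capacity and compare the total with the historical pack price. The per-material figures below are illustrative assumptions chosen to land near the $80-per-kilowatt-hour total Musk quotes:

```python
# First-principles cost check for a battery pack, per kilowatt-hour.
# Per-material figures are illustrative assumptions chosen to sum to the
# ~$80/kWh materials total Musk quotes, against $600/kWh historical packs.

historical_pack_cost = 600  # USD per kWh, "the way it has always been"

# Assumed spot-market cost of constituents, USD per kWh of capacity:
constituents = {
    "cobalt": 25,
    "nickel": 20,
    "aluminum": 10,
    "carbon": 8,
    "separator polymers": 9,
    "steel can": 8,
}

materials_cost = sum(constituents.values())
headroom = historical_pack_cost / materials_cost
print(materials_cost, f"{headroom:.1f}x")  # materials leave a 7.5x gap below the historical price
```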

BuzzFeed

After studying the psychology of virality, Jonah Peretti founded BuzzFeed in 2006. The site quickly grew to be one of the most popular on the internet, with hundreds of employees and substantial revenue.

Peretti figured out early on the first principle of a successful website: wide distribution. Rather than publishing articles people should read, BuzzFeed focuses on publishing those that people want to read. This means aiming to garner maximum social shares to put distribution in the hands of readers.

Peretti recognized the first principles of online popularity and used them to take a new approach to journalism. He also ignored SEO, saying, “Instead of making content robots like, it was more satisfying to make content humans want to share.”[8] Unfortunately for us, we share a lot of cat videos.

A common aphorism in the field of viral marketing is, “content might be king, but distribution is queen, and she wears the pants” (or “and she has the dragons”; pick your metaphor). BuzzFeed’s distribution-based approach is based on obsessive measurement, using A/B testing and analytics.
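
That obsessive measurement reduces, at its core, to A/B testing: serve two headline variants, count shares per impression, and check whether the gap is larger than chance. A minimal sketch with made-up variant counts, using the standard two-proportion z-test rather than any actual BuzzFeed tooling:

```python
# A minimal A/B test sketch of the kind headline testing relies on: show
# two variants, record shares per impression, and use a two-proportion
# z-test to judge whether the difference is real. Counts are made up.
from math import sqrt

def share_rate(shares: int, impressions: int) -> float:
    return shares / impressions

def z_score(shares_a, imp_a, shares_b, imp_b):
    """Two-proportion z-test statistic for the share rates of variants A and B."""
    p_a = share_rate(shares_a, imp_a)
    p_b = share_rate(shares_b, imp_b)
    pooled = (shares_a + shares_b) / (imp_a + imp_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imp_a + 1 / imp_b))
    return (p_b - p_a) / se

# Hypothetical experiment: variant B's headline is shared more often.
z = z_score(shares_a=120, imp_a=10_000, shares_b=180, imp_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 5% level
```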

Jon Steinberg, president of BuzzFeed, explains the first principles of virality:

Keep it short. Ensure [that] the story has a human aspect. Give people the chance to engage. And let them react. People mustn’t feel awkward sharing it. It must feel authentic. Images and lists work. The headline must be persuasive and direct.

Derek Sivers and CD Baby

When Sivers founded his company CD Baby, he reduced the concept to first principles. Sivers asked, What does a successful business need? His answer was happy customers.

Instead of focusing on garnering investors or having large offices, fancy systems, or huge numbers of staff, Sivers focused on making each of his customers happy. An example of this is his famous order confirmation email, part of which reads:

Your CD has been gently taken from our CD Baby shelves with sterilized contamination-free gloves and placed onto a satin pillow. A team of 50 employees inspected your CD and polished it to make sure it was in the best possible condition before mailing. Our packing specialist from Japan lit a candle and a hush fell over the crowd as he put your CD into the finest gold-lined box money can buy.

By ignoring unnecessary details that cause many businesses to expend large amounts of money and time, Sivers was able to rapidly grow the company to $4 million in monthly revenue. In Anything You Want, Sivers wrote:

Having no funding was a huge advantage for me.
A year after I started CD Baby, the dot-com boom happened. Anyone with a little hot air and a vague plan was given millions of dollars by investors. It was ridiculous. …
Even years later, the desks were just planks of wood on cinder blocks from the hardware store. I made the office computers myself from parts. My well-funded friends would spend $100,000 to buy something I made myself for $1,000. They did it saying, “We need the very best,” but it didn’t improve anything for their customers. …
It’s counterintuitive, but the way to grow your business is to focus entirely on your existing customers. Just thrill them, and they’ll tell everyone.

To survive as a business, you need to treat your customers well. And yet so few of us master this principle.

Employing First Principles in Your Daily Life

Most of us have no problem thinking about what we want to achieve in life, at least when we’re young. We’re full of big dreams, big ideas, and boundless energy. The problem is that we let others tell us what’s possible, not only when it comes to our dreams but also when it comes to how we go after them. And when we let other people tell us what’s possible or what the best way to do something is, we outsource our thinking to someone else.

The real power of first-principles thinking is moving away from incremental improvement and into possibility. Letting others think for us means that we’re using their analogies, their conventions, and their possibilities. It means we’ve inherited a world that conforms to what they think. This is incremental thinking.

When we take what already exists and improve on it, we are in the shadow of others. It’s only when we step back, ask what’s possible, and cut through the flawed analogies that we see clearly. Analogies are beneficial; they make complex problems easier to communicate and increase understanding. Using them, however, is not without cost. They limit our beliefs about what’s possible and let us argue without ever exposing our (faulty) thinking. Analogies move us to see a problem the same way that someone else sees it.

The gulf between what people currently see (because their thinking is framed by someone else) and what is physically possible is filled by the people who use first principles to think through problems.

First-principles thinking clears the clutter of what we’ve told ourselves and allows us to rebuild from the ground up. Sure, it’s a lot of work, but that’s why so few people are willing to do it. It’s also why the rewards for filling the chasm between possible and incremental improvement tend to be non-linear.

Let’s take a look at a few of the limiting beliefs that we tell ourselves.

“I don’t have a good memory.” [10]
People have far better memories than they think they do. Saying you don’t have a good memory is just a convenient excuse for forgetting. Taking a first-principles approach means asking how much information we can physically store in our minds. The answer is “a lot more than you think.” Now that we know it’s possible to put more into our brains, we can reframe the problem as finding the best way to store and recall that information.

“There is too much information out there.”
A lot of professional investors read Farnam Street. When I meet these people and ask how they consume information, they usually fall into one of two categories. The differences between the two apply to all of us. The first type of investor says there is too much information to consume. They spend their days reading every press release, article, and blogger commenting on a position they hold. They wonder what they are missing. The second type of investor realizes that reading everything is unsustainable and stressful and makes them prone to overvaluing information they’ve spent a great amount of time consuming. These investors, instead, seek to understand the variables that will affect their investments. While there might be hundreds, there are usually three to five variables that will really move the needle. The investors don’t have to read everything; they just pay attention to these variables.

“All the good ideas are taken.”
A common way that people limit what’s possible is to tell themselves that all the good ideas are taken. Yet, people have been saying this for hundreds of years — literally — and companies keep starting and competing with different ideas, variations, and strategies.

“We need to move first.”
I’ve heard this in boardrooms for years. The answer isn’t as black and white as the statement. The iPhone wasn’t first; it was better. Microsoft wasn’t the first to sell operating systems; it just had a better business model. There is a lot of evidence showing that first movers in business are more likely to fail than latecomers. Yet this myth about the need to move first persists.

Sometimes the early bird gets the worm and sometimes the first mouse gets killed. You have to break each situation down into its component parts and see what’s possible. That is the work of first-principles thinking.

“I can’t do that; it’s never been done before.”
People like Elon Musk are constantly doing things that have never been done before. This type of thinking is analogous to looking back at history and building, say, floodwalls, based on the worst flood that has happened before. A better bet is to look at what could happen and plan for that.

“As to methods, there may be a million and then some, but principles are few. The man who grasps principles can successfully select his own methods. The man who tries methods, ignoring principles, is sure to have trouble.”

— Harrington Emerson

Conclusion

The thoughts of others imprison us if we’re not thinking for ourselves.

Reasoning from first principles allows us to step outside of history and conventional wisdom and see what is possible. When you really understand the principles at work, you can decide if the existing methods make sense. Often they don’t.

Reasoning by first principles is useful when you are (1) doing something for the first time, (2) dealing with complexity, and (3) trying to understand a situation that you’re having problems with. In all of these areas, your thinking gets better when you stop making assumptions and you stop letting others frame the problem for you.

Analogies can’t replace understanding. While it’s easier on your brain to reason by analogy, you’re more likely to come up with better answers when you reason by first principles. This is what makes it one of the best sources of creative thinking. Thinking in first principles allows you to adapt to a changing environment, deal with reality, and seize opportunities that others can’t see.

Many people mistakenly believe that creativity is something that only some of us are born with, and either we have it or we don’t. Fortunately, there seems to be ample evidence that this isn’t true.[11] We’re all born rather creative, but during our formative years, it can be beaten out of us by busy parents and teachers. As adults, we rely on convention and what we’re told because that’s easier than breaking things down into first principles and thinking for ourselves. Thinking through first principles is a way of taking off the blinders. Most things suddenly seem more possible.

“I think most people can learn a lot more than they think they can,” says Musk. “They sell themselves short without trying. One bit of advice: it is important to view knowledge as sort of a semantic tree — make sure you understand the fundamental principles, i.e., the trunk and big branches, before you get into the leaves/details or there is nothing for them to hang on to.”

***

Members can discuss this on the Learning Community Forum.

End Notes

[1] Aristotle, Physics 184a10–21

[2] Aristotle, Metaphysics 1013a14–15

[3] https://waitbutwhy.com/2015/11/the-cook-and-the-chef-musks-secret-sauce.html

[4] Elon Musk, quoted by Tim Urban in “The Cook and the Chef: Musk’s Secret Sauce,” Wait But Why https://waitbutwhy.com/2015/11/the-cook-and-the-chef-musks-secret-sauce.html

[5] Vance, Ashlee. Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future (p. 354)

[6] https://www.wired.com/2012/10/ff-elon-musk-qa/all/

[7] https://www.youtube.com/watch?v=L-s_3b5fRd8

[8] David Rowan, “How BuzzFeed mastered social sharing to become a media giant for a new era,” Wired.com. 2 January 2014. https://www.wired.co.uk/article/buzzfeed

[9] https://www.quora.com/What-does-Elon-Musk-mean-when-he-said-I-think-it%E2%80%99s-important-to-reason-from-first-principles-rather-than-by-analogy/answer/Bruce-Achterberg

[10] https://www.scientificamerican.com/article/new-estimate-boosts-the-human-brain-s-memory-capacity-10-fold/

[11] Breakpoint and Beyond: Mastering the Future Today, George Land

[12] https://www.reddit.com/r/IAmA/comments/2rgsan/i_am_elon_musk_ceocto_of_a_rocket_company_ama/cnfre0a/

[Episode #30] Company Culture, Collaboration and Competition: A Discussion With Margaret Heffernan

On this episode, I’m joined by speaker, international executive, and five-time author Margaret Heffernan. We discuss how to get the most out of our people, how to create a thriving culture of trust and collaboration, and how to prevent potentially devastating “willful blindness.”

As former CEO of five successful businesses, Margaret Heffernan has been on the front lines observing the very human tendencies (selective blindness, conflict avoidance, and self-sabotage to name a few) that cause managers and sometimes entire organizations to go astray.

She has since written five books and has spoken all over the world, warning, educating, and instructing leaders not only to be aware of these tendencies but also to weed them out of our companies, our businesses, and even our relationships.

In this conversation, we discuss many of the concepts she shares in her books, namely:

  • How to tap into the collective knowledge of your organization so problems are solved quickly, efficiently, and cooperatively
  • The strange experiment Margaret ran to build “social capital” in one of her early businesses, which transformed the way her employees treated and interacted with each other
  • How to build a culture that doesn’t create in-fighting and unhealthy competition within your organization, and how many companies today are missing the mark
  • One simple thing you can do as a leader to increase the buy-in, productivity, and overall satisfaction of your team members (and it takes less than 30 seconds to do)
  • The dangers of binary thinking, and how Margaret stops herself from oversimplifying a situation
  • Why arguing may be one of the purest forms of collaboration — and how to do it correctly
  • How to identify the environment and context where you do your best work, and how best to replicate it
  • How “willful blindness” has caused catastrophic disasters in business, professional, and personal relationships, and what we can do to avoid becoming another statistic
  • The wonderful advice Margaret gave to her kids when it came to choosing a career path

And much more.

If you interact with other human beings in any capacity, you need to hear what Margaret has to say.

Take a listen and let me know what you think!

***

Listen

Transcript
An edited copy of this transcript is available to members of our learning community or for purchase separately ($7).

If you liked this, check out all the episodes of The Knowledge Project.

***

Members can discuss this on the Learning Community Forum.

Maker vs. Manager: How Your Schedule Can Make or Break You

Consider the daily schedule of famed novelist Haruki Murakami. When he’s working on a novel, he starts his days at 4 am and writes for five or six continuous hours. Once the writing is done, he spends his afternoons running or swimming, and his evenings reading or listening to music before a 9 pm bedtime. Murakami is known for his strict adherence to this schedule.

In contrast, consider the schedule of entrepreneur, speaker, and writer Gary Vaynerchuk. He describes his day (which begins at 6 am) as being broken into tiny slots, mostly comprising meetings which can be as short as three minutes. He makes calls in between meetings. During the moments between meetings and calls, he posts on just about every social network in existence and records short segments of video or speech. In short, his day, for the most part, involves managing, organizing, and instructing other people, making decisions, planning, and advising.

“How we spend our days is, of course, how we spend our lives. What we do with this hour, and that one, is what we are doing. A schedule defends from chaos and whim. It is a net for catching days. It is a scaffolding on which a worker can stand and labor with both hands at sections of time.”

— Annie Dillard, The Writing Life

The numerous articles we have all read about the schedules and routines of successful people like these often miss the point. Getting up at 4 am does not make someone an acclaimed novelist, any more than splitting the day into 15-minute segments makes someone an influential entrepreneur.

What we can learn from reading about the schedules of people we admire is not what time to set our alarms or how many cups of coffee to drink, but that different types of work require different types of schedules. The two wildly different workdays of Murakami and Vaynerchuk illustrate the concept of maker and manager schedules.

Paul Graham of Y Combinator first described this concept in a 2009 essay. From Graham’s distinction between makers and managers, we can learn that doing creative work or overseeing other people does not necessitate certain habits or routines. It requires consideration of the way we structure our time.

What’s the Difference?

A manager’s day is, as a rule, sliced up into tiny slots, each with a specific purpose decided in advance. Many of those slots are used for meetings, calls, or emails. The manager’s schedule may be planned for them by a secretary or assistant.

Managers spend a lot of time “putting out fires” and doing reactive work. An important call or email comes in, so it gets answered. An employee makes a mistake or needs advice, so the manager races to sort it out. To focus on one task for a substantial block of time, managers need to make an effort to prevent other people from distracting them.

Managers don’t necessarily need the capacity for deep focus — they primarily need the ability to make fast, smart decisions. In a three-minute meeting, they have the potential to generate (or destroy) enormous value through their decisions and expertise.

A maker’s schedule is different. It is made up of long blocks of time reserved for focusing on particular tasks, or the entire day might be devoted to one activity. Breaking their day up into slots of a few minutes each would be the equivalent of doing nothing.

A maker could be the stereotypical reclusive novelist, locked away in a cabin in the woods with a typewriter, no internet, and a bottle of whiskey to hand. Or they could be a Red Bull–drinking Silicon Valley software developer working in an open-plan office with their headphones on. Although interdisciplinary knowledge is valuable, makers do not always need a wide circle of competence. They need to do one thing well and can leave the rest to the managers.

Meetings are pricey for makers, restricting the time available for their real work, so they avoid them, batch them together, or schedule them at times of day when their energy levels are low. As Paul Graham writes:

When you’re operating on the maker’s schedule, meetings are a disaster. A single meeting can blow a whole afternoon, by breaking it into two pieces each too small to do anything hard in. Plus you have to remember to go to the meeting. That’s no problem for someone on the manager’s schedule. There’s always something coming on the next hour; the only question is what. But when someone on the maker’s schedule has a meeting, they have to think about it.

It makes sense. The two work styles could not be more different.

A manager’s job is to, well, manage other people and systems. The point is that their job revolves around organizing other people and making decisions. As Andrew Grove writes in High Output Management:

…a big part of a middle manager’s work is to supply information and know-how, and to impart a sense of the preferred method of handling things to the groups under his control and influence. A manager also makes and helps to make decisions. Both kinds of basic managerial tasks can only occur during face-to-face encounters, and therefore only during meetings. Thus, I will assert again that a meeting is nothing less than the medium through which managerial work is performed. That means we should not be fighting their very existence, but rather using the time spent in them as efficiently as possible.

A maker’s job is to create some form of tangible value. Makers work alone or under a manager, although they might have people working with them. “Maker” is a very broad category. A maker could be a writer, artist, software developer, carpenter, chef, biohacker, web designer, or anyone else who designs, creates, serves, and thinks.

Making anything significant requires time — lots of it — and having the right kind of schedule can help. Take a look at the quintessential maker schedule of the prolific (to say the least) writer Isaac Asimov, as described in his memoir:

I wake at five in the morning. I get to work as early as I can. I work as long as I can. I do this every day of the week, including holidays. I don’t take vacations voluntarily and I try to do my work even when I’m on vacation. (And even when I’m in the hospital.)

In other words, I am still and forever in the candy store [where he worked as a child]. Of course, I’m not waiting on customers; I’m not taking money and making change; I’m not forced to be polite to everyone who comes in (in actual fact, I was never good at that). I am, instead, doing things I very much want to do — but the schedule is there; the schedule that was ground into me; the schedule you would think I would have rebelled against once I had the chance.

The Intersection Between Makers and Managers

It is far from unusual for a person’s job to involve both maker and manager duties. Elon Musk is one example. His oft-analyzed schedule involves a great deal of managing as the head of multiple major companies, but he also spends an estimated 80% of his time on designing and engineering. How does he achieve this? Judging from interviews, Musk is adept at switching between the two schedules, planning his day in five-minute slots during the managerial times and avoiding calls or emails during the maker times.

The important point to note is that people who successfully combine both schedules do so by making a clear distinction, setting boundaries for those around them, and adjusting their environment accordingly. They don’t design for an hour, hold meetings for an hour, then return to designing, and so on. In his role as an investor and adviser to startups, Paul Graham sets boundaries between his two types of work:

How do we manage to advise so many startups on the maker’s schedule? By using the classic device for simulating the manager’s schedule within the maker’s: office hours. Several times a week I set aside a chunk of time to meet founders we’ve funded. These chunks of time are at the end of my working day, and I wrote a signup program that ensures [that] all the appointments within a given set of office hours are clustered at the end. Because they come at the end of my day these meetings are never an interruption. (Unless their working day ends at the same time as mine, the meeting presumably interrupts theirs, but since they made the appointment it must be worth it to them.) During busy periods, office hours sometimes get long enough that they compress the day, but they never interrupt it.
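Graham's actual signup program isn't public, but the clustering idea he describes is simple to sketch. In this illustrative version (the function name, dates, and durations are my assumptions), appointments are packed back to back so that the last one ends exactly when the working day does:

```python
from datetime import datetime, timedelta

def cluster_at_end(day_end, durations_minutes):
    """Pack appointments back to back so the last one finishes at day_end.

    durations_minutes is given in chronological order; returns a list of
    (start, end) pairs, also in chronological order.
    """
    slots = []
    cursor = day_end
    # Walk backwards from the end of the day, placing each meeting
    for minutes in reversed(durations_minutes):
        start = cursor - timedelta(minutes=minutes)
        slots.append((start, cursor))
        cursor = start
    return list(reversed(slots))

end_of_day = datetime(2024, 5, 6, 18, 0)  # hypothetical 6 pm cutoff
for start, end in cluster_at_end(end_of_day, [30, 20, 25]):
    print(start.strftime("%H:%M"), "-", end.strftime("%H:%M"))
```

Because every slot abuts the next and the block as a whole abuts the end of the day, the morning and afternoon remain one unbroken stretch of maker time.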

Likewise, during his time working on his own startup, Graham figured out how to partition his day and get both categories of work done without sacrificing his sanity:

When we were working on our own startup, back in the ’90s, I evolved another trick for partitioning the day. I used to program from dinner till about 3am every day, because at night no one could interrupt me. Then, I’d sleep till about 11am, and come in and work until dinner on what I called “business stuff.” I never thought of it in these terms, but in effect I had two workdays each day, one on the manager’s schedule and one on the maker’s.

Murakami also combined making and managing during his early days as a novelist. As with many other makers, his creative work began as a side project while he held another job. Murakami ran a jazz club. In a 2008 New Yorker profile, Murakami described having a schedule similar to Graham’s in his days running a startup. He spent his days overseeing the jazz club — doing paperwork, organizing staff, keeping track of the inventory, and so on. When the club closed after midnight, Murakami started writing and continued until he was exhausted. After reaching a tipping point with his success as a writer, Murakami made the switch from combining maker and manager schedules to focusing on the former.

In Deep Work, Cal Newport describes the schedule of another person who combines both roles, Wharton professor (and our podcast guest) Adam Grant.

To produce at your peak level you need to work for extended periods with full concentration on a single task free from distraction. Though Grant’s productivity depends on many factors, there’s one idea in particular that seems central to his method: the batching of hard but important intellectual work into long, uninterrupted stretches. Grant performs this batching at multiple levels. Within the year, he stacks his teaching into the fall semester, during which he can turn all of his attention to teaching well and being available to his students. (This method seems to work, as Grant is currently the highest-rated teacher at Wharton and the winner of multiple teaching awards.)

During the fall semester, Grant is in manager mode and has meetings with students. For someone in a teaching role, a maker schedule would be impossible. Teachers need to be able to help and advise their students. In the spring and summer, Grant switches to a maker schedule to focus on his research. He avoids distractions by being — at least, in his mind — out of his office.

Within a semester dedicated to research, he alternates between periods where his door is open …, and periods where he isolates himself to focus completely and without distraction on a single research task. (He typically divides the writing of a scholarly paper into three discrete tasks: analyzing the data, writing a full draft, and editing the draft into something publishable.) During these periods, which can last up to three or four days, he’ll often put an out-of-office auto-responder on his e-mail so correspondents will know not to expect a response. “It sometimes confuses my colleagues,” he told me. “They say, ‘You’re not out of office, I see you in your office right now!’” But to Grant, it’s important to enforce strict isolation until he completes the task at hand.

“A woodpecker can tap twenty times on a thousand trees and get nowhere, but stay busy. Or he can tap twenty-thousand times on one tree and get dinner.”

— Seth Godin, The Dip

The Value of Defining Your Schedule

We all know the benefits of a solid routine — it helps us to work smarter, look after our health, plan the trajectory of our days, achieve goals, and so on. That has all been discussed a million times and doubtless will be discussed a million more. But how often do we think about how our days are actually broken up, about how we choose (or are forced) to segment them? If you consider yourself a maker, do you succeed in structuring your day around long blocks of focused work, or does it get chopped up into little slices that other people can grab? If you regard yourself as a manager, are you available for the people who need your time? Are those meetings serving a purpose and getting high-leverage work done, or are you just trying to fill up an appointment book? If you do both types of work, how do you draw a line between them and communicate that boundary to others?

Cal Newport writes:

We spend much of our days on autopilot—not giving much thought to what we are doing with our time. This is a problem. It’s difficult to prevent the trivial from creeping into every corner of your schedule if you don’t face, without flinching, your current balance between deep and shallow work, and then adopt the habit of pausing before action and asking, “What makes the most sense right now?”

There are two key reasons that the distinction between maker and manager schedules matters for each of us and the people we work with.

First, defining the type of schedule we need is more important than worrying about task management systems or daily habits. If we try to do maker work on a manager schedule or managerial work on a maker schedule, we will run into problems.

Second, we need to be aware of which schedule the people around us are on so we can be considerate and let them get their best work done.

We shouldn’t think of either type of work as superior, as the two are interdependent. Managers would be useless without makers and vice versa. It’s the clash which can be problematic. Paul Graham notes that some managers damage their employees’ productivity when they fail to recognize the distinction between the types of schedules. Managers who do recognize the distinction will be ahead of the game. As Graham writes:

Each type of schedule works fine by itself. Problems arise when they meet. Since most powerful people operate on the manager’s schedule, they’re in a position to make everyone resonate at their frequency if they want to. But the smarter ones restrain themselves, if they know that some of the people working for them need long chunks of time to work in.

Makers generally avoid meetings and similar time-based commitments that don’t have a direct impact on their immediate work. A 30-minute meeting does not just take up half an hour of an afternoon. It bisects the day, creating serious problems. Let’s say that a computer programmer has a meeting planned at 2 pm. When they start working in the morning, they know they have to stop later and are prevented from achieving full immersion in the current project. As 2 pm rolls around, they have to pause whatever they are doing — even if they are at a crucial stage — and head to the meeting. Once it finishes and they escape back to their real work, they experience attention residue and the switching costs of moving between tasks. It takes them a while — say, 15 to 20 minutes — to reach their prior state of focus. Taking that into account, the meeting has just devoured at least an hour of their time. If it runs over or if people want to chat afterwards, the effect is even greater. And what if they have another meeting planned at 4 pm? That leaves them with perhaps an hour to work, during which they keep an eye on the clock to avoid being late.
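To make that arithmetic concrete, here is a toy calculation; the 20-minute refocus cost is an assumption consistent with the 15-to-20-minute estimate above, and it ignores the pre-meeting anticipation cost, so it understates the damage:

```python
def usable_focus_minutes(day_start, day_end, meetings, refocus_cost=20):
    """Deep-work minutes left in a day, given meetings as
    (start_minute, end_minute) pairs and a refocus cost after each one."""
    total = 0
    cursor = day_start
    for start, end in sorted(meetings):
        gap = start - cursor
        if gap > 0:
            total += gap
        cursor = end + refocus_cost  # attention residue after the meeting
    if day_end > cursor:
        total += day_end - cursor
    return total

# A 9 am - 5 pm day (minutes 540-1020) with half-hour meetings at 2 pm and 4 pm
print(usable_focus_minutes(540, 1020, [(840, 870), (960, 990)]))  # 380
```

One hour of meetings plus two refocus periods consumes 100 minutes of a 480-minute day, and what remains is chopped into three separate blocks rather than one long one.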

Software entrepreneur Ray Ozzie has a specific technique for handling potential interruptions — the four-hour rule. When he’s working on a product, he never starts unless he has at least four uninterrupted hours to focus on it. Fractured blocks of time, he discovered, result in more bugs, which later require fixing.

In Quiet: The Power of Introverts in a World That Can’t Stop Talking, Susan Cain describes an experiment to figure out the characteristics of superior programmers:

…more than six hundred developers from ninety-two different companies participated. Each designed, coded, and tested a program, working in his normal office space during business hours. Each participant was also assigned a partner from the same company. The partners worked separately, however, without any communication, a feature of the games that turned out to be critical.

When the results came in, they revealed an enormous performance gap. The best outperformed the worst by a 10:1 ratio. The top programmers were also about 2.5 times better than the median. When DeMarco and Lister tried to figure out what accounted for this astonishing range, the factors that you’d think would matter—such as years of experience, salary, even the time spent completing the work—had little correlation to outcome. Programmers with ten years’ experience did no better than those with two years. The half who performed above the median earned less than 10 percent more than the half below—even though they were almost twice as good. The programmers who turned in “zero-defect” work took slightly less, not more, time to complete the exercise than those who made mistakes.

It was a mystery with one intriguing clue: programmers from the same companies performed at more or less the same level, even though they hadn’t worked together. That’s because top performers overwhelmingly worked for companies that gave their workers the most privacy, personal space, control over their physical environments, and freedom from interruption. Sixty-two percent of the best performers said that their workspace was acceptably private, compared to only 19 percent of the worst performers; 76 percent of the worst performers but only 38 percent of the top performers said that people often interrupted them needlessly.

A common argument makers hear from people on a different schedule is that they should “just take a break for this!” — “this” being a meeting, call, coffee break, and so on. But a distinction exists between time spent not doing their immediate work and time spent taking a break.

Pausing to drink some water, stretch, or get fresh air is the type of break that recharges makers and helps them focus better when they get back to work. Pausing to hear about a coworker’s marital problems or the company’s predictions for the next quarter has the opposite effect. A break and time spent not working are very different: one fosters focus; the other snaps it.

Remember Arnold Bennett’s words: “You have to live on this 24 hours of time. Out of it you have to spin health, pleasure, money, content, respect and the evolution of your immortal soul. Its right use … is a matter of the highest urgency.”

Complex Adaptive Cities

Complex adaptive systems are hard to understand. Messy and complicated, they cannot be broken down into smaller bits. It would be easier to ignore them, or simply leave them as mysteries. But given that we are living in one such system, it might be more useful to buckle down and sort it out. That way, we can make choices that are aligned with how the world actually operates.

In his book Diversity and Complexity, Scott E. Page explains, “Complexity can be loosely thought of as interesting structures and patterns that are not easily described or predicted. Systems that produce complexity consist of diverse rule-following entities whose behaviors are interdependent. Those entities interact over a contact structure or network. In addition, the entities often adapt.”
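Each ingredient of Page's definition (diverse rule-following entities, interdependent behavior, a contact network, adaptation) can be made concrete in a toy simulation. The model below is my own illustrative sketch, not from the book: agents sit on a ring, copy a neighbor's behavior when both neighbors agree, keep their own otherwise, and occasionally flip at random.

```python
import random

def step(states, adapt_prob=0.05, rng=random):
    """One update of a toy complex adaptive system.

    Agents on a ring follow a rule (interdependence over a contact
    network): copy the neighbors' shared behavior if the two neighbors
    agree, otherwise keep their own. With probability adapt_prob an
    agent flips its behavior anyway (adaptation).
    """
    n = len(states)
    new = []
    for i in range(n):
        left, right = states[i - 1], states[(i + 1) % n]
        behavior = left if left == right else states[i]
        if rng.random() < adapt_prob:
            behavior = 1 - behavior  # random adaptation
        new.append(behavior)
    return new

rng = random.Random(42)  # seeded for repeatability
states = [rng.randint(0, 1) for _ in range(20)]
for _ in range(10):
    states = step(states, rng=rng)
print(states)
```

Even this stripped-down system produces shifting local clusters of agreement that no single agent's rule predicts, which is the point: the interesting structure lives in the interactions, not in any one entity.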

Understanding complexity is important because sometimes things are not further reducible. While the spirit of Occam’s Razor is that things should be made as simple as possible, but no simpler, some things simply cannot be reduced. There is, in fact, an irreducible minimum. Certain things can be properly contemplated only in all their complicated, interconnected glory.

Take, for example, cities.


Those of us who live in cities know what makes a particular neighborhood great. We can get what we need and have the interactions we want, ultimately because we feel safe there.

But how is this achieved? What magic combination of people and locations, uses and destinations, makes a vibrant, safe neighborhood? Is there a formula for, say, the ratio of houses to businesses, or of children to workers?

No. Cities are complex adaptive systems. They cannot be created for success from the top down by the imposition of simple rules.

In her seminal book The Death and Life of Great American Cities, Jane Jacobs approached the city as a complex adaptive system, turned city planning on its head, and likely saved many North American cities by showing that they cannot be reduced to a series of simple behavioral interactions.

Cities fit squarely within the definition of complexity Page gives above. They are full of rule-following humans, cars, and wildlife, whose behaviors are interdependent and responsive to feedback.

These components of a city interact over multiple interfaces in a city network and will adapt easily, changing their behavior based on food availability, road closures, or perceived safety. But the city itself cannot be understood by looking at just one of these behaviors.

Jacobs starts with “the kind of problem which cities pose — a problem in handling organized complexity” — and a series of observations about that common, almost innocuous, part of all cities: the sidewalk.

What makes a particular neighborhood safe?

Jacobs argues that there is no one factor but rather a series of them. In order to understand how a city street can be safe, you must examine the full scope of interactions that occur on its sidewalk. “The trust of a city street is formed over time from many, many little public sidewalk contacts.” Nodding to people you know, noticing people you don’t. Recognizing which parent goes with which kid, or whose business seems to be thriving. People create safety.

Given that most of them are strangers to each other, how do they do this? Why are these strangers not all perceived as threats?

Safe streets are streets that are used by many different types of people throughout the 24-hour day. Children, workers, caregivers, tourists, diners — the more people who use the sidewalk, the more eyes that participate in the safety of the street.

Safety on city streets is “kept primarily by an intricate, almost unconscious, network of voluntary controls and standards among the people themselves, and enforced by the people themselves.” Essentially, we all contribute to safety because we all want safety. It increases our chances of survival.

Jacobs brings an amazing eye for observational detail in describing neighborhoods that work and those that don’t. In describing sidewalks, she explains that successful, safe neighborhoods are orderly. “But there is nothing simple about that order itself, or the bewildering number of components that go into it. Most of those components are specialized in one way or another. They unite in their joint effect upon the sidewalk, which is not specialized in the least. That is its strength.” For example, restaurant patrons, shopkeepers, loitering teenagers, etc. — some of whom belong to the area and some of whom are transient — all use the sidewalk and in doing so contribute to the interconnected and interdependent relationships that produce the perception of safety on that street. And real safety will follow perceived safety.

To get people participating in this unorganized street safety, you have to have streets that are desirable. “You can’t make people use streets they have no reason to use. You can’t make people watch streets they do not want to watch.” But Jacobs points out time and again that there is no predictable prescription for how to achieve this mixed use where people are unconsciously invested in the maintenance of safety.

This is where considering the city as a complex adaptive system is most useful.

Each individual component has a part to play, so a top-down imposition of theory that doesn’t allow for the unpredictable behavior of each individual is doomed to fail. “Orthodox planning is much imbued with puritanical and Utopian conceptions of how people should spend their free time, and in planning, these moralisms on people’s private lives are deeply confused with concepts about the workings of cities.” A large, diverse group of people is not going to conform to only one way of living. And it’s the diversity that offers the protection.

For example, a city planner might decide not to allow bars in residential neighborhoods. The noise might keep people up, or there might be a negative moral impact on children exposed to the behavior of loud, obnoxious drunks. But as Jacobs reveals, safe city areas can’t be built on the basis of this type of simplistic assumption.

By stretching the use of a street through as many hours of the day as possible, you might create a safer neighborhood. I say “might” because in this complex system, other factors might connect to manifest a different reality.

Planning that doesn’t respect the spectrum of diverse behavior and instead aims to insist on an ideal based on a few simple concepts will hinder the natural ability of a system to adapt.

As Scott Page explains, “Creating a complex system from scratch takes skill (or evolution). Therefore, when we see diverse complex systems in the real world, we should not assume that they’ve been assembled from whole cloth. Far more likely, they’ve been constructed bit by bit.”

Urban planning that doesn’t respect the spectrum of diverse behavior and instead aims to insist on an ideal based on a few simple concepts (fresh air, more public space, large private space) will hinder the natural ability of a city system to adapt in a way that suits the residents. And it is this ability to adapt that is the cornerstone requirement of this type of complex system. Inhibit the adaptive property and you all but ensure the collapse of the system.

As Jacobs articulates:

Under the seeming disorder of the old city, wherever the old city is working successfully, is a marvelous order for maintaining the safety of the streets and the freedom of the city. It is a complex order. Its essence is intricacy of sidewalk use, bringing with it a constant succession of eyes. This order is all composed of movement and change, and although it is life, not art, we may fancifully call it the art form of the city and liken it to the dance — … to an intricate ballet in which the individual dancers and ensembles all have distinctive parts which miraculously reinforce each other and compose an orderly whole. The ballet of the good city sidewalk never repeats itself from place to place, and in any one place is always replete with new improvisations.

This is the essence of complexity. As Scott Page argues, “Adaptation occurs at the level of individuals or of types. The system itself doesn’t adapt. The parts do; they alter their behaviors leading to system level adaptation.”

Jacobs maintains that “the sight of people attracts still other people.” We feel more secure when we know there are multiple eyes on us, eyes that are concerned only with the immediate functions that might affect them and are therefore not invasive.

Our complex behavior as individuals in cities, interacting with various components in any given day, is multiplied by everyone, so a city that produces a safe environment seems to be almost miraculous. But ultimately our behavior is governed by certain rules — not rules that are imposed by theory or external forces, but rules that we all feel are critical to our well-being and success in our city.

Thus, the workings of a desirable city are produced by a multitude of small interactions that have evolved and adapted to promote what best supports the desires of individuals.

“The look of things and the way they work are inextricably bound together, and in no place more so than cities,” claims Jacobs. Use is not independent of form. That is why we must understand the system as a whole. No matter how many components and unpredictable potential interactions there are, they are all part of what makes the city function.

As Jacobs concludes, “There is no use wishing it were a simpler problem, because in real life it is not a simpler problem. No matter what you try to do to it, a city park behaves like a problem in organized complexity, and that is what it is. The same is true of all other parts or features of cities. Although the inter-relations of their many factors are complex, there is nothing accidental or irrational about the ways in which these factors affect each other.”

Critical Mass and Tipping Points: How To Identify Inflection Points Before They Happen

Critical mass, sometimes referred to as the tipping point, is one of the most effective mental models you can use to understand the world. The concept can explain everything from viral cat videos to why changing habits is so hard.


The Basics

Sometimes it can seem as if drastic changes happen at random.

One moment a country is stable; the next, a revolution begins and the government is overthrown. One day a new piece of technology is a novelty; the next, everyone has it and we cannot imagine life without it. Or an idea lingers at the fringes of society before it suddenly becomes mainstream.

As erratic and unpredictable as these occurrences are, there is a logic to them, which can be explained by the concept of critical mass. The concept was identified in the 1970s through the work of Thomas Schelling (a game theorist) and Mark Granovetter (a sociologist).

Also known as the boiling point, the percolation threshold, the tipping point, and a host of other names, critical mass is the point at which something (an idea, belief, trend, virus, behavior, etc.) is prevalent enough to grow, or sustain, a process, reaction, or technology.

As a mental model, critical mass can help us to understand the world around us by letting us spot changes before they occur, make sense of tumultuous times, and even gain insight into our own behaviors. A firm understanding can also give us an edge in launching products, changing habits, and choosing investments.

In The Decision Book, Mikael Krogerus wrote of technological critical masses:

Why is it that some ideas – including stupid ones – take hold and become trends, while others bloom briefly before withering and disappearing from the public eye?

… Translated into a graph, this development takes the form of a curve typical of the progress of an epidemic. It rises, gradually at first, then reaches the critical point of any newly launched product, when many products fail. The critical point for any innovation is the transition from the early adopters to the sceptics, for at this point there is a ‘chasm’. …

With technological innovations like the iPod or the iPhone, the cycle described above is very short. Interestingly, the early adopters turn away from the product as soon as the critical masses have accepted it, in search of the next new thing.

In Developmental Evaluation, Michael Quinn Patton wrote:

Complexity theory shows that great changes can emerge from small actions. Change involves a belief in the possible, even the “impossible.” Moreover, social innovators don’t follow a linear pathway of change; there are ups and downs, roller-coaster rides along cascades of dynamic interactions, unexpected and unanticipated divergences, tipping points and critical mass momentum shifts. Indeed, things often get worse before they get better as systems change creates resistance to and pushback against the new.

In If Nobody Speaks of Remarkable Things, Jon McGregor writes a beautiful explanation of how the concept of critical mass applies to weather:

He wonders how so much water can resist the pull of so much gravity for the time it takes such pregnant clouds to form, he wonders about the moment the rain begins, the turn from forming to falling, that slight silent pause in the physics of the sky as the critical mass is reached, the hesitation before the first swollen drop hurtles fatly and effortlessly to the ground.

Critical Mass in Physics

In nuclear physics, critical mass is defined as the minimum amount of a fissile material required to create a self-sustaining fission reaction. In simpler terms, it’s the amount of reactant necessary for something to happen and to keep happening.

This concept is similar to the mental model of activation energy. The exact critical mass depends on the nuclear properties of a material, its density, its shape, and other factors.

In some nuclear reactors and weapons, a neutron reflector made of a material such as beryllium is used to reduce the amount of fissile material needed to reach criticality. If the amount of fissile material is too small to sustain the reaction, it is referred to as a subcritical mass. If there is enough material for the rate of reaction to increase, it is referred to as a supercritical mass. This concept has been taken from physics and applied in many other disciplines.
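The sub/critical/supercritical distinction can be made concrete with a minimal sketch (not a physical model; the multiplication factor k, starting count, and generation count are illustrative assumptions) of how the expected neutron population evolves from one fission generation to the next:

```python
def neutron_population(k, start=1000, generations=12):
    """Expected free-neutron count per fission generation, where k is
    the average number of neutrons from each fission that go on to
    cause another fission (the effective multiplication factor)."""
    history = [start]
    for _ in range(generations):
        history.append(history[-1] * k)
    return history

sub = neutron_population(0.9)    # subcritical: the chain reaction dies out
crit = neutron_population(1.0)   # critical: exactly self-sustaining
sup = neutron_population(1.1)    # supercritical: the reaction grows each generation
```

Everything hinges on whether k sits below, at, or above 1; that threshold is the quantitative version of being below or above critical mass.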

Critical Mass in Sociology

In sociology, critical mass is the term for the number of people required to produce a drastic change in behavior, opinions, or actions.

“When enough people (a critical mass) think about and truly consider the plausibility of a concept, it becomes reality.”

—Joseph Duda

In some societies (e.g., a small Amazonian tribe), just a handful of people can change prevailing views. In larger societies (in particular, those which have a great deal of control over people, such as North Korea), the figure must usually be higher for a change to occur.

The concept of a sociological critical mass was first used in the 1960s by Morton Grodzins, a political science professor at the University of Chicago. Grodzins studied racial segregation — in particular, examining why people seemed to separate themselves by race even when that separation was not enforced by law. His hypothesis was that white families had different levels of tolerance for the number of people of racial minorities in their neighborhoods. Some white families were completely racist; others were not concerned with the race of their neighbors. As increasing numbers of racial minorities moved into neighborhoods, the most racist people would soon leave. Then a tipping point would occur — a critical mass of white people would leave until the area was populated by racial minorities. This phenomenon became known as “white flight.”

Critical Mass in Business

In business, at a macro level, critical mass can be defined as the time when a company becomes self-sustaining and is economically viable. (Please note that there is a difference between being economically viable and being profitable.) Just as a nuclear reaction reaches critical mass when it can sustain itself, so must a business. It is important, too, that a business chooses its methods for growth with care: sometimes adding more staff, locations, equipment, stock, or other assets can be the right choice; at other times, these additions can lead to negative cash flow.

The exact threshold and time to reach critical mass varies widely, depending on the industry, competition, startup costs, products, and other economic factors.

Bob Brinker, host of Money Talk, defines critical mass in business as:

A state of freedom from worry and anxiety about money due to the accumulation of assets which make it possible to live your life as you choose without working if you prefer not to work or just working because you enjoy your work but don’t need the income. Plainly stated, the Land of Critical Mass is a place in which individuals enjoy their own personal financial nirvana. Differentiation between earned income and assets is a fundamental lesson to learn when thinking in terms of critical mass. Earned income does not produce critical mass … critical mass is strictly a function of assets.

Independence or “F*** You” Money

Most people work jobs and get paychecks. If you depend on a paycheck, like most of us, this means you are not independent — you are not self-sustaining. Once you have enough money, you can be self-sustaining.

If you were wealthy enough to be free, would you really keep the job you have now? How many of us check our opinions or thoughts before voicing them because we know they won’t be acceptable? How many times have you agreed to work on a project that you know is doomed, because you need the paycheck?

“Whose bread I eat: his song I sing.”

—Proverb

In his book The Black Swan, Nassim Taleb describes “f*** you” money, which, “in spite of its coarseness, means that it allows you to act like a Victorian gentleman, free from slavery”:

It is a psychological buffer: the capital is not so large as to make you spoiled-rich, but large enough to give you the freedom to choose a new occupation without excessive consideration of the financial rewards. It shields you from prostituting your mind and frees you from outside authority — any outside authority. … Note that the designation f*** you corresponds to the exhilarating ability to pronounce that compact phrase before hanging up the phone.

Critical Mass in Psychology

Psychologists have known for a long time that groups of people behave differently than individuals.

Sometimes when we are in a group, we tend to be less inhibited, more rebellious, and more confident. This effect is known as mob behavior. (An interesting detail is that mob psychology is one of the few branches of psychology which does not concern individuals.) As a general rule, the larger the crowd, the less responsibility people have for their behavior. (This is also why individuals and not groups should make decisions.)

“[Groups of people] can be considered to possess agential capabilities: to think, judge, decide, act, reform; to conceptualize self and others as well as self’s actions and interactions; and to reflect.”

—Burns and Engdahl

Gustave Le Bon was one psychologist who looked at the formation of the critical masses of people necessary to spark change. According to Le Bon, this formation creates a collective unconsciousness, making people “a grain of sand amid other grains of sand which the wind stirs up at will.”

He identified three key processes which create a critical mass of people: anonymity, contagion, and suggestibility. When all three are present, a group loses its sense of self-restraint and behaves in a manner he considered more primitive than usual. The strongest members (often those who first convinced others to adopt their ideas) hold power over the rest.

Examples of Critical Mass

Virality

Viral media include forms of content (such as text, images, and videos) which are passed amongst people and often modified along the way. We are all familiar with how memes, videos and jokes spread on social media. The term “virality” comes from the similarity to how viruses propagate.

“We are all susceptible to the pull of viral ideas. Like mass hysteria. Or a tune that gets into your head that you keep on humming all day until you spread it to someone else. Jokes. Urban legends. Crackpot religions. No matter how smart we get, there is always this deep irrational part that makes us potential hosts for self-replicating information.”

—Neal Stephenson, Snow Crash

In The Selfish Gene, Richard Dawkins compared memes to human genes. While the term “meme” is now, for the most part, used to describe content that is shared on social media, Dawkins described religion and other cultural objects as memes.

The difference between viral and mainstream media is that the former is more interactive and is shaped by the people who consume it. Gatekeeping and censorship are also less prevalent. Viral content often reflects dominant values and interests, such as kindness (for example, the dancing-man video) and humor. The importance of this form of media is apparent when it is used to negatively impact corporations or powerful individuals (such as the recent United Airlines and Pepsi fiascoes).

Once a critical mass of people share and comment on a piece of content online, it reaches viral status. Its popularity then grows exponentially before it fades away a short time later.

Technology

The concept of critical mass is crucial when it comes to the adoption of new technology. Every piece of technology which is now (or once was) a ubiquitous part of our lives was once new and novel.

Most forms of technology become more useful as more people adopt them. There is no point in having a telephone if it cannot be used to call other people. There is no point in having an email account if it cannot be used to email other people.

The value of networked technology increases with the size of the network itself. Eventually, the number of users reaches critical mass, and not owning that particular technology becomes a hindrance. Useful technology also prompts its first adopters to persuade those around them to try it too. As a general rule, the more a new technology depends upon a network of users, the faster it will reach critical mass. This creates a positive feedback loop.
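That feedback loop can be sketched roughly. The function names, the population numbers, and the rule that a network's appeal grows linearly with its user count are all illustrative assumptions, not a model of any real product:

```python
def pairwise_links(n_users):
    """Distinct user-to-user connections possible in a network of n_users."""
    return n_users * (n_users - 1) // 2

def adoption_curve(pop=1000, seed_users=10, appeal=0.0008, steps=40):
    """Toy adoption model: each step, the chance that a non-user joins is
    proportional to the current number of users, because the network is
    more useful the bigger it already is."""
    users = seed_users
    history = [users]
    for _ in range(steps):
        rate = min(1.0, appeal * users)          # adoption pressure from the installed base
        users += round((pop - users) * rate)     # expected new joiners this step
        history.append(users)
    return history
```

Doubling the user base roughly quadruples the possible connections (pairwise_links(10) is 45, pairwise_links(20) is 190), and the adoption curve traces the familiar S-shape: slow while the network is small, explosive once it passes critical mass, flat once nearly everyone has joined.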

In Zero to One, Peter Thiel describes how PayPal achieved the critical mass of users needed for it to be useful:

For PayPal to work, we needed to attract a critical mass of at least a million users. Advertising was too ineffective to justify the cost. Prospective deals with big banks kept falling through. So we decided to pay people to sign up.

We gave new customers $10 for joining, and we gave them $10 more every time they referred a friend. This got us hundreds of thousands of new customers and an exponential growth rate.
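The referral mechanics Thiel describes can be sketched with hypothetical numbers (the starting user count, referral rate, and number of periods below are illustrative assumptions, not PayPal's actual figures):

```python
def referral_growth(initial_users, referrals_per_user, periods):
    """User count over time when each existing user brings in
    `referrals_per_user` new users every period: compound growth."""
    users = [initial_users]
    for _ in range(periods):
        users.append(round(users[-1] * (1 + referrals_per_user)))
    return users

growth = referral_growth(initial_users=1000, referrals_per_user=0.7, periods=10)

# Each new sign-up cost roughly $20 in bounties: $10 to the new
# customer and $10 to whoever referred them.
total_spend = (growth[-1] - growth[0]) * 20
```

Because each cohort of users recruits the next, growth is exponential: a 0.7 referral rate multiplies the user base roughly 200-fold over ten periods, which is why a flat per-user bounty can beat advertising whose cost scales only linearly with reach.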

Another illustration of the importance of critical mass for technology (and the unique benefits of crowdfunding) comes from Chris LoPresti:

A friend of mine raised a lot of money to launch a mobile app; however, his app was trounced by one from another company that had raised a tenth of what he had, but had done so through 1,000 angels on Kickstarter. Those thousand angels became the customers and evangelists that provided the all-important critical mass early on. Any future project I do, I’ll do through Kickstarter, even if I don’t need the money.

Urban Legends

Urban legends are an omnipresent part of society, a modern evolution of traditional folklore. They tend to involve references to deep human fears and popular culture. Whereas traditional folklore was often full of fantastical elements, modern urban legends are usually a twist on reality. They are intended to be believed and passed on. Sociologists refer to them as “contemporary legends.” Some can survive for decades, being modified as time goes by and spreading to different areas and groups. Researchers who study urban legends have noted that many do have vague roots in actual events, and are just more sensationalized than the reality.

One classic urban legend is “The Hook.” This story has two key elements: a young couple parked in a secluded area and a killer with a hook for a hand. The radio in their car announces a serial killer on the loose, often escaped from a nearby institution, with a hook for a hand. In most versions, the couple panics and drives off, only to later find a hook hanging from the car door handle. In others, the man leaves the car while the woman listens to the radio bulletin. She keeps hearing a thumping sound on the roof of the car. When she exits to investigate, the killer is sitting on the roof, holding the man’s severed head. The origins of this story are unknown, although it first emerged in the 1950s in America. By 1960, it began to appear in publications.

Urban legends are an example of how a critical mass of people must be reached before an idea can spread. While the exact origins are rarely clear, an urban legend presumably begins with a single person who misunderstands a news story, or invents one, and passes it on to others, perhaps at a party.

Many urban legends have a cautionary element, so they may first be told in an attempt to protect someone. “The Hook” has been interpreted as a warning to teenagers engaging in promiscuous behavior. When this story is looked at by Freudian folklorists, the implications seem obvious. It could even have been told by parents to their children.

This cautionary element is clear in one of the first printed versions of “The Hook” in 1960:

If you are interested in teenagers, you will print this story. I do not know whether it’s true or not, but it does not matter because it served its purpose for me… I do not think I will ever park to make out as long as I live. I hope this does the same for other kids.

Once a critical mass of people knows an urban legend, the rate at which it spreads grows exponentially. The internet now enables urban legends (and everything else) to pass between people faster. Although a legend might also be disproved faster, that’s a complicated mess. For now, as Lefty says in Donnie Brasco, “Forget about it.”

The more people who believe a story, the more believable it seems. This effect is exacerbated when media outlets or local police fall for the legends and issue warnings. Urban legends often then appear in popular culture (for example, “The Hook” inspired a Supernatural episode) and become part of our modern culture. The majority of people stop believing them, yet the stories linger in different forms.

Changes in Governments and Revolutions

From a distance, it can seem shocking when the people of a country revolt and overthrow dominant powers in a short time.

What is it that makes this sudden change happen? The answer is the formation of a critical mass of people necessary to move marginal ideas to a majority consensus. Pyotr Kropotkin wrote:

Finally, our studies of the preparatory stages of all revolutions bring us to the conclusion that not a single revolution has originated in parliaments or in any other representative assembly. All began with the people. And no revolution has appeared in full armor — born, like Minerva out of the head of Jupiter, in a day. They all had their periods of incubation during which the masses were very slowly becoming imbued with the revolutionary spirit, grew bolder, commenced to hope, and step by step emerged from their former indifference and resignation. And the awakening of the revolutionary spirit always took place in such a manner that at first, single individuals, deeply moved by the existing state of things, protested against it, one by one. Many perished, “uselessly,” the armchair critic would say. But the indifference of society was shaken by these progenitors. The dullest and most narrow-minded people were compelled to reflect, “Why should men, young, sincere, and full of strength, sacrifice their lives in this way?” It was impossible to remain indifferent; it was necessary to take a stand, for, or against: thought was awakening. Then, little by little, small groups came to be imbued with the same spirit of revolt; they also rebelled — sometimes in the hope of local success — in strikes or in small revolts against some official whom they disliked, or in order to get food for their hungry children, but frequently also without any hope of success: simply because the conditions grew unbearable. Not one, or two, or tens, but hundreds of similar revolts have preceded and must precede every revolution.

When an oppressive regime is in power, a change is inevitable. However, it is almost impossible to predict when that change will occur. Often, a large number of people want change and yet fear the consequences or lack the information necessary to join forces. When single individuals act upon their feelings, they are likely to be punished without having any real impact. Only when a critical mass of people’s desire for change overwhelms their fear can a revolution occur. Other people are encouraged by the first group, and the idea spreads rapidly.

One example occurred in China in 1989. While the desire for change was almost universal, the consequences felt too dire. When a handful of students protested for reform in Beijing, authorities did not punish them. We have all seen the classic image of a lone student, shopping bags in hand, standing in front of a procession of tanks and halting them. Those few students who protested were the critical mass. Demonstrations erupted in more than 300 towns all over the country as people found the confidence to act.

Malcolm Gladwell on Tipping Points

An influential text on the topic of critical mass is Malcolm Gladwell’s The Tipping Point. Published in 2000, the book describes a tipping point as “the moment of critical mass, the threshold, the boiling point.” He notes that “Ideas and products and messages and behaviors spread just like viruses do” and cites such examples as the sudden popularity of Hush Puppies and the steep drop in crime in New York after 1990. Gladwell writes that although the world “may seem like an immovable, implacable place,” it isn’t. “With the slightest push — in just the right place — it can be tipped.”

Referring to the 80/20 rule (also known as Pareto’s principle), Gladwell explains how it takes a tiny number of people to kickstart the tipping point in any sort of epidemic:

Economists often talk about the 80/20 Principle, which is the idea that in any situation roughly 80 percent of the “work” will be done by 20 percent of the participants. In most societies, 20 percent of criminals commit 80 percent of crimes. Twenty percent of motorists cause 80 percent of all accidents. Twenty percent of beer drinkers drink 80 percent of all beer. When it comes to epidemics, though, this disproportionality becomes even more extreme: a tiny percentage of people do the majority of the work.

Rising crime rates are also the result of a critical mass of people who see unlawful behavior as justified, acceptable, or necessary. It takes only a small number of people who commit crimes for a place to seem dangerous and chaotic. Gladwell explains how minor transgressions lead to more serious problems:

[T]he Broken Windows theory … was the brainchild of the criminologists James Q. Wilson and George Kelling. Wilson and Kelling argued that crime is the inevitable result of disorder. If a window is broken and left unrepaired, people walking by will conclude that no one cares and no one is in charge. Soon, more windows will be broken, and the sense of anarchy will spread from the building to the street it faces, sending a signal that anything goes. In a city, relatively minor problems like graffiti, public disorder, and aggressive panhandling, they write, are all the equivalent of broken windows, invitations to more serious crimes…

According to Gladwell’s research, there are three main factors in the creation of a critical mass of people necessary to induce a sudden change.

The first of these is the Law of the Few. Gladwell states that certain categories of people are instrumental in the creation of tipping points. These categories are:

  • Connectors: We all know connectors. These are highly gregarious, sociable people with large groups of friends. Connectors are those who introduce us to other people, instigate gatherings, and are the fulcrums of social groups. Gladwell defines connectors as those with networks of over one hundred people. An example of a cinematic connector is Kevin Bacon. There is a trivia game known as “Six Degrees of Kevin Bacon,” in which players aim to connect any actor/actress to him within a chain of six films. Gladwell writes that connectors have “some combination of curiosity, self-confidence, sociability, and energy.”
  • Mavens: Again, we all know a maven. This is the person we call to ask what brand of speakers we should buy, or which Chinese restaurant in New York is the best, or how to cope after a rough breakup. Gladwell defines mavens as “people we rely upon to connect us with new information.” These people help create a critical mass due to their habit of sharing information, passing knowledge on through word of mouth.
  • Salesmen: Whom would you call for advice about negotiating a raise, a house price, or an insurance payout? That person who just came to mind is probably what Gladwell calls a salesman. These are charismatic, slightly manipulative people who can persuade others to accept what they say.

The second factor cited by Gladwell is the “stickiness factor.” This is what makes a change significant and memorable. Heroin is sticky because it is physiologically addictive. Twitter is sticky because we want to keep returning to see what is being said about and to us. Game of Thrones is sticky because viewers are drawn in by the narrative and want to know what happens next. Once something reaches a critical mass, its stickiness can be thought of as its rate of decline: the stickier it is, the more slowly it fades. Cat videos aren’t very sticky, so even the viral ones thankfully fade into the night quickly.

Finally, the third factor is the specific context: the circumstances, time, and place must be right for an epidemic to occur. Understanding how a tipping point works helps to clarify the concept of critical mass.

The 10% Rule

One big question is: what percentage of a population is necessary to create a critical mass?

According to researchers at Rensselaer Polytechnic Institute, the answer is a mere 10%. They used computational analysis to establish where the shift from minority to majority opinion lies. According to Boleslaw Szymanski, the director of the research:

When the number of committed opinion holders is below 10 percent, there is no visible progress in the spread of ideas. It would literally take the amount of time comparable to the age of the universe for this size group to reach the majority. Once that number grows above 10 percent, the idea spreads like flame.

The research showed that the 10% can be made up of anyone in a given society. What matters is that these people hold their beliefs firmly and do not respond to pressure to change them; instead, they pass their ideas on to others. (I’d argue that the percentage is lower. Much lower. See Dictatorship of the Minority.)

As an example, Szymanski cites the sudden revolutions in countries such as Egypt and Tunisia: “In those countries, dictators who were in power for decades were suddenly overthrown in just a few weeks.”

According to another researcher:

In general, people do not like to have an unpopular opinion and are always seeking to try locally to come to a consensus … As agents of change start to convince more and more people, the situation begins to change. People begin to question their own views at first and then completely adopt the new view to spread it even further. If the true believers just influenced their neighbors, that wouldn’t change anything within the larger system, as we saw with percentages less than 10.
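The dynamic these researchers describe can be played with directly. Below is a toy sketch of the kind of committed-minority model the Rensselaer group studied (a binary "naming game"): most agents start out holding opinion B, a committed fraction holds opinion A and never budges, and agents update when random pairs interact. All parameter values here are illustrative, not taken from the original study.

```python
import random

def simulate(n=300, committed_frac=0.15, steps=300_000, seed=1):
    """Binary naming-game sketch with a committed minority.

    Each agent holds a set of opinions: {'A'}, {'B'}, or {'A', 'B'}.
    Committed agents hold {'A'} and never change. On each step a random
    speaker utters one of its opinions to a random listener; if the
    listener already holds that opinion, both collapse to it (agreement),
    otherwise the listener adds it to its set.
    Returns the final fraction of agents holding exactly {'A'}.
    """
    random.seed(seed)
    n_committed = int(n * committed_frac)
    committed = [i < n_committed for i in range(n)]
    opinions = [{'A'} if committed[i] else {'B'} for i in range(n)]
    for _ in range(steps):
        speaker, listener = random.sample(range(n), 2)
        word = random.choice(sorted(opinions[speaker]))
        if word in opinions[listener]:
            # agreement: both parties settle on the spoken opinion
            if not committed[speaker]:
                opinions[speaker] = {word}
            if not committed[listener]:
                opinions[listener] = {word}
        elif not committed[listener]:
            # listener hadn't heard this opinion: it now entertains both
            opinions[listener].add(word)
    return sum(1 for o in opinions if o == {'A'}) / n

# Above the ~10% threshold the committed view takes over the population;
# below it, the majority barely moves.
print(simulate(committed_frac=0.15))
print(simulate(committed_frac=0.05))
```

With a committed fraction comfortably above the threshold the population converges on the minority opinion; well below it, the true believers convert almost no one, mirroring Szymanski’s "no visible progress" regime.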

The potential use of this knowledge is tremendous. Now that we know how many people are necessary to form a critical mass, this information can be manipulated — for good or evil. The choice is yours.