Category: Mental Models

Resonance: How to Open Doors For Other People

It’s only polite.

Hold the door open for others, and they will open doors for you.

We are far more interdependent than we would like to admit. We biologically need to connect. “Limbic resonance” is a term used by Thomas Lewis, Fari Amini, and Richard Lannon in their book, A General Theory of Love, to express the ability to share deep emotional states. The limbic lobe of the brain is what makes a mammalian brain what it is. Without it, a mammal would be reduced to a reptilian brain with the connective capacity of a snake or lizard. This is why reptiles often strike us as scary—unreachable and heartless.

Resonance is not only a mammalian capacity but an outright necessity. Our infants will die if deprived of the warmth of connection with another being, even when all their physiological needs are met. This was illustrated in inhumane 13th-century human ‘experiments’ by Frederick II, Holy Roman Emperor, who deprived babies of human connection, and more recently by Harry Harlow in rhesus monkeys. Baby monkeys chose to spend 17 hours a day with a soft cloth mother figure that provided no food, compared to only one hour a day with a wire mother figure that actually provided milk. Connection is a far superior sustenance.

Via Life

An oft-quoted study by psychologist John Gottman suggests that a partner’s ability to answer “emotional bids” strongly predicts divorce. The divorce rate is higher in couples where partners do not resonate, or fail to engage with and respond to requests for attention. Couples who had divorced by a six-year follow-up had turned towards the other on only 30% of occasions when a bid was made, whilst couples who were still together averaged closer to 90%. Furthermore, in A General Theory of Love, the authors convincingly argue that what we are actually doing is synchronising ourselves with one another, with deep impacts on our emotional and physical health.

This would be in keeping with the results of the well-known Harvard Study of Adult Development, which followed a large cohort of people over a lifetime. These types of studies are rare because they’re expensive and hard to carry out. This study was well worth investing in, with one clear overall conclusion: good relationships keep us happier and healthier. Its director, psychiatrist Robert Waldinger, states:

Well, the lessons aren’t about wealth or fame or working harder and harder. The clearest message that we get from this 75-year study is this: Good relationships keep us happier and healthier. Period.

We’ve learned three big lessons about relationships. The first is that social connections are really good for us, and that loneliness kills. It turns out that people who are more socially connected to family, to friends, to community, are happier, they’re physically healthier, and they live longer than people who are less well connected. And the experience of loneliness turns out to be toxic. People who are more isolated than they want to be from others find that they are less happy, their health declines earlier in midlife, their brain functioning declines sooner and they live shorter lives than people who are not lonely.

So what now? Where does that leave us?

People feel connected when they are understood and appreciated. My friend’s aunt taught her this when they walked together down a busy road. Her aunt stopped to talk to a homeless man. With no money to give him, she started asking questions about his dog, chatting to him about her own dog. The interaction took 30 seconds. The man’s eyes shone back bright, engaged. As they walked away, my friend’s aunt whispered, “People want to be recognized. It reminds them they exist. Never take that away from anyone.” Lesson learned.

Listen, Summarize, Show

I work hard to live that lesson through the following: listen, summarize, show. True, sustained listening is one of the hardest skills to master. I’ve met only a handful of people with the ability. A simple way to focus your attention is to listen with the intention of summarizing the other person’s point of view. This stops you from spending your mental energy on working out your reply, helps store the other person’s words in your memory, and identifies any gaps in your understanding so you can ask clarifying questions.

The nature of these questions will in itself show the other person that they are heard and that an effort is being made to take them seriously. When it comes to human relationships, just as it is not enough to know, it is not enough to understand. What is crucial is to show you understand. If empathy is recognizing another’s perspective, that consideration needs to be externalized for it to exist and build rapport.

Summarizing and asking questions is a way of feeding back your resonance. Cutting short the conversation, stating opinions, value judgements, your own solutions, or even a lazy “I see” or “interesting” does not demonstrate resonance. In fact, you can use “I understand” as a red flag for someone who does not understand. Often, this is followed by an action that shows a thorough lack of comprehension.

Connect Where It Matters

To resonate with others, we need to connect when it matters. This nurtures both us and others, and also earns trust. Just as in cooking, timing is everything.

This is where the metaphorical doors come in. How do you feel when someone holds the door open for you—especially when you’ve got your hands full? When would you hold open a door for another person?

We may kindly open a door, only to find the person has no intention of walking through it and continues down the stairwell because they’re heading to the floor below. In this case, we did not understand their needs. We may even find ourselves bending over backwards for another, to no effect. This is the equivalent of opening doors willy-nilly down a long corridor without anyone walking through them.

At worst, we might inadvertently (or dare I say, even intentionally) slam a door in someone’s face. That will hurt—even more so if we had offered to hold it for them and they were counting on it to be open. Holding a door open at the right time represents tending to a perceived need and meeting expectations.

All people want to be understood and appreciated. When you connect in this way, they trust that you understand them and that you are actually looking out for their interests. You are attentive and willing to open doors for them. The power of resonance will keep you happy and healthy and open doors for you.

Battling Entropy: Making Order of the Chaos in Our Lives

“Nor public flame, nor private, dares to shine;
Nor human spark is left, nor glimpse divine!
Lo! thy dread empire, Chaos! is restored;
Light dies before thy uncreating word:
Thy hand, great Anarch! lets the curtain fall;
And universal darkness buries all.”
― Alexander Pope, The Dunciad

***

The second law of thermodynamics states that “as one goes forward in time, the net entropy (degree of disorder) of any isolated or closed system will always increase (or at least stay the same).”[1] That is a long way of saying that all things tend towards disorder. This is one of the basic laws of the universe and is something we can observe in our lives. Entropy is simply a measure of disorder. You can think of it as nature’s tax[2].

Uncontrolled disorder increases over time. Energy disperses, and systems dissolve into chaos. The more disordered something is, the more entropic we consider it. In short, we can define entropy as a measure of the disorder of the universe, on both a macroscopic and a microscopic level. The Greek root of the word translates to “a turning towards transformation” — with that transformation being chaos.

As you read this article, entropy is all around you. Cells within your body are dying and degrading, an employee or coworker is making a mistake, the floor is getting dusty, and the heat from your coffee is spreading out. Zoom out a little, and businesses are failing, crimes and revolutions are occurring, and relationships are ending. Zoom out a lot further and we see the entire universe marching towards a collapse.

Let’s take a look at what entropy is, why it occurs, and whether or not we can prevent it.

The Discovery of Entropy

The identification of entropy is attributed to Rudolf Clausius (1822–1888), a German mathematician and physicist. I say attributed because it was a young French engineer, Sadi Carnot (1796–1832), who first hit on the idea of thermodynamic efficiency; however, the idea was so foreign to people at the time that it had little impact. Clausius was oblivious to Carnot’s work, but hit on the same ideas.

Clausius studied the conversion of heat into work. He recognized that heat from a body at a high temperature would flow to one at a lower temperature. This is how your coffee cools down the longer it’s left out — the heat from the coffee flows into the room. This happens naturally. But if you want to heat cold water to make the coffee, you need to do work — you need a power source to heat the water.

From this idea comes Clausius’s statement of the second law of thermodynamics: “heat does not pass from a body at low temperature to one at high temperature without an accompanying change elsewhere.”

Clausius also observed that heat-powered devices worked in an unexpected manner: Only a percentage of the energy was converted into actual work. Nature was exerting a tax. Perplexed, scientists asked, where did the rest of the heat go and why?

Clausius solved the riddle by observing a steam engine and calculating that energy spread out and left the system. In The Mechanical Theory of Heat, Clausius explains his findings:

… the quantities of heat which must be imparted to, or withdrawn from a changeable body are not the same, when these changes occur in a non-reversible manner, as they are when the same changes occur reversibly. In the second place, with each non-reversible change is associated an uncompensated transformation…

… I propose to call the magnitude S the entropy of the body… I have intentionally formed the word entropy so as to be as similar as possible to the word energy….

The second fundamental theorem [the second law of thermodynamics], in the form which I have given to it, asserts that all transformations occurring in nature may take place in a certain direction, which I have assumed as positive, by themselves, that is, without compensation… [T]he entire condition of the universe must always continue to change in that first direction, and the universe must consequently approach incessantly a limiting condition.

… For every body two magnitudes have thereby presented themselves—the transformation value of its thermal content [the amount of inputted energy that is converted to “work”], and its disgregation [separation or disintegration]; the sum of which constitutes its entropy.

Clausius summarized the concept of entropy in simple terms: “The energy of the universe is constant. The entropy of the universe tends to a maximum.”

“The increase of disorder or entropy is what distinguishes the past from the future, giving a direction to time.”

— Stephen Hawking, A Brief History of Time

Entropy and Time

Entropy is one of the few concepts that provide evidence for the existence of time. The “Arrow of Time” is a name given to the idea that time is asymmetrical and flows in only one direction: forward. It points in the direction of the non-reversible process by which entropy increases.

Astronomer Arthur Eddington pioneered the concept of the Arrow of Time in 1927, writing:

Let us draw an arrow arbitrarily. If as we follow the arrow[,] we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases[,] the arrow points towards the past. That is the only distinction known to physics.

In a segment of Wonders of the Universe, produced for BBC Two, physicist Brian Cox explains:

The Arrow of Time dictates that as each moment passes, things change, and once these changes have happened, they are never undone. Permanent change is a fundamental part of what it means to be human. We all age as the years pass by — people are born, they live, and they die. I suppose it’s part of the joy and tragedy of our lives, but out there in the universe, those grand and epic cycles appear eternal and unchanging. But that’s an illusion. See, in the life of the universe, just as in our lives, everything is irreversibly changing.

In his play Arcadia, Tom Stoppard uses a novel metaphor for the non-reversible nature of entropy:

When you stir your rice pudding, Septimus, the spoonful of jam spreads itself round making red trails like the picture of a meteor in my astronomical atlas. But if you stir backwards, the jam will not come together again. Indeed, the pudding does not notice and continues to turn pink just as before. Do you think this is odd?

(If you want to dig deeper on time, I recommend the excellent book by John Gribbin, The Time Illusion.)

“As a student of business administration, I know that there is a law of evolution for organizations as stringent and inevitable as anything in life. The longer one exists, the more it grinds out restrictions that slow its own functions. It reaches entropy in a state of total narcissism. Only the people sufficiently far out in the field get anything done, and every time they do they are breaking half a dozen rules in the process.”

— Roger Zelazny, Doorways in the Sand

Entropy in Business and Economics

Most businesses fail—as many as 80% in the first 18 months alone. One way to understand this is with an analogy to entropy.

Entropy is fundamentally a probabilistic idea: For every possible “usefully ordered” state of molecules, there are many, many more possible “disordered” states. Just as energy tends towards a less useful, more disordered state, so do businesses and organizations in general. Rearranging the molecules — or business systems and people — into an “ordered” state requires an injection of outside energy.
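The probabilistic claim above can be made concrete with a toy calculation (an illustrative sketch; the model and numbers are my own, not from the article). Treat each of n molecules as sitting in either the left or right half of a box: there is exactly one arrangement with all molecules on one side (“ordered”), while an even split (“disordered”) can be realized in C(n, n/2) ways, a number that explodes as n grows:

```python
from math import comb

# Toy model (illustrative assumption): n molecules, each in the
# left or right half of a box. "All on the left" is a single
# arrangement; an even split can happen in comb(n, n // 2) ways.
for n in (10, 50, 100):
    ordered = 1                   # the lone "all left" arrangement
    disordered = comb(n, n // 2)  # arrangements with an even split
    print(f"n={n:3d}: {disordered:,} disordered arrangements per ordered one")
```

Even at n = 100, disorder outnumbers order by a factor of roughly 10^29, and real systems contain on the order of 10^23 molecules. That is why an unmanaged system almost never wanders into an ordered state by chance, and why keeping one ordered takes continual work.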

Let’s imagine that we start a company by sticking 20 people in an office with an ill-defined but ambitious goal and no further leadership. We tell them we’ll pay them as long as they’re there, working. We come back two months later to find that five of them have quit, five are sleeping with each other, and the other ten have no idea how to solve the litany of problems that have arisen. The employees are certainly not much closer to the goal laid out for them. The whole enterprise just sort of falls apart.

It reminds one distinctly of entropy: For every useful arrangement of affairs towards a common business goal, there are many orders of magnitude more arrangements that will get us nowhere. For progress to be made, everything needs to be arranged and managed in a certain way; we have to input a lot of energy to keep things in an ordered state.

Of course, it’s not a perfect analogy: We have to consider the phenomenon of self-organization that happens in many systems, up to and including human organizations. Given a strong enough goal, a good enough team, and the right incentives, perhaps that group wouldn’t need a lot of “outside ordering” — they would manage themselves.

“The … ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order.”

— Steven Pinker

In practice, both models seem to be useful at different times. Any startup entrepreneur who has stayed long enough to see a company thrive in unexpected ways knows this. The amount of diligent management needed will vary. In physics, entropy is a law; in social systems, it’s a mere tendency — though a strong one, to be sure.

Entropy occurs in every aspect of a business. Employees may forget training, lose enthusiasm, cut corners, and ignore rules. Equipment may break down, become inefficient, or be subject to improper use. Products may become outdated or be in less demand. Even the best of intentions cannot prevent an entropic slide towards chaos.

Successful businesses invest time and money to minimize entropy. For example, they provide regular staff training, good reporting of any issues, inspections, detailed files, and monitoring reports of successes and failures. Anything less will mean almost inevitable problems and loss of potential revenue. Without the necessary effort, a business will reach the point of maximum entropy: bankruptcy.

Fortunately, unlike thermodynamic systems, a business can reverse the impact of entropy. A balance must be struck between creativity and control, though. Too little autonomy for employees results in disinterest, while too much leads to poor decisions.

Entropy in Sociology

Without constant maintenance from individuals and dominant institutions, societies tend towards chaos. Deviant behavior escalates when left unchecked — an idea captured by the “broken windows” theory.

Sociologist Kenneth Bailey writes:

When I began studying the notion of entropy it became clear to me that thermodynamic entropy was merely one instance of a concept with much broader applications … I became convinced that entropy applied to social phenomena as well.

One example of what happens when entropy increases unchecked occurred in the Kowloon Walled City. For a substantial length of time, Kowloon was abandoned by the government after the British took control of Hong Kong. At one point, an estimated 33,000 residents were crammed into 300 buildings over 6.4 acres, making Kowloon the most densely populated place on earth. With no space for new construction, stories were added to the existing buildings. Because of minimal water supplies and a lack of ventilation (no sunlight or fresh air reached lower levels), the health of residents suffered. A community of unlicensed medical professionals flourished, alongside brothels and gambling dens.

With no one controlling the city, organized crime gangs took over. It became a haven for lawlessness. Though police were too scared to make any attempts to restore order, residents did make desperate attempts to reduce the entropy themselves. Groups formed to improve the quality of life, creating charities, places for religious practices, nurseries, and businesses to provide income.

In 1987, the Hong Kong government acknowledged the state of Kowloon. The government demolished and rebuilt the city, evicting residents and destroying all but a couple of historic buildings. Although reasonable compensation was provided for ex-residents, many were somewhat unhappy with the rebuilding project.

Looking at pictures and hearing stories from Kowloon, we have to wonder if all cities would be that way without consistent control. Was Kowloon an isolated instance of a few bad apples giving an otherwise peaceful place a terrible reputation? Or is chaos our natural state?

Needless to say, Kowloon was not an isolated incident. We saw chaos and brutality unleashed during the Vietnam War, when many young men with too much ammunition and too few orders set about murdering and torturing every living thing they encountered. We see it across the world right now, where places with no law enforcement (including Somalia and Western Sahara) face incessant civil wars, famine, and high crime rates.

Sociologists use an intuitive term for this phenomenon: social entropy. Societies must expend constant effort to stem the inevitable march towards dangerous chaos. The reduction of social entropy tends to require a stable government, active law enforcement, an organized economy, meaningful employment for a high percentage of people, infrastructure, and education.

However, the line between controlling entropy and suppressing people’s liberty is a thin one. Excessive control can lead to a situation akin to Foucault’s panopticon, wherein people are under constant surveillance, lack freedom of speech and movement, are denied other rights as well, and are subject to overzealous law enforcement. This approach is counterproductive and leads to eventual rebellion once a critical mass of dissenters forms.

“Everything that comes together falls apart. Everything. The chair I’m sitting on. It was built, and so it will fall apart. I’m going to fall apart, probably before this chair. And you’re going to fall apart. The cells and organs and systems that make you you—they came together, grew together, and so must fall apart. The Buddha knew one thing science didn’t prove for millennia after his death: Entropy increases. Things fall apart.”

— John Green, Looking for Alaska

Entropy in Our Everyday Lives

We have all observed entropy in our everyday lives. Everything tends towards disorder. Life always seems to get more complicated. Once-tidy rooms become cluttered and dusty. Strong relationships grow fractured and end. Formerly youthful faces wrinkle and hair turns grey. Complex skills are forgotten. Buildings degrade as brickwork cracks, paint chips, and tiles loosen.

Entropy is an important mental model because it applies to every part of our lives. It is inescapable, and even if we try to ignore it, the result is a collapse of some sort. Truly understanding entropy leads to a radical change in the way we see the world. Ignorance of it is responsible for many of our biggest mistakes and failures. We cannot expect anything to stay the way we leave it. To maintain our health, relationships, careers, skills, knowledge, societies, and possessions requires never-ending effort and vigilance. Disorder is not a mistake; it is our default. Order is always artificial and temporary.

Does that seem sad or pointless? It’s not. Imagine a world with no entropy — everything stays the way we leave it, no one ages or gets ill, nothing breaks or fails, everything remains pristine. Arguably, that would also be a world without innovation or creativity, a world without urgency or a need for progress.

Many people cite improving the world for future generations as their purpose in life. They hold protests, make new laws, create new forms of technology, work to alleviate poverty, and pursue other noble goals. Each of us makes our own efforts to reduce disorder. The existence of entropy is what keeps us on our toes.

Mental models are powerful because they enable us to make sense of the disorder that surrounds us. They provide us with a shortcut to understanding a chaotic world and exercising some control over it.

In The Information: A History, a Theory, a Flood, James Gleick writes,

Organisms organize. … We sort the mail, build sand castles, solve jigsaw puzzles, separate wheat from chaff, rearrange chess pieces, collect stamps, alphabetize books, create symmetry, compose sonnets and sonatas, and put our rooms in order… We propagate structure (not just we humans but we who are alive). We disturb the tendency toward equilibrium. It would be absurd to attempt a thermodynamic accounting for such processes, but it is not absurd to say we are reducing entropy, piece by piece. Bit by bit … Not only do living things lessen the disorder in their environments; they are in themselves, their skeletons and their flesh, vesicles and membranes, shells, and carapaces, leaves, and blossoms, circulatory systems and metabolic pathways—miracles of pattern and structure. It sometimes seems as if curbing entropy is our quixotic purpose in the universe.

The question is not whether we can prevent entropy (we can’t), but how we can curb, control, work with, and understand it. As we saw at the start of this post, entropy is all around us. Now it’s probably time to fix whatever mistake an employee or coworker just made, clear up your messy desk, and reheat your cold coffee.

How Can I Use Entropy to My Advantage?

This is where things get interesting.

Whether you’re starting a business or trying to bring about change in your organization, understanding the abstraction of entropy as a mental model will help you accomplish your goals in a more effective manner.

Because things naturally move to disorder over time, we can position ourselves to create stability. There are two types of stability: active and passive. Consider a ship, which, if designed well, should be able to sail through a storm without intervention. This is passive stability. A fighter jet, in contrast, requires active stability. The plane can’t fly for more than a few seconds without having to adjust its wings. This adjustment happens so fast that it’s controlled by software. There is no inherent stability here: if you cut the power, the plane crashes.[3]

People get in trouble when they confuse the two types of stability. Relationships, for example, require attention and care. If you assume that your relationship is passively stable, you’ll wake up one day to divorce papers. Your house is also not passively stable. If not cleaned on a regular basis, it will continue to get messier and messier.

Organizations require stability as well. If you’re a company that relies on debt, you’re not passively stable but actively stable. Factoring in a margin of safety, this means that the people giving you the credit should be passively stable. If you’re both actively stable, then when the power gets cut, you’re likely to be in a position of weakness, not strength.

With active stability, you’re applying energy to a system in order to bring about some advantage (keeping the plane from crashing, your relationship going, the house clean, etc.). If we move a little further down the rabbit hole, we can see how applying the same amount of energy can yield totally different results.

Let’s use the analogy of coughing.[4] Coughing is the transfer of energy as heat. If you cough in a quiet coffee shop, which you can think of as a system with low entropy, you cause a big change. Your cough is disruptive. On the other hand, if you cough in Times Square, a system with a lot of entropy, that same cough will have no impact. While you change the entropy in both cases, the impact you have with the same cough is proportional to the existing entropy.

Now think of this example in relation to your organization. You’re applying energy to get something done. The higher the entropy in the system, the less efficient the energy you apply will be. The same person applying 20 units of energy in a big bureaucracy is going to see less impact than someone applying the same 20 units in a small startup.

You can think about this idea in a competitive sense, too. If you’re starting a business and you’re competing against very effective and efficient people, a lot of effort will get absorbed. It’s not going to be very efficient. If, on the other hand, you compete against less efficient and effective people, the same amount of energy will be more efficient in its conversion.

In essence, for a change to occur, you must apply more energy to the system than is extracted by the system.

 


 

 

Resources:

[1] http://www.exactlywhatistime.com/physics-of-time/the-arrow-of-time/

[2] Peter Atkins

[3] Based on the work of Tom Tombrello

[4] Derived from the work of Peter Atkins in The Laws of Thermodynamics: A Very Short Introduction

 

The Surprising Power of The Long Game

It’s easy to overestimate the importance of luck in success and underestimate the importance of investing in success every single day. Too often, we convince ourselves that success was just luck. We tell ourselves that the schoolteacher who left millions was just lucky. No. She wasn’t. She was just playing a different game than you were. She was playing the long game.

The long game isn’t particularly notable and sometimes it’s not even noticeable. It’s boring. But when someone chooses to play the long game from an early age, the results can be extraordinary. The long game changes how you conduct your personal and business affairs.

There is an old saying that I think of often, but I’m not sure where it comes from: If you do what everyone else is doing, you shouldn’t be surprised to get the same results everyone else is getting.

Ignoring the effect of luck on outcomes — the proverbial lottery ticket — doing what everyone else is doing pretty much ensures that you’re going to be average. Not average in the world, but average relative to people in similar circumstances. There are a lot of ways not to be average, but one of them is the tradeoff between the long game and the short game.

What starts small compounds into something more. The longer you play the long game, the easier it is to play and the greater the rewards. The longer you play the short game the harder it becomes to change and the bigger the bill facing you when you do want to change.

The Short Game

The short game is putting off anything that seems hard for doing something that seems easy or fun. The short game offers visible and immediate benefits. The short game is seductive.

  • Why do your homework when you can go out and play?
  • Why wait to pay for a phone in cash, when you can put it on your credit card?
  • Why go to the gym when you can go drinking with your friends?
  • Why invest in your relationship with your partner today when you can work a little bit extra in the office?
  • Why learn something boring that doesn’t change when you can learn something sexy that impresses people?
  • Why bust your butt doing the work before the meeting when you can read the executive summary and pretend, like everyone else?

The effects of the short game multiply the longer you play. On any given day the impact is small but as days turn into months and years the result is enormous. People who play the short game don’t realize the costs until they become too large to ignore.

The problem with the short game is that the costs are small and never seem to matter much on any given day. Doing your homework today won’t give you straight A’s. Saving $5 today won’t make you a millionaire. Going to the gym and eating healthy today won’t make you fit. Reading a book won’t make you smart. Going to sleep on time tonight won’t make you healthier tomorrow. Sure, we might try these things when we’re motivated, but since the results are not immediate, we revert to the short game.

As the weeks turn into months and the months into years, the short game compounds into disastrous results. It’s not the one-day trade-off that matters but its accumulation.

Playing the long game means suffering a little today. And why would we want to suffer today when we can suffer tomorrow? But if our intention is always to change tomorrow, then tomorrow never comes. All we have is today.

The Long Game

The long game is the opposite of the short game: it means paying a small price today to make tomorrow’s tomorrow easier. If we can do this long enough to see the results, it feeds on itself.

From the outside, the long game looks pretty boring:

  • Saving money and investing it for tomorrow
  • Leaving the party early to go get some sleep
  • Investing time in your relationship today so you have a foundation when something happens
  • Doing your homework before you go out to play
  • Going to the gym rather than watching Netflix

… and countless other examples.

In its simplest form, the long game isn’t really debatable. Everyone agrees, for example, we should spend less than we make and invest the difference. Playing the long game is a slight change, one that seems insignificant at the moment, but one that becomes the difference between financial freedom and struggling to make next month’s rent.
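The “spend less than we make and invest the difference” point is easy to quantify with the standard compound-interest (annuity) formula. A minimal sketch: the $200/month contribution, 6% annual rate, and horizons below are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative sketch: the monthly saving, rate, and horizons are
# assumed for the example, not taken from the article.
def future_value(monthly_saving: float, annual_rate: float, years: int) -> float:
    """Future value of a fixed monthly contribution, compounded
    monthly (standard annuity formula)."""
    r = annual_rate / 12   # monthly rate
    n = years * 12         # number of contributions
    return monthly_saving * ((1 + r) ** n - 1) / r

# $200/month at 6%/year: near-invisible at first, dominant later.
for years in (1, 10, 30):
    print(f"{years:2d} years: ${future_value(200, 0.06, years):,.0f}")
```

Over 30 years you contribute only three times what you contribute over 10, yet end up with roughly six times as much. That widening gap, barely visible in year one, is the compounding payoff of the long game.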

The first step to the long game is the hardest. The first step is visibly negative. You have to be willing to suffer today in order not to suffer tomorrow. This is why the long game is hard to play. People rarely see the small steps when they’re looking for enormous outcomes, but deserving enormous outcomes is mostly the result of a series of small steps that culminate in something visible.

Conclusion

In everything you do, you’re either playing a short-term or a long-term game. You can’t opt out, and you can’t play a long-term game in everything; you need to pick what matters to you. But in everything you do, time amplifies the difference between long- and short-term games. The question you need to think about is when and where to play a long-term game. A good place to start is with things that compound: knowledge, relationships, and finances.

 

This article is an expansion of something I originally touched on here

Winner Takes it All: How Markets Favor the Few at the Expense of the Many

Markets tend to favor unequal distributions of market share and profits, with a few leaders emerging in any industry. Winner-take-all markets are hard to disrupt and suppress the entry of new players by locking in market share for leading players.

***

In almost any market, crowds of competitors fight for business within their niche. But over time, with few exceptions, a small number of companies come to dominate the industry.

These are the names we all know. The logos we see every day. The brands which shape the world with every decision they make. Even those which are not household names have a great influence on our lives. Operating behind the scenes, they quietly grow more powerful each year, often sowing the seeds of their own destruction in the process.

A winner-take-all market doesn’t mean there is only one company in the market. Rather, when we say a winner takes all, what we mean is that a single company receives the majority of available profits. A few others have at best a modest share. The rest fight over a minuscule remnant, and tend not to survive long.

In a winner-take-all market, the winners have tremendous power to dictate outcomes. Winner-take-all markets occur in many different areas. We can apply the concept to all situations which involve unequal distributions.

Unequal Distribution

As a general rule, resources are never distributed evenly among people. In almost every situation, a small number of people or organizations are the winners.

Most of the books sold each year are written by a handful of authors. Most internet traffic is to a few websites. The top 100 websites get more traffic than ranks 100-999 combined (welcome to power laws). Most citations in any field refer to the same few papers and researchers. Most clicks on Google searches are on the first result. Each of these is an instance of a winner-take-all market.

Wealth is a prime example of this type of market. The Pareto Principle states that in a given nation, 20% of the people own 80% of the wealth (the actual figures are closer to 15% and 85%). However, the Pareto Principle goes deeper than that. We can look at the richest 20%, then calculate the wealth of the richest 20% of that group. Once again, the Pareto Principle applies. So roughly 4% own 64% of the wealth. Keep repeating that calculation and we end up with about 9 people. By some estimates, this tiny group has as much wealth as the poorest half of the world.
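The recursion described above can be sketched in a few lines of code. This is an illustrative calculation of the repeated 80/20 split, not data from any real economy:

```python
# Repeatedly apply the 80/20 rule to the richest slice of the
# previous group: 20% hold 80%, then 4% hold 64%, and so on.
share_of_people = 1.0
share_of_wealth = 1.0
for step in range(1, 5):
    share_of_people *= 0.20  # richest 20% of the previous group...
    share_of_wealth *= 0.80  # ...hold 80% of that group's wealth
    print(f"step {step}: {share_of_people:.2%} of people "
          f"hold {share_of_wealth:.2%} of the wealth")
```

Two iterations reproduce the article’s “4% own 64%” figure; each further iteration concentrates wealth in an ever-smaller slice of the population.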

“With limited time or opportunity to experiment, we intentionally narrow our choices to those at the top.”

— Seth Godin

The Perks of Being the Best

There are tremendous benefits to being the best in any particular area. Top performers might be only slightly more skilled than the people one level below them, yet they receive an exponential payoff. A small difference in relative performance—an athlete who can run 100 meters a fraction of a second faster, a leader who can make better decisions, an opera singer who can go a little higher—can mean the difference between a lucrative career and relative obscurity. The people at the tops of their fields get it all. They are the winners in that particular market. And once someone is regarded as the best, they tend to retain that status. It takes a monumental effort for a newcomer to rise to such a position. Every day new people do make it to the top, but it’s a lot easier to stay there than to get there.

Top performers don’t just earn the most. They also tend to receive the majority of media coverage and win most awards. They have the most leverage when it comes to choosing their work. These benefits are exponential, following a power law distribution. A silver medalist might get 10 times the benefits the bronze medalist does. But the gold medalist will receive 10 times the benefits of the silver. If a company is risking millions over a lawsuit, they will want the best possible lawyer no matter the cost. And a surgeon who is 10% better than average can charge more than 10% higher fees. When someone or something is the best, we hear about it. The winners take all the attention. It’s one reason why the careers of Nobel Prize winners tend to go downhill after receiving the award. It becomes too lucrative for them to devote their time to the media, giving talks or writing books. Producing more original research falls by the wayside.

Leverage

One reason the best are rewarded more now than ever is leverage. Up until recently, if you were a hundredth of a second faster than someone else, there was no real advantage. Now there is. Small differences in performance translate into large differences in real-world benefits. A gold medalist in the Olympics, even one who wins by a hundredth of a second, is disproportionately rewarded for a very small edge.

Now we all live in a world of leverage, through capital, technology, and productivity. Leveraged workers can outperform unleveraged ones by orders of magnitude. When you’re leveraged, judgment becomes far more important. That small difference in ability can be put to better use. Software engineers can create billions of dollars of value through code. Ten coders working 10 times harder but thinking slightly less effectively will have nothing to show for it. Just as with winner-take-all markets, the inputs don’t match the outputs.

Feedback Loops

Economist Sherwin Rosen looked at unequal distribution in The Economics of Superstars. Rosen found that the demand for classical music and live comedy is high and continues to grow. Yet each area only employs about two hundred full-time performers. These top-performing comedians and musicians take most of the market. Meanwhile, thousands of others struggle for any recognition. Performers regarded as second best within a field earn considerably less than the top performers, even though the average person cannot discern any difference.

In Success and Luck, Robert H. Frank explains the self-perpetuating nature of winner-take-all markets:

Is the Mona Lisa special? Is Kim Kardashian? They’re both famous, but sometimes things are famous just for being famous. Although we often try to explain their success by scrutinising their objective qualities, they are in fact often no more special than many of their less renowned counterparts…Success often results from positive feedback loops that amplify tiny initial variations into enormous differences in final outcomes.

Winner-take-all markets are increasingly dictated by feedback loops. Feedback loops develop when the output becomes the input. Consider books. More people will buy a best-selling book because it’s a best-selling book. More people will listen to a song that tops charts. More people will go to see an Oscar winning film. These feedback loops serve to magnify initial luck or manipulation. Some writers will purchase thousands of copies of their own book to push it onto best seller lists. Once it makes it onto the list, the feedback loop will begin and possibly keep it there longer than it merits.1
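This dynamic is easy to reproduce with a toy “rich-get-richer” simulation. The model below is a hypothetical illustration—the ten identical books and the sales-proportional weighting rule are invented for the sketch, not taken from the book—showing how a feedback loop amplifies small, lucky early differences:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Ten otherwise-identical books. Each new buyer picks one with
# probability proportional to its current sales (plus 1, so an
# unsold book still has a chance of being discovered).
sales = [0] * 10
for _ in range(10_000):
    weights = [s + 1 for s in sales]
    choice = random.choices(range(10), weights=weights)[0]
    sales[choice] += 1

print(sorted(sales, reverse=True))
```

The leader typically ends up with a large multiple of the median seller, even though every book started out identical—early luck, fed back through the weighting, does all the work.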

It’s hard to establish what sets off these feedback loops. In many cases, the answer is simple: luck. Although many people and organizations create narratives to explain their achievements, luck plays a large role. This is a combination of hindsight bias and the narrative fallacy. In retrospect, becoming the winner in the market seems inevitable. In truth, luck plays a substantial role in the creation of winner-take-all markets. A combination of timing, location and connections serves to create winners. Their status is never inevitable, no matter what they might tell those who ask.

In some cases, governments deliberately strive to create positive feedback loops. Drug patents are one example. These create a powerful incentive for companies to invest in research and development. Releasing a new, patented drug is a lucrative enterprise. As the only company in that particular market, a company can set the price to whatever it wishes. Until the patent runs out, that company is the winner. This is exactly how the market plays out. In 2016, the highest grossing drug company earned $71 billion. The three runners up each earned around $50 billion. From there on, the other drug companies have a comparatively small share of the market.

Profit enables companies to invest in more research and development, pay employees more, and invest in their communities. A positive feedback loop forms. Talented researchers join successful teams. They gather valuable data. Developing new drugs becomes easier. Drug companies gain greater and greater market power over time. A few winners end up with almost total control. They become the names we trust and hold their position, absorbing any risks or scandals. New effective drugs benefit society on the whole, improving our well-being. This winner-take-all market has its upsides. Issues emerge when patent holders set prices above the means of the people who need the drugs most.

Once the patent runs out on a drug (generally after 12 years) any other firm can produce an identical product. Prices soon fall as other companies enter the market. The feedback loop breaks, and the winner no longer takes all. Even so, the former winner will retain a large share of the market. People tend to be unwilling to switch to a new brand of drug, even if it has the same effects.

Ironically, winner-take-all markets tend to perpetuate themselves by attracting more losers. When we look at founders in Silicon Valley or actors in LA, we don’t see the failures. Survivorship bias means we only see those who succeed. Attracted by the thought of winning, growing numbers of people flock to try their luck in the market. Most fail, overconfident and misled. The rewards become even more concentrated. More people are attracted and the cycle continues.

DeBeers Diamonds

In the market for diamonds, there is one main winner: DeBeers. This international corporation controls most of the global diamond market, including mining, trading and retail. For around a century, DeBeers had a complete monopoly. Diamonds are a scarce Veblen good with minimal practical use. The value depends on our perception.

Prior to the late 19th century, the global production of diamonds totaled a couple of pounds a year. Demand barely existed, so no one had much interest in supplying it. However, the discovery of several large mines increased production from pounds to tons. Those who stood to profit recognized that diamonds have no intrinsic value. They needed to create a perception of scarcity. DeBeers began taking control of the market in 1888, quickly forming a monopoly. It had an ambitious vision for the diamond market. DeBeers wanted to promote the stones as irreplaceable. Other gemstones have basically the same properties—hard, shiny rocks which make nice jewelry. As Edward Jay Epstein wrote in 1982:

The diamond invention is far more than a monopoly for fixing diamond prices; it is a mechanism for converting tiny crystals of carbon into universally recognized tokens of wealth, power, and romance. To achieve this goal, De Beers had to control demand as well as supply. Both women and men had to be made to perceive diamonds not as marketable precious stones but as an inseparable part of courtship and married life.

Their ensuing role as winners in the diamond market is all down to clever marketing. Slogans such as “diamonds are forever” have cemented the monopoly. Note that the slogan applies to all diamonds, not their particular brand. Imagine if Apple made adverts declaring “phones are forever,” or if McDonald’s made adverts saying “fast food is forever.” That’s how powerful DeBeers is. It can promote the entire market, knowing it will be the one to benefit. Throughout the twentieth century, DeBeers gave famous actresses diamond rings, pitched stories featuring the stones to magazines and incorporated their products into images of the British royal family. As their advertising agency, N. W. Ayer, explained, “There was no direct sale to be made. There was no brand name to be impressed on the public mind. There was simply an idea—the eternal emotional value surrounding the diamond…. The substantial diamond gift can be made a more widely sought symbol of personal and family success—an expression of socioeconomic achievement.”

The Impact of Technology

In our interconnected, globalized world, a few large firms continue to grow in power. Modern technology enables firms like Walmart to open branches all over the world. Without the barriers once associated with communication and supply networks, large firms can take over the local market anywhere they open. Small businesses have a very hard time competing.

When a new market appears, entrepreneurs rush to create products, services or technology. There is a flurry of activity for a few months or years. With time, customers gravitate toward the two or three companies they prefer. Starved of revenue, the other competitors shut down. Technology has exacerbated the growth of winner-take-all markets.

We are seeing this at the moment with ride-hailing services. In a once-crowded marketplace, two giant winners remain to take all the profits. It’s hard to say exactly why Uber and Lyft triumphed over numerous similar services. But it’s unlikely they will lose their market share anytime soon.

The same occurred with search engines. Google has now eliminated any meaningful competition. As their profits soar each year, even their nearest competitors—Yahoo, Bing—struggle. We can see from the example of Google how winner-take-all markets can self-perpetuate. Google is on top, so it gets the best employees, and has high research and development budgets. Google can afford to take risks and accumulate growing mountains of user data. Any losses or failures get absorbed. Consistent growth holds the trust of shareholders. Google essentially uses a form of Nassim Taleb’s barbell strategy. As Taleb writes in The Black Swan:

True, the Web produces acute concentration. A large number of users visit just a few sites, such as Google, which, at the time of this writing, has total market dominance. At no time in history has a company grown so dominant so quickly—Google can service people from Nicaragua to southwestern Mongolia to the American West Coast, without having to worry about phone operators, shipping, delivery, and manufacturing. This is the ultimate winner-take-all case study. People forget, though, that before Google, Alta Vista dominated the search-engine market. I am prepared to revise the Google metaphor by replacing it with a new name for future editions of this book.

The role of data is particularly important. The more data a company has on its customers, the better equipped it is to release new products and market existing ones. Facebook has a terrifying amount of information about its users, so it can keep updating the social network to make it addictive and to lock people in. Newer or less popular social networks are working with less data and cannot compete for attention. A positive feedback loop forms for the entrenched companies. Facebook has a lot of data, and it can use that data to make the site more appealing. In turn, this more attractive Facebook leads people to spend more time clicking and generates even more data.2

Winner-take-all markets can be the result of lock-in. When the costs of switching between one supplier and another are too high to be worthwhile, consumers become locked in. Microsoft is a winner in the software market because most of the world is locked in to their products. As it stands, it would be nearly impossible for anyone to erode the market share Windows possesses. As Windows is copyrighted, no one can replicate it. Threatened by inconvenience, we become loyal to avoid incurring switching costs.

Marc Andreessen described the emergence of winner-take-all technology markets in 2013:

In normal markets, you can have Pepsi and Coke. In technology markets, in the long run, you tend to only have one…. The big companies, though, in technology tend to have 90 percent market share. So we think that generally, these are winner-take-all markets. Generally, number one is going to get like 90 percent of the profits. Number two is going to get like 10 percent of the profits, and numbers three through 10 are going to get nothing.

Leaders in certain areas are becoming winners and taking all because they can leverage small advantages, thanks to technology. In the past, an amazing teacher, singer, accountant, artist or stock broker could only reach a small number of people in their community. As their status grew, they would often charge more and choose to see fewer people, meaning their expertise became even more scarce. Now, however, those same top performers can reach a limitless audience through blogs, podcasts, videos, online courses and so on.

Think of it another way. For most of history we were limited to learning from the people in our community. Say you wanted to learn how to draw. You had access to your community art teacher. The odds they were the best art teacher in the world were extremely slim. Now, however, you can go on the internet and access the best teacher in the world.

For most of history, comedians (or rather, their predecessors such as vaudeville performers) and musicians performed live. There was a distinct limit to how many shows they could do a year and how many people could attend each. So, there were many people at the top of each field, as many as needed to meet audience demand for performers. Now that we are no longer confined to live performances, we gravitate towards a few exceptional entertainers. Or consider the example of sports. Athletes were paid far more modest wages until TV allowed them to leverage their skills and reach millions of homes.

Having more information available offers us further incentives to pay attention only to the winners. Online, we can filter by popularity, look at aggregate reviews, select the first search option, or go with other people’s preferences. With too many options, we google ‘best Chinese restaurant near me’ or ‘best horror film 2016.’ Sorting through all the options is too time-consuming, so the best stay as the best.

“In order to win, you must first survive.”

— Warren Buffett

The Downsides of Winner-Take-All Markets

There are some serious downsides to winner-take-all markets. Economic growth and innovation rely on the emergence of new startups and entrepreneurs with disruptive ideas. When the gale of creative destruction stops blowing, industries stagnate. When a handful of winners control a market, they may discourage newcomers who cannot compete with established giants’ budgets and power over the industry. According to some estimates, startups are failing faster and more frequently than in the past. Investors prefer established companies with secure short-term returns. Even when a startup succeeds, it tends to get acquired by a larger company. Apple, Amazon, Facebook and others acquire dozens of companies each year.

Winner-take-all markets tend to discourage collaboration and cooperation. The winners have incentive to keep their knowledge and new data to themselves. Patents and copyright are liberally used to suppress any serious competition. Skilled workers are snapped up the second they leave education, and have powerful inducements to stay working for the winners. The result is a prisoner’s dilemma-style situation. Although collaboration may be best for everyone, each individual organization benefits from being selfish. As a result, no one collaborates, they just compete.

The result is what Warren Buffett calls a “moat”—a substantial barrier against competition. Business moats come in many forms. Apple’s superior brand identity is a moat, for example. It has taken enormous investments of resources to build and newer companies cannot compete. No number of Facebook adverts or billboards could replicate the kind of importance Apple has in our cultural consciousness. For other winners, the moat could be the ability to provide a product or service at a lower price than competitors, as with Amazon and Alibaba. Each of these has a great deal of market power and can influence prices. If Amazon drops their prices, competitors have no choice but to do the same and make less profit. If Apple decides to raise their prices, we are unlikely to buy our phones and laptops elsewhere and will pay a premium. As Greg Mankiw writes in Principles of Microeconomics, “Market power can cause markets to be inefficient because it keeps the price and quantity away from the equilibrium of supply and demand.”

Luckily for us, winners tend to sow the seeds of their own destruction—but we’ll save that for another article.

Members of the Farnam Street Learning Community can discuss this on the member forum.

Footnotes
  • 1

    For related thoughts see activation energy and escape velocity.

  • 2

    An argument could be made that data should be anonymized and made available to the public as a means to ensure competition.

Predicting the Future with Bayes’ Theorem

In a recent podcast, we talked with professional poker player Annie Duke about thinking in probabilities, something good poker players do all the time. At the poker table or in life, it’s really useful to think in probabilities versus absolutes based on all the information you have available to you. You can improve your decisions and get better outcomes. Probabilistic thinking leads you to ask yourself, how confident am I in this prediction? What information would impact this confidence?

Bayes’ Theorem

Bayes’ theorem is an accessible way of integrating probability thinking into our lives. Thomas Bayes was an English minister in the 18th century, whose most famous work, “An Essay towards Solving a Problem in the Doctrine of Chances,” was brought to the attention of the Royal Society in 1763—two years after his death—by his friend Richard Price. The essay did not contain the theorem as we now know it, but had the seeds of the idea. It looked at how we should adjust our estimates of probabilities when we encounter new data that influence a situation. Later development by French scholar Pierre-Simon Laplace and others helped codify the theorem and develop it into a useful tool for thinking.

Knowing the exact math of probability calculations is not the key to understanding Bayesian thinking. More critical is your ability and desire to assign probabilities of truth and accuracy to anything you think you know, and then being willing to update those probabilities when new information comes in. Here is a short example, found in Investing: The Last Liberal Art, of how it works:

Let’s imagine that you and a friend have spent the afternoon playing your favorite board game, and now, at the end of the game, you are chatting about this and that. Something your friend says leads you to make a friendly wager: that with one roll of the die from the game, you will get a 6. Straight odds are one in six, a 16 percent probability. But then suppose your friend rolls the die, quickly covers it with her hand, and takes a peek. “I can tell you this much,” she says; “it’s an even number.” Now you have new information and your odds change dramatically to one in three, a 33 percent probability. While you are considering whether to change your bet, your friend teasingly adds: “And it’s not a 4.” With this additional bit of information, your odds have changed again, to one in two, a 50 percent probability. With this very simple example, you have performed a Bayesian analysis. Each new piece of information affected the original probability, and that is Bayesian [updating].
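The updating in that passage can be checked by simple enumeration, treating each piece of information as a restriction on the set of possible outcomes. A small sketch of the same calculation:

```python
from fractions import Fraction

# All faces of the die are equally likely to begin with.
outcomes = {1, 2, 3, 4, 5, 6}

def p_six(possible):
    """Probability of a 6, given the set of outcomes still possible."""
    return Fraction(len({6} & possible), len(possible))

print(p_six(outcomes))                 # 1/6 before any information
evens = {o for o in outcomes if o % 2 == 0}
print(p_six(evens))                    # 1/3 after "it's an even number"
print(p_six(evens - {4}))              # 1/2 after "and it's not a 4"
```

Each new clue shrinks the set of possibilities, and the probability of a 6 is recomputed over what remains—which is exactly the Bayesian updating the passage describes.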

Both Nate Silver and Eliezer Yudkowsky have written about Bayes’ theorem in the context of medical testing, specifically mammograms. Imagine you live in a country with 100 million women under 40. Past trends have revealed that there is a 1.4% chance of a woman under 40 in this country getting breast cancer—so roughly 1.4 million women.

Mammograms will detect breast cancer 75% of the time. They will give out false positives—say a woman has breast cancer when she actually doesn’t—about 10% of the time. At first, you might focus just on the mammogram numbers and think that 75% success rate means that a positive is bad news. Let’s do the math.

If all the women under 40 get mammograms, then the false positive rate will give roughly 10 million women under 40 the news that they have breast cancer. But because you know the first statistic—that only 1.4 million women under 40 actually get breast cancer—you know that roughly 8.6 million of the women who tested positive are not actually going to have breast cancer!
That’s a lot of needless worrying, which leads to a lot of needless medical care. In order to remedy this poor understanding and make better decisions about using mammograms, we absolutely must consider prior knowledge when we look at the results, and try to update our beliefs with that knowledge in mind.
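The arithmetic above maps directly onto Bayes’ theorem. The sketch below plugs the passage’s numbers into the formula to get the probability that a positive test actually indicates cancer:

```python
# The passage's numbers, plugged into Bayes' theorem.
population = 100_000_000
p_cancer = 0.014           # prior: 1.4% of women under 40
sensitivity = 0.75         # mammograms detect cancer 75% of the time
false_positive_rate = 0.10 # positive results for healthy women

with_cancer = population * p_cancer            # 1.4 million
without_cancer = population - with_cancer      # 98.6 million

true_positives = with_cancer * sensitivity             # 1.05 million
false_positives = without_cancer * false_positive_rate # 9.86 million

p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_cancer_given_positive:.1%}")  # roughly 9.6%
```

Despite the test’s 75% detection rate, a positive result means less than a 10% chance of actually having cancer—because the prior (1.4%) is so low and the healthy population is so large. That is exactly the point about weighing priors.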

Weigh the Evidence

Often we ignore prior information, simply called “priors” in Bayesian-speak. We can blame this habit in part on the availability heuristic—we focus on what’s readily available. In this case, we focus on the newest information and the bigger picture gets lost. We fail to adjust the probability of old information to reflect what we have learned.

The big idea behind Bayes’ theorem is that we must continuously update our probability estimates on an as-needed basis. In his book The Signal and the Noise, Nate Silver gives a contemporary example, reminding us that new information is often most useful when we put it in the larger context of what we already know:

Bayes’ theorem is an important reality check on our efforts to forecast the future. How, for instance, should we reconcile a large body of theory and evidence predicting global warming with the fact that there has been no warming trend over the last decade or so? Skeptics react with glee, while true believers dismiss the new information.

A better response is to use Bayes’ theorem: the lack of recent warming is evidence against recent global warming predictions, but it is weak evidence. This is because there is enough variability in global temperatures to make such an outcome unsurprising. The new information should reduce our confidence in our models of global warming—but only a little.

The same approach can be used in anything from an economic forecast to a hand of poker, and while Bayes’ theorem can be a formal affair, Bayesian reasoning also works as a rule of thumb. We tend to either dismiss new evidence, or embrace it as though nothing else matters. Bayesians try to weigh both the old hypothesis and the new evidence in a sensible way.

Limitations of the Bayesian Approach

Don’t walk away thinking the Bayesian approach will enable you to predict everything! In addition to seeing the world as an ever-shifting array of probabilities, we must also remember the limitations of inductive reasoning. A high probability of something being true is not the same as saying it is true. A great example of this is from Bertrand Russell’s The Problems of Philosophy:

A horse which has been often driven along a certain road resists the attempt to drive him in a different direction. Domestic animals expect food when they see the person who usually feeds them. We know that all these rather crude expectations of uniformity are liable to be misleading. The man who has fed the chicken every day throughout its life at last wrings its neck instead, showing that more refined views as to the uniformity of nature would have been useful to the chicken.

In the final analysis, though, picking up Bayesian reasoning can truly change your life, as observed in this Big Think video by Julia Galef of the Center for Applied Rationality:

After you’ve been steeped in Bayes’ rule for a little while, it starts to produce some fundamental changes to your thinking. For example, you become much more aware that your beliefs are grayscale. They’re not black and white and that you have levels of confidence in your beliefs about how the world works that are less than 100 percent but greater than zero percent and even more importantly as you go through the world and encounter new ideas and new evidence, that level of confidence fluctuates, as you encounter evidence for and against your beliefs.

So be okay with uncertainty, and use it to your advantage. Instead of holding on to outdated beliefs by rejecting new information, take in what comes your way through a system of evaluating probabilities.

Bayes’ Theorem is part of the Farnam Street latticework of mental models. Still Curious? Read Bayes and Deadweight: Using Statistics to Eject the Deadweight From Your Life next. 


The Disproportional Power of Anecdotes

Humans, it seems, have an innate tendency to overgeneralize from small samples. How many times have you been caught in an argument where the only proof offered is anecdotal? Perhaps your co-worker saw this bratty kid make a mess in the grocery store while the parents appeared to do nothing. “They just let that child pull things off the shelves and create havoc! My parents would never have allowed that. Parents are so permissive now.” Hmm. Is it true that most parents commonly allow young children to cause trouble in public? It would be a mistake to assume so based on the evidence presented, but a lot of us would go with it anyway. Your co-worker did.

Our propensity to confuse the “now” with “what always is,” as if the immediate world before our eyes consistently represents the entire universe, leads us to bad conclusions and bad decisions. We don’t bother asking questions and verifying validity. So we make mistakes and allow ourselves to be easily manipulated.

Political polling is a good example. It’s actually really hard to design and conduct a good poll. Matthew Mendelsohn and Jason Brent, in their article “Understanding Polling Methodology,” say:

Public opinion cannot be understood by using only a single question asked at a single moment. It is necessary to measure public opinion along several different dimensions, to review results based on a variety of different wordings, and to verify findings on the basis of repetition. Any one result is filled with potential error and represents one possible estimation of the state of public opinion.

This makes sense. But it’s amazing how often we forget.

We see a headline screaming out about the state of affairs and we dive right in, instant believers, without pausing to question the validity of the methodology. How many people did they sample? How did they select them? Most polling aims for random sampling, but there is pre-selection at work immediately, depending on the medium the pollsters use to reach people.

Truly random samples of people are hard to come by. In order to poll people, you have to be able to reach them. The more complicated this is, the more expensive the poll becomes, which acts as a deterrent to thoroughness. The internet can offer high accessibility for a relatively low cost, but it’s a lot harder to verify the integrity of the demographics. And if you go the telephone route, as a lot of polling does, are you already distorting the true randomness of your sample? Are the people who answer “unknown” numbers already different from those who ignore them?

Polls are meant to generalize larger patterns of behavior based on small samples. You need to put a lot of effort in to make sure that sample is truly representative of the population you are trying to generalize about. Otherwise, erroneous information is presented as truth.
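The effect of sample size on polling error is easy to see in a quick simulation. The sketch below is purely illustrative: it assumes a hypothetical population whose true level of support is known (52%), then repeatedly "polls" random samples of different sizes and shows how much the estimates scatter.

```python
import random
import statistics

def poll(population_support: float, sample_size: int, trials: int = 1000) -> list[float]:
    """Simulate repeated polls of a population whose true support is known.

    Each simulated poll draws `sample_size` random respondents and records
    the fraction who say 'yes'. Returns one estimate per poll.
    """
    return [
        sum(random.random() < population_support for _ in range(sample_size)) / sample_size
        for _ in range(trials)
    ]

random.seed(42)
TRUE_SUPPORT = 0.52  # hypothetical true level of support, for illustration only

for n in (50, 500, 5000):
    estimates = poll(TRUE_SUPPORT, n)
    spread = max(estimates) - min(estimates)
    print(f"n={n:5d}  mean estimate={statistics.mean(estimates):.3f}  spread={spread:.3f}")
```

With 50 respondents, individual polls can miss the true figure by ten points or more; with 5,000, the estimates cluster tightly around it. None of this fixes a biased sampling method, though: a huge sample drawn badly is just a more confident wrong answer.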

Why does this matter?

It matters because generalization is a widespread human bias, which means a lot of our understanding of the world is actually based on extrapolations made from relatively small sample sizes. Consequently, our individual behavior is shaped by potentially incomplete or inadequate facts that we use to make the decisions that are meant to lead us to success. This bias also shapes a fair degree of public policy and government legislation. We don’t want people who make decisions that affect millions to be dependent on captivating bullshit. (A further concern is that once you are invested, other biases kick in.)

Some really smart people are perpetual victims of the problem.

Joseph Henrich, Steven J. Heine, and Ara Norenzayan wrote an article called “The weirdest people in the world?” It’s about how many scientific psychology studies use college students who are predominantly Western, Educated, Industrialized, Rich, and Democratic (WEIRD), and then draw conclusions about the entire human race from these outliers. They reviewed scientific literature from domains such as “visual perception, fairness, cooperation, spatial reasoning, categorization and inferential induction, moral reasoning, and the heritability of IQ. The findings suggest that members of WEIRD societies, including young children, are among the least representative populations one could find for generalizing about humans.”

Uh-oh. This is a double whammy. “It’s not merely that researchers frequently make generalizations from a narrow subpopulation. The concern is that this particular subpopulation is highly unrepresentative of the species.”

This is why it can be dangerous to make major life decisions based on small samples, like anecdotes or a one-off experience. The small sample may be an outlier in the greater range of possibilities. You could be correcting for a problem that doesn’t exist or investing in an opportunity that isn’t there.

This tendency of mistaken extrapolation from small samples can have profound consequences.

Are you a fan of the San Francisco 49ers? They exist, in part, because of our tendency to overgeneralize. In the 19th century in Western America and Canada, a few findings of gold along some creek beds led to a massive rush as entire populations flocked to these regions in the hope of getting rich. San Francisco grew from 200 residents in 1846 to about 36,000 only six years later. The gold rush provided enormous impetus toward California becoming a state, and the corresponding infrastructure developments touched off momentum that long outlasted the mining of gold.

But for most of the actual rushers, those hoping for gold based on the anecdotes that floated east, there wasn’t much to show for their decision to head west. The Canadian Encyclopedia states, “If the nearly $29 million (figure unadjusted) in gold that was recovered during the heady years of 1897 to 1899 [in the Klondike] was divided equally among all those who participated in the gold rush, the amount would fall far short of the total they had invested in time and money.”

How did this happen? Because those miners took anecdotes as being representative of a broader reality. Quite literally, they learned mining from rumor, and didn’t develop any real knowledge. Most people fought for claims along the creeks, where easy gold had been discovered, while rejecting the bench claims on the hillsides above, which often had just as much gold.

You may be thinking that these men must have been desperate if they packed themselves up, heading into unknown territory, facing multiple dangers along the way, to chase a dream of easy money. But most of us aren’t that different. How many times have you invested in a “hot stock” on a tip from one person, only to have the company go under within a year? Ultimately, the smaller the sample size, the greater the role chance plays in determining the outcome.

If you want to limit the capriciousness of chance in your quest for success, increase your sample size when making decisions. You need enough information to be able to plot the range of possibilities, identify the outliers, and define the average.
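The point about outliers can also be made concrete with a simulation. The sketch below uses an invented population in which 1% of outcomes are rare “jackpots” (think of the few creek-bed strikes amid thousands of empty claims) and estimates how often a random sample of a given size contains even one of them.

```python
import random

random.seed(7)

# Hypothetical population, for illustration only:
# 9,900 ordinary outcomes and 100 rare "jackpot" outcomes (1%).
population = [0.0] * 9900 + [1.0] * 100

def sees_outlier(sample_size: int, trials: int = 1000) -> float:
    """Fraction of simulated samples that contain at least one jackpot."""
    hits = sum(
        1.0 in random.sample(population, sample_size)
        for _ in range(trials)
    )
    return hits / trials

for n in (10, 100, 1000):
    print(f"n={n:4d}  chance the sample reveals any jackpot: {sees_outlier(n):.2f}")
```

A sample of ten almost never contains a jackpot, so it badly misrepresents the range of possibilities; a sample of a thousand almost always does. Small samples hide exactly the outliers that anecdotes then exaggerate.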

So next time you hear the words “the polls say,” “studies show,” or “you should buy this,” ask questions before you take action. Think about the population that is actually being represented before you start modifying your understanding. Accept the limits of small sample sizes from large populations. And don’t give power to anecdotes.