
Mental Model: Equilibrium

There are many ways in which you can visualize the concept of equilibrium, but one of the simplest comes from Boombustology, where a ball sits on a simple curved surface.

[Figure: a ball resting on a curved surface, illustrating equilibrium. Source: Boombustology]

A situation in which equilibrium is possible is one in which, over time, if left to its own devices, the ball will find one unique location. Overshooting and undershooting this unique location is self-correcting. A situation of disequilibrium, however, is one in which the ball is unable to find a unique location. A ball in such a state does not generate self-correcting motion that dampens its movement toward a theoretical “equilibrium” or resting spot; rather, disequilibrium generates motion that is self-reinforcing and accelerates the ball’s movement away from any stable state.

Let’s take a step back and thank Newton.

In the Principia, Newton describes his three laws of motion. Applied to the planets, these laws allowed him to demonstrate how gravitational forces act between two bodies: the sun’s gravity pulls each planet inward, while the planet’s forward velocity carries it onward. The balance between the two produces a stable orbit, a state of equilibrium.

Equilibrium is a balance between one or more opposing forces. As you can imagine, different types of equilibrium exist. Static equilibrium is when a system is at rest. Dynamic equilibrium is when two or more forces are equally matched. Robert Hagstrom, in Investing: The Last Liberal Art, helps illustrate the difference between the two:

A scale that is equally weighted on both sides is an example of static equilibrium. Fill a bathtub full of water and then turn off the faucet and you will observe static equilibrium. But if you unplug the drain and then turn on the faucet so the level of the bathtub does not change, you are witnessing dynamic equilibrium. Another example is the human body. It remains in dynamic equilibrium so long as the heat loss from cooling remains in balance with the consumption of sugars.
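To make the bathtub example concrete, here is a minimal sketch of dynamic equilibrium in Python. The flow rates and the linear drain model are illustrative assumptions of mine, not figures from Hagstrom: a constant inflow and a level-dependent outflow settle at whatever level makes the two balance.

```python
# Toy model of dynamic equilibrium: a constant inflow and a level-dependent
# outflow settle at the level where the two flows balance. All numbers are
# illustrative, not taken from Hagstrom's example.

inflow = 2.0             # litres per second from the faucet
drain_rate = 0.5         # outflow = drain_rate * level
level = 0.0              # current water level in litres
dt = 0.1                 # simulation time step in seconds

for _ in range(600):     # simulate one minute
    outflow = drain_rate * level
    level += (inflow - outflow) * dt

# At equilibrium inflow == outflow, so the level settles at inflow / drain_rate = 4.0
print(f"level after 60 s: {level:.3f} litres")
```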

Supply and Demand + Equilibrium

The law of supply and demand, from economics, is also an example of equilibrium at work.

In 1997 Warren Buffett, through his company Berkshire Hathaway, purchased 111.2 million ounces of silver based on his understanding of equilibrium. In his annual letter for that year, he succinctly sums up the investment:

In recent years, bullion inventories have fallen materially, and last summer Charlie (Munger) and I concluded that a higher price would be needed to establish equilibrium between supply and demand.

When supply is too low, the market is out of equilibrium. Buffett (correctly) bet that the only way to bring the market back into equilibrium was rising prices. Demand, the balancing force to supply, can also be the source of successful investments.

In his 2011 shareholder letter, Buffett again illustrates the concept of equilibrium through supply and demand.

Today the world’s gold stock is about 170,000 metric tons. If all of this gold were melded together, it would form a cube of about 68 feet per side. (Picture it fitting comfortably within a baseball infield.) At $1,750 per ounce – gold’s price as I write this – its value would be $9.6 trillion. Call this cube pile A. Let’s now create a pile B costing an equal amount. For that, we could buy all U.S. cropland (400 million acres with output of about $200 billion annually), plus 16 Exxon Mobils (the world’s most profitable company, one earning more than $40 billion annually). After these purchases, we would have about $1 trillion left over for walking-around money (no sense feeling strapped after this buying binge). Can you imagine an investor with $9.6 trillion selecting pile A over pile B?

Beyond the staggering valuation given the existing stock of gold, current prices make today’s annual production of gold command about $160 billion. Buyers – whether jewelry and industrial users, frightened individuals, or speculators – must continually absorb this additional supply to merely maintain an equilibrium at present prices.

A century from now the 400 million acres of farmland will have produced staggering amounts of corn, wheat, cotton, and other crops – and will continue to produce that valuable bounty, whatever the currency may be. Exxon Mobil will probably have delivered trillions of dollars in dividends to its owners and will also hold assets worth many more trillions (and, remember, you get 16 Exxons). The 170,000 tons of gold will be unchanged in size and still incapable of producing anything. You can fondle the cube, but it will not respond.

Admittedly, when people a century from now are fearful, it’s likely many will still rush to gold. I’m confident, however, that the $9.6 trillion current valuation of pile A will compound over the century at a rate far inferior to that achieved by pile B.
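As a rough sanity check on the figures in the quote, here is a back-of-the-envelope calculation. The density of gold and the grams-per-troy-ounce conversion are standard constants; the tonnage and price come from Buffett’s letter.

```python
# Back-of-the-envelope check of the figures in the quote: 170,000 metric tons
# of gold valued at $1,750 per troy ounce, formed into a single cube.

tonnes = 170_000
price_per_oz = 1_750
grams = tonnes * 1_000_000
troy_ounce_grams = 31.1035           # grams per troy ounce
gold_density_g_cm3 = 19.3            # density of gold

value_dollars = grams / troy_ounce_grams * price_per_oz
volume_m3 = grams / gold_density_g_cm3 / 1_000_000     # cm^3 -> m^3
cube_side_feet = volume_m3 ** (1 / 3) * 3.28084

print(f"total value: ${value_dollars / 1e12:.1f} trillion")  # ~ $9.6 trillion
print(f"cube side:   {cube_side_feet:.0f} feet")             # ~ 68 feet
```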

In Boombustology, Mansharamani writes:

Inherent in most equilibrium-oriented approaches is a belief that higher prices generate new supply that tends to push prices down. Likewise, it is believed that lower prices generate new demand that tends to push prices up. In this way, deviations from an appropriate price level are self-correcting.

A grasp of supply and demand can help us make better investment decisions. Producers of undifferentiated goods (e.g., aluminium cans) are usually poor investments because the only way they will earn adequate returns is under conditions of tight supply. If any excess capacity exists in the industry, prices will trend down towards the cost of production, leaving owners with unsatisfactory returns on their investment.

The only real winners are the low-cost producers. As prices trend down, only they can maintain full production, whereas high-cost competitors must cut production, which starts reducing supply and moves the industry towards equilibrium. When business picks up again, as it inevitably does, the production that was once shuttered comes back online. Only low-cost producers can operate throughout the cycle. Opportunities to profit from equilibrium exist when demand outstrips capacity, which usually results from (1) a positive change in demand or (2) a negative change in supply.
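Mansharamani’s self-correcting mechanism can be sketched in a few lines. The linear supply and demand curves and the adjustment speed below are invented for illustration, not taken from Boombustology; the point is only that a price above or below the level where supply meets demand gets pushed back toward it.

```python
# Toy model of a self-correcting price: when demand exceeds supply the price
# rises, and when supply exceeds demand it falls. The linear curves and the
# adjustment speed are illustrative assumptions.

def demand(price):
    return max(0.0, 100 - 2 * price)    # buyers want less as the price rises

def supply(price):
    return max(0.0, 4 * price - 20)     # producers offer more as the price rises

price = 5.0                             # start well below the equilibrium price
for _ in range(200):
    excess_demand = demand(price) - supply(price)
    price += 0.05 * excess_demand       # deviations are self-correcting

# Supply equals demand where 100 - 2p = 4p - 20, i.e. p = 20.
print(f"price converges to about {price:.2f}")
```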

While seductively simple, this model of equilibrium in financial markets is somewhat incomplete. We must consider reflexivity.

George Soros writes, “Reflexivity is, in effect, a two-way feedback mechanism in which reality helps shape the participants’ thinking and the participants’ thinking helps shape reality in an unending process in which thinking and reality may come to approach each other but can never become identical.”

The implications of reflexivity on financial markets are quite profound, particularly with regard to the existence of an equilibrium price. Soros describes these implications in his own words succinctly:

Instead of a tendency towards some kind of theoretical equilibrium, the participants’ views and actual state of affairs enter into a process of dynamic disequilibrium, which may be self-reinforcing at first, moving both thinking and reality in a certain direction, but is bound to become unsustainable in the long run and engender a move in the opposite direction.

Soros’ testimony in 1994 to the House Banking Committee summarizes his theory of reflexivity and how it manifests itself in financial markets:

I must state at the outset that I am in fundamental disagreement with the prevailing wisdom. The generally accepted theory is that markets tend towards equilibrium and on the whole discount the future correctly. I operate using a different theory, according to which financial markets cannot possibly discount the future correctly because they do not merely discount the future, they help to shape it. In certain circumstances, financial markets can affect the so-called fundamentals which they are supposed to reflect. When that happens, markets enter into a state of dynamic disequilibrium and behave quite differently than what would be considered normal by the theory of efficient markets. Such boom/bust sequences do not arise very often, but when they do, they can be very disruptive, precisely because they affect the fundamentals of the economy.

In Boombustology, Mansharamani writes:

… financial extremes are characterized by two primary components: a prevailing trend that exists in reality and a misconception relating to it. He often uses real estate as an example to illustrate this point. The prevailing trend in reality is that there is an increased willingness to lend and a corresponding rise in prices. The misconception relating to this trend is that the prices of real estate are independent of the willingness to lend. Further, as more banks become willing to lend, and the number of buyers therefore rises, the prices of real estate rise—thereby making the banks feel more secure (given higher collateral values) and driving more lending.
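That lending-and-prices loop is easy to caricature in code. The sketch below is a toy model with made-up coefficients, not a formalization of Soros’s theory: lending pushes prices up, higher collateral values encourage more lending, and because no corrective force is modelled, the move away from any equilibrium accelerates.

```python
# Toy model of the reflexive loop: lending bids prices up, and richer collateral
# makes banks more willing to lend. All coefficients are invented, and nothing
# in the model pushes back, so the process is self-reinforcing.

price = 100.0       # property price index
lending = 10.0      # credit extended this period

for year in range(1, 9):
    price += 0.8 * lending                    # more lending bids prices up
    lending *= 1 + 0.004 * (price - 100)      # higher collateral values -> more lending
    print(f"year {year}: price {price:7.1f}, lending {lending:6.1f}")
```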

Feedback Loops and Equilibrium

In Universal Principles of Design, William Lidwell & co. write:

Every action creates an equal and opposite reaction. When reactions loop back to affect themselves, a feedback loop is created. All real-world systems are composed of many such interacting feedback loops — animals, machines, businesses, and ecosystems, to name a few. There are two types of feedback loops: positive and negative. Positive feedback amplifies system output, resulting in growth or decline. Negative feedback dampens output, stabilizing the system around an equilibrium point.

Positive feedback loops are effective for creating change, but generally result in negative consequences if not moderated by negative feedback loops. For example, in response to head and neck injuries in football in the late 1950s, designers created plastic football helmets with internal padding to replace leather helmets. The helmets provided more protection, but induced players to take increasingly greater risks when tackling. More head and neck injuries occurred after the introduction of plastic helmets than before. By concentrating on the problem in isolation (e.g., not considering changes in player behavior), designers inadvertently created a positive feedback loop in which players used their heads and necks in increasingly risky ways. This resulted in more injuries, which resulted in additional redesigns that made the helmet shells harder and more padded, and so on.

Negative feedback loops are effective for resisting change. For example, the Segway Human Transporter uses negative feedback loops to maintain equilibrium. As a rider leans forward or backward, the Segway accelerates or decelerates to keep the system in equilibrium. To achieve this smoothly, the Segway makes hundreds of adjustments every second. Given the high adjustment rate, the oscillations around the point of equilibrium are so small as to not be detectable. However, if fewer adjustments were made per second, the oscillations would increase in size and the ride would become increasingly jerky.
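The link between adjustment rate and wobble can be shown with a toy negative feedback loop. This is not the Segway’s actual control code; the leaning rate, the correction rule, and the function name are invented for illustration.

```python
# Toy negative feedback loop: between corrections the rider's lean keeps
# growing; each correction cancels the measured deviation. More corrections
# per second mean smaller oscillations around the balance point.

def max_wobble(adjustments_per_second, seconds=2.0, lean_rate=10.0):
    dt = 1.0 / adjustments_per_second      # time between corrections
    angle, worst = 0.0, 0.0
    for _ in range(int(seconds / dt)):
        angle += lean_rate * dt            # deviation builds up between corrections
        worst = max(worst, abs(angle))
        angle = 0.0                        # feedback: the correction opposes the deviation
    return worst

for rate in (5, 50, 500):
    print(f"{rate:3d} adjustments/second -> max wobble {max_wobble(rate):.3f} degrees")
```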

Diseases and Equilibrium

Malcolm Gladwell illustrates this in The Tipping Point with a hypothetical outbreak of the flu.

Suppose, for example, that one summer 1,000 tourists come to Manhattan from Canada carrying an untreatable strain of twenty-four-hour virus. This strain of flu has a 2 percent infection rate, which is to say that one out of every 50 people who come into close contact with someone carrying it catches the bug himself. Let’s say that 50 is also exactly the number of people the average Manhattanite — in the course of riding the subways and mingling with colleagues at work — comes into contact with every day. What we have, then, is a disease in equilibrium. Those 1,000 Canadian tourists pass on the virus to 1,000 new people on the day they arrive. And the next day those 1,000 newly infected people pass on the virus to another 1,000 people, just as the original 1,000 tourists who started the epidemic are returning to health. With those getting sick and those getting well so perfectly in balance, the flu chugs along at a steady but unspectacular clip through the rest of the summer and fall.

But then comes the Christmas season. The subways and buses get more crowded with tourists and shoppers, and instead of running into an even 50 people a day, the average Manhattanite now has close contact with, say, 55 people a day. All of a sudden, the equilibrium is disrupted. The 1,000 flu carriers now run into 55,000 people a day and at a 2 percent infection rate, that translates into 1,100 cases the following day. Those 1,100, in turn, are now passing on their virus to 55 people each, so that by day three there are 1,210 Manhattanites with the flu and by day four 1,331 and by the end of the week there are nearly 2,000, and so on up, in an exponential spiral until Manhattan has a full-blown flu epidemic on its hands by Christmas Day. That moment when the average flu carrier went from running into 50 people a day to running into 55 was the Tipping Point. It was the point at which an ordinary and stable phenomenon — a low-level flu outbreak — turned into a public health crisis. If you were to draw a graph of the progress of the Canadian flu epidemic, the Tipping Point would be the point on the graph where it suddenly turned upward.
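Gladwell’s arithmetic is easy to reproduce. The sketch below follows the quote’s simplification: each carrier is sick for one day, meets a fixed number of people, and infects 2 percent of them; the function name is mine.

```python
# Gladwell's flu arithmetic: each carrier is sick for one day, meets `contacts`
# people, and infects 2% of them, so tomorrow's carriers = carriers * contacts * 0.02.

def next_day(carriers, contacts, infection_rate=0.02):
    return carriers * contacts * infection_rate

for contacts in (50, 55):
    carriers = 1_000
    history = [carriers]
    for _ in range(7):
        carriers = next_day(carriers, contacts)
        history.append(round(carriers))
    print(f"{contacts} contacts/day: {history}")

# 50 contacts/day holds steady at 1,000; 55 compounds to nearly 2,000 within a week.
```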

Equilibrium is part of the Farnam Street latticework of Mental Models.

Metaphors

For most people a metaphor is a matter of extraordinary rather than ordinary language. “For this reason,” write Mark Johnson and George Lakoff in their book Metaphors We Live By, “most people think they can get along perfectly well without metaphor.”

We have found, on the contrary, that metaphor is pervasive in everyday life, not just in language but in thought and action. Our ordinary conceptual system, in terms of which we both think and act, is fundamentally metaphorical in nature.


What governs our thought governs our functioning. “Our concepts (even something as simple as the word we use) structure what we perceive, how we get around in the world, and how we relate to other people.”

Since communication is based on the same conceptual system that we use in thinking and acting, language is an important source of evidence for what the system is like.

Most of our ordinary conceptual system is metaphorical in nature.

To give some idea of what it could mean for a concept to be metaphorical and for such a concept to structure an everyday activity, let us start with the concept ARGUMENT and the conceptual metaphor ARGUMENT IS WAR. This metaphor is reflected in our everyday language by a wide variety of expressions:

ARGUMENT IS WAR

Your claims are indefensible.
He attacked every weak point in my argument.
His criticisms were right on target.
I demolished his argument.
I’ve never won an argument with him.
You disagree? Okay, shoot!
If you use that strategy, he’ll wipe you out.
He shot down all of my arguments.

It is important to see that we don’t just talk about arguments in terms of war. We can actually win or lose arguments. We see the person we are arguing with as an opponent. We attack his positions and we defend our own. We gain and lose ground. We plan and use strategies. If we find a position indefensible, we can abandon it and take a new line of attack. Many of the things we do in arguing are partially structured by the concept of war. Though there is no physical battle, there is a verbal battle, and the structure of an argument—attack, defense, counter-attack, etc.—reflects this. It is in this sense that the ARGUMENT IS WAR metaphor is one that we live by in this culture; it structures the actions we perform in arguing.

Try to imagine a culture where arguments are not viewed in terms of war, where no one wins or loses, where there is no sense of attacking or defending, gaining or losing ground. Imagine a culture where an argument is viewed as a dance, the participants are seen as performers, and the goal is to perform in a balanced and aesthetically pleasing way. In such a culture, people would view arguments differently, experience them differently, carry them out differently, and talk about them differently. But we would probably not view them as arguing at all: they would simply be doing something different. It would seem strange even to call what they were doing “arguing.” Perhaps the most neutral way of describing this difference between their culture and ours would be to say that we have a discourse form structured in terms of battle and they have one structured in terms of dance.

This is an example of what it means for a metaphorical concept, namely, ARGUMENT IS WAR, to structure (at least in part) what we do and how we understand what we are doing when we argue. The essence of metaphor is understanding and experiencing one kind of thing in terms of another. It is not that arguments are a subspecies of war. Arguments and wars are different kinds of things–verbal discourse and armed conflict–and the actions performed are different kinds of actions. But ARGUMENT is partially structured, understood, performed, and talked about in terms of WAR. The concept is metaphorically structured, the activity is metaphorically structured, and, consequently, the language is metaphorically structured.

In Investing: The Last Liberal Art, Robert Hagstrom writes:

At the simplest level, a metaphor is a way to convey meaning using out-of-ordinary, nonliteral language. When we say that “work was a living hell,” we don’t really mean to say that we spent the day beating back fire and shoveling ashes, but rather we want to communicate, in no uncertain terms, that it was a hard day at the office. Used this way, a metaphor is a concise, memorable, and often colorful way to express emotions. In a deeper sense, metaphors represent not only language but also thought and action.

Metaphors are much more than poetic imagination or rhetorical flourish. They can help us translate ideas into mental models, and those models form the basis of worldly wisdom.

Many people contend that metaphors are necessary to stimulate new ideas. Hagstrom continues:

In the same way that a metaphor helps communicate one concept by comparing it to another concept that is widely understood, using a simple model to describe one idea can help us grasp the complexities of a similar idea. In both cases we are using one concept (the source) to better understand another (the target). Used this way, metaphors not only express existing ideas, they stimulate new ones.

Insensitivity To Base Rates: An Introduction

In statistics, a base rate refers to the percentage of a population (e.g., grasshoppers, people who live in New York, newborn babies) that has a given characteristic. Given a random individual and no additional information, the base rate tells us the likelihood of them exhibiting that characteristic. For instance, around 10% of people are left-handed. If you selected a random person and had no information related to their handedness, you could safely guess there to be a 1 in 10 chance of them being left-handed.

When we make estimations, we often fail to consider the influence of base rates. This is a common psychological bias and is related to the representativeness heuristic.

From Smart Choices: A Practical Guide to Making Better Decisions:

Donald Jones is either a librarian or a salesman. His personality can best be described as retiring. What are the odds that he is a librarian?

When we use this little problem in seminars, the typical response goes something like this: “Oh, it’s pretty clear that he’s a librarian. It’s much more likely that a librarian will be retiring; salesmen usually have outgoing personalities. The odds that he’s a librarian must be at least 90 percent.” Sounds good, but it’s totally wrong.

The trouble with this logic is that it neglects to consider that there are far more salesmen than male librarians. In fact, in the United States, salesmen outnumber male librarians 100 to 1. Before you even considered the fact that Donald Jones is “retiring,” therefore, you should have assigned only a 1 percent chance that Jones is a librarian. That is the base rate.

Now, consider the characteristic “retiring.” Suppose half of all male librarians are retiring, whereas only 5 percent of salesmen are. That works out to 10 retiring salesmen for every retiring librarian — making the odds that Jones is a librarian closer to 10 percent than to 90 percent. Ignoring the base rate can lead you wildly astray.
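The arithmetic in that passage is just Bayes’ rule applied to the base rate. A short sketch using only the figures given in the quote (100 salesmen per male librarian, 50 percent of librarians retiring, 5 percent of salesmen):

```python
# The librarian/salesman arithmetic as a Bayes calculation, using the figures
# from the quote.

librarians, salesmen = 1, 100                 # base rates as relative counts
retiring_librarians = librarians * 0.50       # 0.5
retiring_salesmen = salesmen * 0.05           # 5.0

p_librarian_given_retiring = retiring_librarians / (retiring_librarians + retiring_salesmen)
print(f"P(librarian | retiring) = {p_librarian_given_retiring:.1%}")   # about 9%
```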

* * *

Charlie Munger instructs us how to think about base rates with an example of an employee who was caught stealing and claims she’s never done it before and will never do it again:

You find an isolated example of a little old lady in the See’s Candy Company, one of our subsidiaries, getting into the till. And what does she say? “I never did it before, I’ll never do it again. This is going to ruin my life. Please help me.” And you know her children and her friends, and she’d been around 30 years and standing behind the candy counter with swollen ankles. When you’re an old lady it isn’t that glorious a life. And you’re rich and powerful and there she is: “I never did it before, I’ll never do it again.” Well how likely is it that she never did it before? If you’re going to catch 10 embezzlements a year, what are the chances that any one of them — applying what Tversky and Kahneman called base rate information — will be somebody who only did it this once? And the people who have done it before and are going to do it again, what are they all going to say? Well in the history of the See’s Candy Company they always say, “I never did it before, and I’m never going to do it again.” And we cashier them. It would be evil not to, because terrible behavior spreads (Gresham’s law).

* * *

Max Bazerman, in Judgment in Managerial Decision Making, writes:

(Our tendency to ignore base rates) is even stronger when the specific information is vivid and compelling, as Kahneman and Tversky illustrated in one study from 1972. Participants were given a brief description of a person who enjoyed puzzles and was both mathematically inclined and introverted. Some participants were told that this description was selected from a set of seventy engineers and thirty lawyers. Others were told that the description came from a list of thirty engineers and seventy lawyers. Next, participants were asked to estimate the probability that the person described was an engineer. Even though people admitted that the brief description did not offer a foolproof means of distinguishing lawyers from engineers, most tended to believe the description was of an engineer. Their assessments were relatively impervious to differences in base rates of engineers (70 percent versus 30 percent of the sample group.)

Participants do use base-rate data correctly when no other information is provided. In the absence of a personal description, people use the base rates sensibly and believe that a person picked at random from a group made up mostly of lawyers is most likely to be a lawyer. Thus, people understand the relevance of base-rate information, but tend to disregard such data when individuating data are also available.

Ignoring base rates has many unfortunate implications. … Similarly, unnecessary emotional distress is caused in the divorce process because of the failure of couples to create prenuptial agreements that facilitate the peaceful resolution of a marriage. The suggestion of a prenuptial agreement is often viewed as a sign of bad faith. However, in far too many cases, the failure to create prenuptial agreements occurs when individuals approach marriage with the false belief that the high base rate for divorce does not apply to them.

* * *

Of course, this applies to investing as well. This conversation with Sanjay Bakshi speaks to the point:

One of the great lessons from studying history is to do with “base rates”. “Base rate” is a technical term for describing odds in terms of prior probabilities. The base rate of having a drunken-driving accident is higher than that of having an accident while sober.

So, what’s the base rate of investing in IPOs? When you buy a stock in an IPO, and if you flip it, you make money if it’s a hot IPO. If it’s not a hot IPO, you lose money. But what’s the base rate – the averaged out experience – the prior probability of the activity of subscribing for IPOs – in the long run?

If you do that calculation, you’ll find that the base rate of IPO investing (in fact, it’s not even investing … it’s speculating) sucks! [T]hat’s the case, not just in India, but in every market, in different time periods.

[…]

When you evaluate whether smoking is good for you or not, if you look at the average experience of 1,000 smokers and compare them with 1,000 non-smokers, you’ll see what happens.

People don’t do that. They get influenced by individual stories like a smoker who lived till he was 95. Such a smoker will force many people to ignore base rates, and to focus on his story, to fool themselves into believing that smoking can’t be all that bad for them.

What is the base rate of investing in leveraged companies in bull markets?

[…]

This is what you learn by studying history. You know that the base rate of investing in an airline business sucks. There’s this famous joke about how to become a millionaire. You start with a billion, and then you buy an airline. That applies very well in this business. It applies in so many other businesses.

Take the paper industry as an example. Averaged out returns on capital for paper industry are bad for pretty good reasons. You are selling a commodity. It’s an extremely capital intensive business. There’s a lot of over-capacity. And if you understand microeconomics, you really are a price taker. There’s no pricing power for you. Extreme competition in such an environment is going to cause your returns on capital to be below what you would want to have.

It’s not hard to figure this out (although I took a while to figure it out myself). Look at the track record of paper companies around the world, and the airline companies around the world, or the IPOs around the world, or the textile companies around the world. Sure, there’ll be exceptions. But we need to focus on the average experience and not the exceptional ones. The metaphor I like to use here is that of a pond. You are the fisherman. If you want to catch a lot of fish, then you must go to a pond where there’s a lot of fish. You don’t want to go to fish in a pond where there’s very little fish. You may be a great fisherman, but unless you go to a pond where there’s a lot of fish, you are not going to find a lot of fish.

[…]

So one of the great lessons from studying history is to see what has really worked well and what has turned out to be a disaster – and to learn from both.

***

Bias from Insensitivity To Base Rates is part of the Farnam Street Latticework of Mental Models.

Mental Model: Game Theory

From Game Theory, by Morton Davis:

The theory of games is a theory of decision making. It considers how one should make decisions and to a lesser extent, how one does make them. You make a number of decisions every day. Some involve deep thought, while others are almost automatic. Your decisions are linked to your goals—if you know the consequences of each of your options, the solution is easy. Decide where you want to be and choose the path that takes you there. When you enter an elevator with a particular floor in mind (your goal), you push the button (one of your choices) that corresponds to your floor. Building a bridge involves more complex decisions but, to a competent engineer, is no different in principle. The engineer calculates the greatest load the bridge is expected to bear and designs a bridge to withstand it. When chance plays a role, however, decisions are harder to make. … Game theory was designed as a decision-making tool to be used in more complex situations, situations in which chance and your choice are not the only factors operating. … (Game theory problems) differ from the problems described earlier—building a bridge and installing telephones—in one essential respect: While decision makers are trying to manipulate their environment, their environment is trying to manipulate them. A store owner who lowers her price to gain a larger share of the market must know that her competitors will react in kind. … Because everyone’s strategy affects the outcome, a player must worry about what everyone else does and knows that everyone else is worrying about him or her.

What is a game? From Game Theory and Strategy:

Game theory is the logical analysis of situations of conflict and cooperation. More specifically, a game is defined to be any situation in which:

  1. There are at least two players. A player may be an individual, but it may also be a more general entity like a company, a nation, or even a biological species.
  2. Each player has a number of possible strategies, courses of action which he or she may choose to follow.
  3. The strategies chosen by each player determine the outcome of the game.
  4. Associated to each possible outcome of the game is a collection of numerical payoffs, one to each player. These payoffs represent the value of the outcome to the different players.

…Game theory is the study of how players should rationally play games. Each player would like the game to end in an outcome which gives him as large a payoff as possible.

From Greg Mankiw’s Economics textbook:

Game theory is the study of how people behave in strategic situations. By ‘strategic’ we mean a situation in which each person, when deciding what actions to take, must consider how others might respond to that action. Because the number of firms in an oligopolistic market is small, each firm must act strategically. Each firm knows that its profit depends not only on how much it produces but also on how much the other firms produce. In making its production decision, each firm in an oligopoly should consider how its decision might affect the production decisions of all other firms.

Game theory is not necessary for understanding competitive or monopoly markets. In a competitive market, each firm is so small compared to the market that strategic interactions with other firms are not important. In a monopolized market, strategic interactions are absent because the market has only one firm. But, as we will see, game theory is quite useful for understanding the behavior of oligopolies.

A particularly important ‘game’ is called the prisoners’ dilemma.

Markets with only a few sellers

Because an oligopolistic market has only a small group of sellers, a key feature of oligopoly is the tension between cooperation and self-interest. The oligopolists are best off when they cooperate and act like a monopolist – producing a small quantity of output and charging a price above marginal cost. Yet because each oligopolist cares only about its own profit, there are powerful incentives at work that hinder a group of firms from maintaining the cooperative outcome.

Avinash Dixit and Barry Nalebuff, in their book Thinking Strategically, offer:

Everyone’s best choice depends on what others are going to do, whether it’s going to war or maneuvering in a traffic jam.

These situations, in which people’s choices depend on the behavior or the choices of other people, are the ones that usually don’t permit any simple summation. Rather we have to look at the system of interaction.

Michael J. Mauboussin relates game theory to firm interaction:

How a firm interacts with other firms plays an important role in shaping sustainable value creation. Here we not only consider how companies interact with their competitors, but also how companies can co-evolve.

Game Theory is one of the best tools to understand interaction. Game Theory forces managers to put themselves in the shoes of other players rather than viewing games solely from their own perspective.

The classic two-player example of game theory is the prisoners’ dilemma.
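Here is a minimal sketch of the prisoners’ dilemma in Python, using standard illustrative payoffs (years in prison, so lower is better); the specific numbers are mine, not from any of the sources above. Whatever the other player does, defecting is the better response, which is exactly the tension Mankiw describes for oligopolists trying to sustain cooperation.

```python
# Minimal prisoners' dilemma. Payoffs are years in prison (lower is better);
# the numbers are illustrative, the structure is what matters.

payoffs = {  # (my_action, their_action) -> my_years_in_prison
    ("cooperate", "cooperate"): 1,
    ("cooperate", "defect"):    10,
    ("defect",    "cooperate"): 0,
    ("defect",    "defect"):    5,
}
actions = ("cooperate", "defect")

def best_response(their_action):
    return min(actions, key=lambda mine: payoffs[(mine, their_action)])

for their_action in actions:
    print(f"if the other player plays {their_action:9s}, my best response is "
          f"{best_response(their_action)}")

# Both lines say "defect": (defect, defect) is the equilibrium, costing 5 years
# each, even though (cooperate, cooperate) would cost only 1 year each.
```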

Game Theory is part of the Farnam Street latticework of Mental Models. See all posts on game theory.

The Red Queen Effect: Avoid Running Faster and Faster Only to Stay in the Same Place

Charles Lutwidge Dodgson (1832-1898), better known by his pseudonym Lewis Carroll, was not only an author but a keen observer of human nature. His most famous works are Alice’s Adventures in Wonderland and its sequel Through the Looking Glass, which have become timeless classics.

“Bees have to move very fast to stay still.”

— David Foster Wallace

In Through the Looking Glass, Alice, a young girl, gets schooled by the Red Queen in an important life lesson that many of us fail to heed. Alice finds herself running faster and faster but staying in the same place.

Alice never could quite make out, in thinking it over afterwards, how it was that they began: all she remembers is, that they were running hand in hand, and the Queen went so fast that it was all she could do to keep up with her: and still the Queen kept crying ‘Faster! Faster!’ but Alice felt she could not go faster, though she had not breath left to say so.

The most curious part of the thing was, that the trees and the other things round them never changed their places at all: however fast they went, they never seemed to pass anything. ‘I wonder if all the things move along with us?’ thought poor puzzled Alice. And the Queen seemed to guess her thoughts, for she cried, ‘Faster! Don’t try to talk!’

Eventually, the Queen stops running and props Alice up against a tree, telling her to rest.

Alice looked round her in great surprise. ‘Why, I do believe we’ve been under this tree the whole time! Everything’s just as it was!’

‘Of course it is,’ said the Queen, ‘what would you have it?’

‘Well, in our country,’ said Alice, still panting a little, ‘you’d generally get to somewhere else — if you ran very fast for a long time, as we’ve been doing.’

‘A slow sort of country!’ said the Queen. ‘Now, here, you see, it takes all the running you can do, to keep in the same place.

If you want to get somewhere else, you must run at least twice as fast as that!’

“It is not the strongest of the species that survives,
nor the most intelligent,
but the one most responsive to change.”

— Charles Darwin

Smarter, Not Harder

The Red Queen Effect means we can’t be complacent or we’ll fall behind. To survive another day we have to run very fast and hard; we need to co-evolve with the systems we interact with.

If all animals evolved at the same rate, there would be no change in the relative interactions between species. However, not all animals evolve at the same rate. As Darwin observed, some are more “responsive to change” than others. Species that are more responsive to change can gain a relative advantage over the ones they compete with and increase the odds of survival. In the short run, these small gains don’t make much of a difference, but as generations pass the advantage can compound. A compounding advantage… that sounds nice.

Everyone from entrepreneurs and Fortune 500 CEOs to best-selling authors and middle managers is embedded in their own Red Queen race. Rather than run harder, wouldn’t it be nice to run smarter?

Here are just three of the ways we try to avoid the Red Queen.

  1. We invest significantly in new product development and content. Our courses evolve quickly, incorporating student-tested concepts that work and reducing the importance of the ones that don’t. Another example, our learning community, adds real-world value to people who make decisions by discussing time-tested principles. This is not a popular path, as it’s incredibly expensive in time and money. Standing still, however, is more expensive. We’re not in the business of edutainment but rather of providing better outcomes. If we fail to keep getting better, we won’t exist.
  2. We try to spend our limited mental resources working on things that won’t change next week. We call these mental models and the ones we want to focus on are the ones that stand the test of time.
  3. We recognize how the world works and not how we want it to work. When the world isn’t working the way we’d like it to, it’s easy to say the world is wrong and sit back to see what happens. You know what happens right? You fall behind and it’s even harder to catch up. It’s like you’re on a plane. When you’re flying into the wind you have to work very hard. When you’re flying with the wind at your back, you need to expend less energy and you get there earlier. Recognizing reality and adapting your behavior creates a tailwind.

More Examples of the Red Queen Effect

In Deep Simplicity, John Gribbin describes the Red Queen principle with frogs.

There are lots of ways in which the frogs, who want to eat flies, and the flies, who want to avoid being eaten, interact. Frogs might evolve longer tongues, for fly-catching purposes; flies might evolve faster flight, to escape. Flies might evolve an unpleasant taste, or even excrete poisons that damage the frogs, and so on. We’ll pick one possibility. If a frog has a particularly sticky tongue, it will find it easier to catch flies. But if flies have particularly slippery bodies, they will find it easier to escape, even if the tongue touches them. Imagine a stable situation in which a certain number of frogs live on a pond and eat a certain proportion of the flies around them each year.

Because of a mutation a frog develops an extra sticky tongue. It will do well, compared with other frogs, and genes for extra sticky tongues will spread through the frog population. At first, a larger proportion of flies gets eaten. But the ones who don’t get eaten will be the more slippery ones, so genes for extra slipperiness will spread through the fly population. After a while, there will be the same number of frogs on the pond as before, and the same proportion of flies will be eaten each year. It looks as if nothing has changed – but the frogs have got stickier tongues, and the flies have got more slippery bodies.
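Gribbin’s story can be caricatured in a few lines. All of the numbers and the selection rule below are invented; the point is only that both traits ratchet upward while the capture rate, which depends on their difference, ends up roughly where it started.

```python
# Caricature of the frogs-and-flies arms race: whichever side is currently at a
# disadvantage improves, so both traits rise while the interaction between the
# populations stays roughly constant. All numbers are illustrative.

stickiness, slipperiness = 1.0, 1.0

def capture_rate(sticky, slippery):
    return 0.3 + 0.1 * (sticky - slippery)   # only the relative advantage matters

for _ in range(50):                          # fifty generations
    if capture_rate(stickiness, slipperiness) > 0.3:
        slipperiness += 0.05                 # the slipperier flies are the ones left to breed
    else:
        stickiness += 0.05                   # stickier-tongued frogs eat better and breed

print(f"stickiness {stickiness:.2f}, slipperiness {slipperiness:.2f}, "
      f"capture rate {capture_rate(stickiness, slipperiness):.2f}")
```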

Drugs and disease also represent an “arms race.”

Siddhartha Mukherjee, in his Pulitzer Prize-winning book The Emperor of All Maladies, describes this in the context of drugs and cancer.

In August 2000, Jerry Mayfield, a forty-one-year-old Louisiana policeman diagnosed with CML, began treatment with Gleevec. Mayfield’s cancer responded briskly at first. The fraction of leukemic cells in his bone marrow dropped over six months. His blood count normalized and his symptoms improved; he felt rejuvenated—“like a new man [on] a wonderful drug.” But the response was short-lived. In the winter of 2003, Mayfield’s CML stopped responding. Moshe Talpaz, the oncologist treating Mayfield in Houston, increased the dose of Gleevec, then increased it again, hoping to outpace the leukemia. But by October of that year, there was no response. Leukemia cells had fully recolonized his bone marrow and blood and invaded his spleen. Mayfield’s cancer had become resistant to targeted therapy…

… Even targeted therapy, then, was a cat-and-mouse game. One could direct endless arrows at the Achilles’ heel of cancer, but the disease might simply shift its foot, switching one vulnerability for another. We were locked in a perpetual battle with a volatile combatant. When CML cells kicked Gleevec away, only a different molecular variant would drive them down, and when they outgrew that drug, then we would need the next-generation drug. If the vigilance was dropped, even for a moment, then the weight of the battle would shift. In Lewis Carroll’s Through the Looking-Glass, the Red Queen tells Alice that the world keeps shifting so quickly under her feet that she has to keep running just to keep her position. This is our predicament with cancer: we are forced to keep running merely to keep still.

This doesn’t only happen in nature; there are many business examples as well.

In describing the capital investment needed to maintain a relative placement in the textile industry, Warren Buffett writes:

Over the years, we had the option of making large capital expenditures in the textile operation that would have allowed us to somewhat reduce variable costs. Each proposal to do so looked like an immediate winner. Measured by standard return-on-investment tests, in fact, these proposals usually promised greater economic benefits than would have resulted from comparable expenditures in our highly-profitable candy and newspaper businesses.

But the promised benefits from these textile investments were illusory. Many of our competitors, both domestic and foreign, were stepping up to the same kind of expenditures and, once enough companies did so, their reduced costs became the baseline for reduced prices industrywide. Viewed individually, each company’s capital investment decision appeared cost-effective and rational; viewed collectively, the decisions neutralized each other and were irrational (just as happens when each person watching a parade decides he can see a little better if he stands on tiptoes). After each round of investment, all the players had more money in the game and returns remained anemic.

In other words, more and more money is needed just to maintain your relative position in the industry and stay in the game. This situation plays out over and over again and brings with it many ripple effects. For example, a company preoccupied with maintaining its relative position in a poor industry commits resources that are almost assured to earn a poor return on capital.

Inflation also causes a Red Queen Effect. Here’s Buffett again:

Unfortunately, earnings reported in corporate financial statements are no longer the dominant variable that determines whether there are any real earnings for you, the owner. For only gains in purchasing power represent real earnings on investment. If you (a) forego ten hamburgers to purchase an investment; (b) receive dividends which, after tax, buy two hamburgers; and (c) receive, upon sale of your holdings, after-tax proceeds that will buy eight hamburgers, then (d) you have had no real income from your investment, no matter how much it appreciated in dollars. You may feel richer, but you won’t eat richer.

High rates of inflation create a tax on capital that makes much corporate investment unwise—at least if measured by the criterion of a positive real investment return to owners. This “hurdle rate,” the return on equity that must be achieved by a corporation in order to produce any real return for its individual owners, has increased dramatically in recent years. The average tax-paying investor is now running up a down escalator whose pace has accelerated to the point where his upward progress is nil.
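Buffett’s hamburger arithmetic, written out. The 25 percent rise in the hamburger price is an invented figure to make the example concrete; the ten-two-eight structure comes straight from the quote.

```python
# Real earnings are gains in purchasing power: a 25% nominal gain that merely
# keeps pace with hamburger prices buys no extra hamburgers.

burger_price_start = 1.00
burger_price_end = 1.25                          # prices rose 25% while you held

invested = 10 * burger_price_start               # you forwent ten hamburgers
dividends_after_tax = 2 * burger_price_end       # enough to buy two hamburgers
sale_proceeds_after_tax = 8 * burger_price_end   # enough to buy eight hamburgers

nominal_gain = dividends_after_tax + sale_proceeds_after_tax - invested
burgers_you_can_buy = (dividends_after_tax + sale_proceeds_after_tax) / burger_price_end

print(f"nominal gain: ${nominal_gain:.2f} on ${invested:.2f}")   # looks like +25%
print(f"hamburgers you can now buy: {burgers_you_can_buy:.0f}")  # still just 10
```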

The Red Queen is part of the Farnam Street latticework of mental models.

Sources:
– The excellent Sanjay Bakshi
– Wikipedia
– Through the Looking Glass

The Feynman Technique: The Best Way to Learn Anything

If you’re after a way to supercharge your learning and become smarter, The Feynman Technique might just be the best way to learn absolutely anything.

In this post, we’ll look at the method Nobel prize-winning physicist Richard Feynman pioneered to ensure he understood anything he studied better than anyone else and which you can use to get ahead.

***

There are four steps to the Feynman Learning Technique:

  1. Choose a concept you want to learn about
  2. Pretend you are teaching it to a student in grade 6
  3. Identify gaps in your explanation; go back to the source material to better understand it
  4. Review and simplify (optional)

***

If you’re not learning, you’re standing still. But how do we get feedback on what we’re learning? And how do we go about learning new subjects and identifying gaps in our existing knowledge?

Two Types of Knowledge

Feynman understood the difference between knowing something and knowing the name of something, and it’s one of the most important reasons for his success. Most of us focus on the wrong type of knowledge. The first type focuses on knowing the name of something — what it’s called. The second focuses on actually knowing something — that is, understanding it.

“The person who says he knows what he thinks but cannot express it usually does not know what he thinks.”

— Mortimer Adler

The Feynman Technique

Step 1: Teach it to a child

Take out a blank sheet of paper. At the top write the subject you want to learn. Now write out everything you know about the subject you want to understand as if you were teaching it to a child. Not your smart adult friend, but rather a 12-year-old who has just enough vocabulary and attention span to understand basic concepts and relationships.

It turns out that one of the ways we trick ourselves is by using complicated vocabulary and jargon to mask our lack of understanding.

When you write out an idea from start to finish in simple language that a child can understand, you force yourself to understand the concept at a deeper level and simplify relationships and connections between ideas.

Some of this will be easy. These are the places where you have a clear understanding of the subject. At other points, you will struggle. These are the points where you have some gaps in your understanding.

Step 2: Review

Only when you encounter gaps in your knowledge—where you forget something important, are not able to explain it, or simply have trouble thinking of how variables interact—can you really start learning.

Now that you know where you got stuck, go back to the source material and re-learn it until you can explain it in basic terms. Only when you can explain your understanding without jargon and in simple terms can you demonstrate your understanding. This is the work required to learn, and skipping it leads to the illusion of knowledge.

Identifying the boundaries of your understanding also limits the mistakes you’re liable to make and increases your chance of success when applying knowledge.

Step 3: Organize and Simplify

Now you have a set of hand-crafted notes. Review them to make sure you didn’t mistakenly borrow any of the jargon from the source material. Organize them into a simple narrative that you can tell. Read it out loud. If the explanation isn’t simple or sounds confusing, that’s a good indication that your understanding in that area still needs some work.

If you follow this approach over and over, you will end up with a binder full of pages on different subjects. If you take some time twice a year to go through this binder, you will find just how much you retain.

Step 4 (Optional): Transmit

If you really want to be sure of your understanding, run it past someone (ideally someone who knows little of the subject – or find that 8-year-old!). The ultimate test of your knowledge is your capacity to convey it to another.

***

Not only is the Feynman Technique a wonderful recipe for learning, but it’s also a window into a different way of thinking that allows you to tear ideas apart and reconstruct them from the ground up.

When you’re having a conversation with someone and they start using words or relationships that you don’t understand, ask them to explain it to you like you’re 12.

Not only will you supercharge your own learning, but you’ll also supercharge theirs. Importantly, approaching problems in this way allows you to understand when others don’t know what they are talking about. (See Batesian Mimicry)

Feynman’s approach rests on the belief that intelligence is a process of growth, which dovetails nicely with the work of Carol Dweck, who beautifully describes the difference between a fixed and a growth mindset.