It is a fundamental law of nature that to evolve one has to push one’s limits, which is painful—whether by lifting weights, facing problems head-on, or in any other way. Nature gave us pain as a messaging device to tell us that we are approaching, or have exceeded, our limits in some way. At the same time, nature made the process of getting stronger require us to push those limits: gaining strength is the adaptation of body and mind to encountering them. In other words, both pain and strength typically result from encountering our barriers. When we encounter pain, we are at an important juncture in our decision-making process.
Most people react to pain badly. They have “fight or flight” reactions to it: they either strike out at whatever brought them the pain or they try to run away from it. As a result, they don’t learn to find ways around their barriers, so they encounter them over and over again and make little or no progress toward what they want.
“Big populations don’t go extinct. Small populations do.
It’s not a surprising finding but it is a significant one.”
Why do small populations go extinct?
While the answer is simple to outline, the scientific details are more nuanced. For now, let’s stick to the outline version.
“Small populations go extinct because (1) all populations fluctuate in size from time to time, under the influence of two kinds of factors, which ecologists refer to as deterministic and stochastic; and (2) small populations, unlike big ones, stand a good chance of fluctuating to zero, since zero is not far away.”
Deterministic factors are those involving straightforward cause-and-effect relations that to some extent can be predicted and controlled: hunting, trapping, destroying habitat, introducing new animals that compete with or prey on existing ones, etc.
Stochastic factors “operate in a realm beyond human prediction and control, either because they are truly random or because they are linked to geophysical or biological causes so obscurely complex that they seem random.” We’re talking things like weather patterns, epidemic disease, infestation of parasites, forest fires, etc. Each might cause a downward fluctuation in the population of some species.
In The Song of the Dodo, David Quammen gives the following illuminating example.
Think of two species that live on the same tiny island. One is a mouse. Total population, ten thousand. The other is an owl. Total population, eighty. The owl is a fierce and proficient mouse eater. The mouse is timorous, fragile, easily victimized. But the mouse population as a collective entity enjoys the security of numbers.
Say that a three-year drought hits the island of owls and mice, followed by a lightning-set fire, accidental events that are hurtful to both species. The mouse population drops to five thousand, the owl population to forty. At the height of the next breeding season a typhoon strikes, raking the treetops and killing an entire generation of unfledged owls. Then a year passes peacefully, during which the owl and the mouse populations both remain steady, with attrition from old age and individual mishaps roughly offset by new births. Next, the mouse suffers an epidemic disease, cutting its population to a thousand, fewer than at any other time within decades. This extreme slump even affects the owl, which begins starving for lack of prey.
Weakened by hunger, the owl suffers its own epidemic, from a murderous virus. Only fourteen birds survive. Just six of those fourteen owls are female, and three of the six are too old to breed. Then a young female owl chokes to death on a mouse. That leaves two fertile females. One of them loses her next clutch of eggs to a snake. The other nests successfully and manages to fledge four young, all four of which happen to be male. The owl population is now depressed to a point of acute vulnerability. Two breeding females, a few older females, a dozen males. Collectively they possess insufficient genetic diversity for adjusting to further troubles, and there is a high chance of inbreeding between mothers and sons. The inbreeding, when it occurs, tends to yield some genetic defects. Meanwhile the mouse population is also depressed far below its original number.
Ten years pass, with the owl population becoming progressively less healthy because of inbreeding. A few further females are hatched, precious additions to the gender balance, though some of them turn out to be congenitally infertile. During that same stretch of time the mouse population rebounds vigorously. Good weather, plenty of food, no epidemics, genetically it’s fine—and so the mouse quickly returns to its former abundance.
Then another wildfire scorches the island, killing four adult owls, and, oh, six thousand mice. The four dead owls were all breeding-age females, crucial to the beleaguered population. The six thousand mice were demographically less crucial. Among the owls there now remains only one female who is young and fertile. She develops ovarian cancer, a problem to which she is susceptible because of the history of inbreeding among her ancestors. She dies without issue. Very bad news for the owl species. Let’s give the mouse another plague of woe, just to be fair: a respiratory infection, contagious and lethal, causes eight hundred fatalities. None of this is implausible. These things happen. The owl population—reduced to a dozen mopey males, several dowagers, no fertile females—is doomed to extinction. When the males and the dowagers die off, one by one, leaving no offspring, that’s that. The mouse population fluctuates upward in response to the extinction of the owls, a rude signal that life is easier in the absence of predation. Twelve thousand mice. Fifteen thousand. Twenty thousand. But while its numbers are so high it will probably overexploit its own resources and eventually decline again as a consequence of famine. Then rise again. Then decline again. Then …
The mouse population is a yo-yo on a long string. Despite all the accidental disasters, despite all the ups and downs, the mouse doesn’t go extinct because the mouse is not rare. The owl goes extinct. Why? Because life is a gauntlet of uncertainties and the owl’s population size, in the best of times, was too small to buffer it against the worst of times.
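Quammen’s narrative can be sketched as a toy simulation. The code below is a rough illustration with made-up fluctuation parameters (not ecological data): it subjects two starting populations to the same regime of random good and bad years plus occasional catastrophes, and the small population goes extinct far more often, simply because zero is close by.

```python
import random

def extinction_probability(start_pop, years=100, trials=1000, seed=42):
    """Estimate how often a population hits zero within `years`,
    given random year-to-year fluctuations plus rare catastrophes.
    All parameters are illustrative, not field data."""
    rng = random.Random(seed)
    extinctions = 0
    for _ in range(trials):
        pop = start_pop
        for _ in range(years):
            factor = rng.uniform(0.7, 1.35)   # ordinary good and bad years
            if rng.random() < 0.05:           # rare disaster: fire, epidemic
                factor *= 0.5
            pop = int(pop * factor)
            if pop == 0:                      # zero is absorbing: extinct
                extinctions += 1
                break
    return extinctions / trials

# Same random regime, very different fates:
owl_risk = extinction_probability(80)        # small population
mouse_risk = extinction_probability(10_000)  # large population
```

Under identical disturbances, the population of eighty faces a substantial chance of wandering down to zero within a century, while the population of ten thousand almost never does; its fluctuations are buffered by sheer distance from zero.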
Still curious? Read The Song of the Dodo.
Daniel Lieberman, author of The Evolution of the Human Head, sat down with the NYT for an interesting conversation.
Some years ago, I was doing an experiment where I put pigs on treadmills. The goal was to learn how running stressed the bones in the head. One day, a colleague, Dennis Bramble, walked into the lab, watched what was going on, and declared, “You know, that pig can’t hold its head still!”
This was my “eureka!” moment. I’d observed pigs on treadmills for hundreds of hours and had never thought about this. So Dennis and I started talking about how, when these pigs ran, their heads bobbed every which way and how running humans are really adept at stabilizing their heads. We realized that there were special features in the human neck that enable us to keep our heads still. That gives us an evolutionary advantage because it helps us avoid falls and injuries. And this seemed like evidence of natural selection in our ability to run, an important factor in how we became hunters rather than just foragers and got access to richer foods, which fueled the evolution of our big brains.
So I got interested in how we developed these stable heads. I’m a runner myself. It’s always interesting to study one’s passion. By 2004, we’d found enough evidence to publish a paper in Nature where we declared, “Humans were born to run.” We cited the many dozens of adaptations in the human body that had made us into superlative endurance runners, even compared to dogs and horses.
Before bows and arrows and before horses were tamed, we did “persistence hunting” where we ran kudu, wildebeest and zebra into exhaustion. These animals can’t pant when they gallop. They overheat. People would find a big animal and chase it till it collapsed. You need no technology to do this, just the ability to run long distances, which all of us have.
You can see proof of this capability every November when 45,000 people run for many hours through the streets of New York.
In his treatise The Evolution of the Human Head, Daniel Lieberman sets out to explain how the human head works, and why our heads evolved in this peculiarly human way.
The Cockroach Papers by Richard Schweid is an interesting book if you are looking to learn more about biology or evolution. Cockroaches are built for survival no matter what the world throws at them. Their ability to adapt is just amazing.
Here are some of my notes from the book.
Food and Water
German cockroaches, Blattella germanica, the most common domestic roach in the United States, have been observed to live 45 days without food, and more than two weeks with neither food nor water.
Cockroaches will eat almost anything including glue, feces, hair, decayed leaves, paper, leather, banana skins, other cockroaches, and dead or alive humans. They will not, however, eat cucumbers. They are particularly fond of dried milk around a baby’s mouth.
The roaches are not confined to any particular environment and live in a tremendous variety of places, from underneath woodpiles in Alaska to high in the jungle canopy in the tropics of Costa Rica. They are even found in the caves of Borneo and under the thorn bushes in arid stretches of Kenya. Wherever they live, they are masters at surviving. They are, Schweid writes, “undeniably one of the pinnacles of evolution on this planet.”
Why is it so hard to kill a cockroach with your shoe?
Schweid observes that “when a cockroach feels a breeze stirring the hairs on its cerci, it does not wait around to see what is going to happen next, but leaves off whatever it is doing and goes immediately into escape mode in something remarkably close to instantaneous fashion.” Studies show that a cockroach can respond in about 1/20th of a second, so “by the time a light comes on and human sight can register it, much less react by reaching for and hoisting something with which to squash it, a roach is already locomoting towards safety.”
Cockroach blood is a pigmentless, clear substance circulating through the interior of its body, and what usually spurts out of a roach when its hard outer shell—its exoskeleton—is penetrated or squashed is a cream-colored substance resembling nothing so much as pus or smegma.
Cockroaches have two brains—one inside their heads, and a second, more primitive brain back near their abdomen.
Schweid says, “Pheromones, chemical signals of sexual readiness, operate between a male and female cockroach to initiate courtship and copulation. A sexually receptive female assumes a posture with her abdomen lowered and her wings raised and gives off a pheromone that attracts males.” If he finds a virgin female, a male cockroach, after some antenna-rubbing foreplay, will turn away from the female and raise his wings, “an invitation to her to mount.” Copulation frequently lasts an hour. After sex, female cockroaches store the sperm and use them as needed. The sperm may last her a lifetime.
“The evolutionary strategy employed by cockroaches to reproduce is considerably more efficient than that employed by humans.” Oddly, there are certain species of cockroaches that can, at least for a generation or two, reproduce without any sperm. Schweid says “the female’s unfertilized eggs will develop and hatch—always producing new females.”
Betty Faber, the former staff entomologist at the American Museum of Natural History in New York, says “Females go to bed—by which I mean disappear back to the harborage—at night earlier than males.”
Schweid writes, “cockroaches, while not social insects in the entomological sense of bees or ants with clearly assigned tasks that benefit the whole community, do clearly take pleasure in the company of other roaches, and the aggregation pheromones draw them together, eliciting their effects regardless of the sex or age.” Cockroaches reared singly develop more slowly and take longer between molts than do those reared in a group. But groups can also be too big: “just as development is delayed in young cockroaches if they are isolated, over-crowding also extends the time between molts. So there is yet another kind of pheromone, called a ‘dispersal pheromone,’ and it serves as the chemical signal that it is time to look for a new, slightly roomier harborage. This chemical is found in the insects’ saliva, and has just the opposite effect of the aggregation attractant, in that it repulses cockroaches and causes them to look elsewhere for harborage.”
In case you’re thinking we can just nuke the little critters, you should know that cockroaches survived the atomic bomb test blasts at Bikini. “There is such a thing as a lethal dose of radiation for a cockroach, but it is a lot higher than our own.”
“While few humans may eat them, the roach has both external and internal predators and parasites. There are centipedes that have a primary diet of cockroaches. Mantises, ants, and scorpions will eat them, as will a variety of larger animals including toads, frogs, possums, hedgehogs, armadillos, mongooses, monkeys, lizards, spiders, mice, cats, and birds.”
Roaches are nocturnal and pass their days sleeping.
“Cockroaches, like so many other species including our own, have male aggression rituals. They have their own inventory of aggressive behaviors, a scale of conflict that begins with threatening postures. Beyond that they graduate to antenna lashing—a form of which is also present in male/female encounters to determine if a female is sexually receptive–and biting. Sex and territory seem to be the primary motivations for fighting between male cockroaches: These clashes never end in death, but always in the retreat of one fighter.”
Trapping a cockroach
“Stale white bread moistened with warm, slightly soured beer” is the most reliable and effective. “This is typically placed at the bottom of a small jar—a Gerber’s baby food jar, say—around the interior rim of which a petroleum jelly like Vaseline has been applied. The cockroach can climb in from the outside but can’t climb back out.”
What should you do if you get a cockroach stuck in your ear?
“It is, according to all accounts, painful and horrifying, although a little mineral oil or lidocaine sprayed into the ear is usually enough to dislodge the intruder.”
Exterminators primarily employ two methods to kill the cockroach: gas and gel. The gel is way more effective but many still rely on the spray. Why? “The major problem that exterminators have with the gel is that it has no immediate knockdown effect.”
John Wickham, an English pest control consultant defined knockdown as: “The inability of the insect to move in a sufficiently coordinated manner to right itself and progress normally.” When a roach eats gel bait—the safer of the two methods—it heads home before the active poison kills it.
“Customers who are paying $75 an hour like to see these roaches struggling to get up, in agony and convulsions, and the sprays, with substantial knockdown effect, provide them that gratifying visual reassurance that the problem is being solved and that they are getting their money’s worth.”
It’s unlikely this poison will have much long-term impact. “Almost as soon as an effective poison goes into widespread use, cockroaches begin to develop resistance. And, typically, the most efficacious products developed, those that do the best job, turn out to be more detrimental to our own health than are the roaches.”
If you want to learn more about cockroaches read The Cockroach Papers.
Reading Duncan Watts’s new book Everything is Obvious: Once You Know The Answer can make you uncomfortable.
Common sense is particularly well adapted to handling the complexity of everyday situations. We get into trouble when we project our common sense onto situations outside the realm of everyday life.
Applying common sense in these areas, Watts argues, “turns out to suffer from a number of errors that systematically mislead us. Yet because of the way we learn from experience—even experiences that are never repeated or that take place in other times and places—the failings of commonsense reasoning are rarely apparent to us.”
We think we have the answers but we don’t. Most real-world problems are more complex than we think. “When policy makers sit down, say, to design some scheme to alleviate poverty, they invariably rely on their own common-sense ideas about why it is that poor people are poor, and therefore how best to help them.” This is where we get into trouble. “A quick look at history,” Watts argues, “suggests that when common sense is used for purposes beyond the everyday, it can fail spectacularly.”
According to Watts, commonsense reasoning suffers from three types of errors, which reinforce one another. First, our mental model of individual behavior is systematically flawed. Second, our mental model of complex systems (collective behavior) is equally flawed. Lastly—and most interesting, in my view—“we learn less from history than we think we do, and that this misperception in turn skews our perception of the future.”
Whenever something interesting happens—a book by an unknown author rocketing to the top of the best-seller list, an unknown search engine increasing in value more than 100,000 times in less than 10 years, the housing bubble collapsing—we instinctively want to know why. We look for an explanation. “In this way,” Watts says, “we deceive ourselves into believing that we can make predictions that are impossible.”
“By providing ready explanations for whatever particular circumstances the world throws at us, commonsense explanations give us the confidence to navigate from day to day and relieve us of the burden of worrying about whether what we think we know is really true, or is just something we happen to believe.”
Once we know the outcome, our brains weave a clever story based on the aspects of the situation that seem relevant (at least, relevant in hindsight). We convince ourselves that we fully understand things that we don’t.
Is Netflix successful, as Reed Hastings argues, because of their culture? Which aspects of their culture make them successful? Do companies with a similar culture exist that fail? “The paradox of common sense, then, is that even as it helps us make sense of the world, it can actively undermine our ability to understand it.”
The key to improving our ability to make decisions, then, is to figure out what kinds of predictions we can make and how we can improve their accuracy.
One problem with making predictions is knowing what variables to look at and how to weigh them. Even if we get the variables and relative importance of one factor to another correct, these predictions also reflect how much the future will resemble the past. As Warren Buffett says “the rearview mirror is always clearer than the windshield.”
Relying on historical data is problematic because of the infrequency of big strategic decisions. “If you could make millions, or even hundreds, of such bets,” Watts argues, “it would make sense to go with the historical probability. But when facing a decision about whether or not to lead the country into war, or to make some strategic acquisition, you cannot count on getting more than one attempt. … making one-off strategic decisions is therefore ill suited to statistical models or crowd wisdom.”
Watts finds it ironic that organizations using the best practices in strategy planning can also be the most vulnerable to planning errors. This is the strategy paradox.
Michael Raynor, author of The Strategy Paradox, argues that the main cause of strategic failure is not bad strategy but great strategy that happens to be wrong. Bad strategy is characterized by lack of vision, muddled leadership, and inept execution, which is more likely to lead to mediocrity than colossal failure. Great strategy, on the other hand, is marked by clarity of vision, bold leadership, and laser-focused execution. Great strategy can lead to great successes, as it did with the iPod, but it can also lead to enormous failures, as it did with Betamax. “Whether great strategy succeeds or fails therefore depends entirely on whether the initial vision happens to be right or not. And that is not just difficult to know in advance, but impossible.” Raynor argues that the solution is to develop methods for planning that account for strategic uncertainty. (I’ll eventually get around to reviewing The Strategy Paradox—it was a great read.)
Rather than trying to predict an impossible future, another idea is to react to changing circumstances as rapidly as possible, dropping alternatives that are not working no matter how promising they seem and diverting resources to those that are succeeding. This sounds an awful lot like evolution (variation and selection).
Watts and Raynor’s solution to overcome our inability to predict the future echoes Peter Palchinsky’s principles. The Palchinsky principles, as summarized by Tim Harford in Adapt (review), are “first, seek out new ideas and try new things; second, when trying something new do it on a scale where failure is survivable; third, seek out feedback and learn from your mistakes as you go along.”
Of course this experimental approach has limits. The US can’t go to war with half of Iraq with one strategy and the other half with a different approach to see which one works best. Watts says “for decisions like these, it’s unlikely that an experimental approach will be of much help.”
In the end, Watts concludes that planners need to learn to behave more “like what the development economist William Easterly calls searchers.” As Easterly put it:
A Planner thinks he already knows the answer; he thinks of poverty as a technical engineering problem that his answers will solve. A Searcher admits he doesn’t know the answers in advance; he believes that poverty is a complicated tangle of political, social, historical, institutional, and technological factors…and hopes to find answers to individual problems by trial and error…A Planner believes outsiders know enough to impose solutions. A Searcher believes only insiders have enough knowledge to find solutions, and that most solutions must be homegrown.
Still curious? Read Everything is Obvious: Once You Know The Answer.
Hindsight bias occurs when we look backward in time and see events as more predictable than they were at the time a decision was made. This bias, also known as the “knew-it-all-along effect,” typically involves those annoying “I told you so” people who never really told you anything.
For instance, consider driving in the car with your partner and coming to a T in the road. Your partner decides to turn right, and four miles down the road, when you realize you are lost, you think, “I knew we should have taken that left.”
Hindsight bias can offer a number of benefits in the short run. For instance, it can be flattering to believe that our judgment is better than it actually is. And, of course, hindsight bias allows us to participate in one of our favorite pastimes — criticizing the decisions of others for their lack of foresight.
“Judgments about what is good and what is bad, what is worthwhile and what is a waste of talent, what is useful and what is less so, are judgments that seldom can be made in the present. They can safely be made only by posterity.”
Beyond distorting our reflection on past decisions, hindsight bias has several practical implications. For example, consider someone asked to review a paper who already knows the results of a previous review by someone else, or a physician asked for a second opinion who knows the results of the first. The results of these actions will likely be biased to some degree. Once we know an outcome, it becomes easy to find some plausible explanation for it.
Hindsight bias makes us less accountable for our decisions, less critical of ourselves, and over-confident in our ability to make decisions.
One of the most interesting things I discovered when researching hindsight bias was the impact on our legal system and the perceptions of jurors.
Harvard Professor Max Bazerman offers:
The processes that give rise to anchoring and overconfidence are also at play with the hindsight bias. According to this explanation, knowledge of an event’s outcome works as an anchor by which individuals interpret their prior judgments of the event’s likelihood. Due to the selective accessibility of confirmatory information during information retrieval, adjustments to anchors are inadequate. Consequently, hindsight knowledge biases our perceptions of what we remember knowing in foresight. Furthermore, to the extent that various pieces of data about the event vary in support of the actual outcome, evidence that is consistent with the known outcome may become cognitively more salient and thus more available in memory. This tendency will lead an individual to justify a claimed foresight in view of “the facts provided.” Finally, a particular piece of data may later be judged important to the extent to which it is representative of the final observed outcome.
In Cognitive Illusions, Rudiger Pohl offered the following explanations of hindsight bias:
Most prominent among the proposed explanations are cognitive accounts which assume that hindsight bias results from an inability to ignore the solution. Among the early approaches are the following three: (1) Fischhoff (1975) assumed an immediate and irreversible assimilation of the solution into one’s knowledge base. As a consequence, the reconstructed estimate will be biased towards the solution. (2) Tversky and Kahneman (1974) proposed a cognitive heuristic for the anchoring effect, named anchoring and insufficient adjustment. The same mechanism may apply here, if the solution is assumed to serve as an “anchor” in the reconstruction process. The reconstruction starts from this anchor and is then adjusted in the direction of one’s knowledge base. However, this adjustment process may stop too early, for example at the point where the first plausible value is reached, thus leading to a biased reconstruction. (3) Hell (1988) argued that the relative trace strengths of the original estimate and of the solution might predict the amount of hindsight bias. The stronger the trace strength of the solution relative to that of the original estimate, the larger hindsight bias should be.
Pohl also offers an evolutionary explanation of hindsight bias:
Finally, some authors argued that hindsight bias is not necessarily a bothersome consequence of a “faulty” information-processing system, but that it may rather represent an unavoidable by-product of an evolutionarily evolved function, namely adaptive learning. According to this view, hindsight bias is seen as the consequence of our most valuable ability to update previously held knowledge. This may be seen as a necessary process in order to prevent memory overload and thus to maintain normal cognitive functioning. Besides, updating allows us to keep our knowledge more coherent and to draw better inferences.
Ziva Kunda, in Social Cognition, offers the following explanation of why hindsight bias occurs:
Preceding events take on new meaning and importance as they are made to cohere with the known outcome. Now that we know that our friends have filed for divorce, any ambiguous behavior we have seen is reinterpreted as indicative of tension, any disagreement gains significance, and any signs of affection seem irrelevant. It now seems obvious that their marriage was doomed from the start…Moreover, having adjusted our interpretations in light of current knowledge, it is difficult to imagine how things could have happened differently.
When making likelihood judgments, we often rely on the availability heuristic: The more difficult it is for us to imagine an outcome, the more unlikely it seems. Therefore, the difficulty we experience imagining how things might have turned out differently makes us all the more convinced that the outcomes that did occur were bound to have occurred.
Hindsight bias has large implications for criminal trials. In Jury Selection Hale Starr and Mark McCormick offer the following:
The effects of hindsight bias—which result in being held to a higher standard—are most critical for both criminal and civil defendants. The defense is more susceptible to the hindsight bias since their actions are generally the ones being evaluated for reasonableness in foresight (foreseeability). When jurors perceive that the results of particular actions were “reasonably” more likely after the outcome is known, defendants are judged as having been capable of knowing more than they knew at the time the action was taken, and therefore as capable of preventing the “bad” outcome.
In post-verdict surveys jurors unknowingly demonstrate some of the effects of hindsight bias:
“I can’t understand why the managers didn’t try to get more information or use the information they had available. They should have known there would be safety problems at the plant.”
“The defendants should have known people would remove the safety shield around the tire. There should have been warnings so people wouldn’t do that.”
“Even though he was a kid, he should have known that once he showed the others who had been drinking that he had a gun, things would get out of hand. He should have known guns invited violence.”
Jurors influenced by the hindsight bias look at the evidence presented and determine that the defendants knew or should have known their actions were unsafe, unwise, or created a dangerous situation. Hindsight bias often results in the judgment that the event was “an accident or tragedy waiting to happen.”
Protection Against Hindsight Bias
In Principles of Forecasting, J. Scott Armstrong offers the following advice on how to protect yourself:
The surest protection against (hindsight bias) is disciplining ourselves to make explicit predictions, showing what we did in fact know (sounds like a decision journal). That record can also provide us with some protection against those individuals who are wont to second-guess us, producing exaggerated claims of what we should have known (and perhaps should have told them). If these observers look to this record, it may show them that we are generally less proficient as forecasters than they would like, while protecting us against charges of having blown a particular assignment. Having an explicit record can also protect us against overconfidence in our own forecasting ability: If we feel that we “knew all along” what was going to happen, then it is natural enough to think that we will have similar success in the future. Unfortunately, an exaggerated perception of a surprise-free past may portend a surprise-full future.
Documenting the reasons we made a forecast makes it possible for us to know not only how well the forecast did, but also where it went astray. For example, subsequent experiences may show that we used wrong (or misunderstood) inputs. In that case, we can, in principle, rerun the forecasting process with better inputs and assess the accuracy of our (retrospectively) revised forecasts. Perhaps we did have the right theory and procedures, but were applying them to a mistaken picture of then-current conditions…Of course inputs are also subject to hindsight bias, hence we need to record them explicitly as well. The essence of making sense out of outcome knowledge is reinterpreting the processes and conditions that produced the reported event.
Hindsight Bias is part of the Farnam Street latticework of mental models.