When we try to make a single change within a complex system, we often end up causing unintended consequences. These can be positive or negative. If we don’t anticipate unintended consequences, we can’t expect to achieve our desired outcomes.
In 1890, a New Yorker named Eugene Schieffelin took his intense love of Shakespeare’s Henry IV, Part 1 to the next level.
Most Shakespeare fanatics channel their interest by going to see performances of the plays, meticulously analyzing them, or reading everything they can about the playwright’s life. Schieffelin wanted more; he wanted to look out his window and see the same kinds of birds in the sky that Shakespeare had seen.
Inspired by a mention of starlings in Henry IV, Part 1, Schieffelin released 100 of the non-native birds in Central Park over two years. (He wasn’t acting alone – he had the support of scientists and the American Acclimatization Society.) We can imagine him watching the starlings flutter off into the park and hoping for them to survive and maybe breed. Which they did. In fact, the birds didn’t just survive; they thrived and bred prolifically.
Unfortunately, Schieffelin’s plan worked too well. Far, far too well. The starlings multiplied exponentially, spreading across America at an astonishing rate. Today, we don’t even know how many of them live in the U.S., with official estimates ranging from 45 million to 200 million. Most, if not all, of them are descended from Schieffelin’s initial 100 birds. The problem is that, as an alien species, starlings wreak havoc: they were introduced into an ecosystem they were never part of, and the local species had (and still have) no defenses against them.
If you live in an area with a starling population, you are doubtless familiar with the hardy, fearless nature of these birds. They gather in enormous flocks, destroying crops, snatching food supplies from native birds, and scavenging in cities. Starlings now consume millions of dollars’ worth of crops each year and have caused fatal airplane crashes. They also spread diseases, including E. coli and salmonella infections.
“When we try to pick out anything by itself, we find it hitched to everything else in the universe.”
— John Muir
Schieffelin’s starlings are a prime example of unintended consequences. In Best Laid Plans: The Tyranny of Unintended Consequences and How to Avoid Them, William A. Sherden writes:
Sometimes unintended consequences are catastrophic, sometimes beneficial. Occasionally their impacts are imperceptible, at other times colossal. Large events frequently have a number of unintended consequences, but even small events can trigger them. There are numerous instances of purposeful deeds completely backfiring, causing the exact opposite of what was intended.
We all know that our actions and decisions can have surprising reverberations that have no relation to our initial intentions. This is why second-order thinking is so crucial. Sometimes we can open a Pandora’s box or kick a hornet’s nest without realizing it. In a dynamic world, you can never do merely one thing.
Unintended consequences arise because of the chaotic nature of systems. When Schieffelin released the starlings, he did not know the minutiae of the ecological and social systems they would be entering. As the world becomes more complicated and interconnected, the potential for ever more serious unintended consequences grows.
All too often, when we mess with complicated systems, we have no more control over the outcomes than we would if we performed shamanistic dances. The simple fact is that we cannot reliably predict how a complex system will behave using mathematical models, computer simulations, or basic concepts like cause and effect or supply and demand.
In The Gene: An Intimate History, Siddhartha Mukherjee writes that unintended consequences can be the result of scientists failing to appreciate the complexity of systems:
The parables of such scientific overreach are well-known: foreign animals, introduced to control pests, become pests in their own right; the raising of smokestacks, meant to alleviate urban pollution, releases particulate effluents higher in the air and exacerbates pollution; stimulating blood formation, meant to prevent heart attacks, thickens the blood and results in an increased risk of blood clots to the heart.
Mukherjee notes that unintended consequences can also be the result of people thinking that something is more complex than it actually is:
… when nonscientists overestimate complexity (“No one can possibly crack this code”), they fall into the trap of unanticipated consequences. In the early 1950s, a common trope among some biologists was that the genetic code would be so context dependent (so utterly determined by a particular cell in a particular organism and so horribly convoluted) that deciphering it would be impossible. The truth turned out to be quite the opposite: just one molecule carries the code, and just one code pervades the biological world. If we know the code, we can intentionally alter it in organisms, and ultimately in humans.
As was mentioned in the quote from Sherden above, sometimes perverse unintended consequences occur when actions have the opposite of the desired effect. From The Nature of Change or the Law of Unintended Consequences by John Mansfield:
An example of the unexpected results of change is found in the clearing of trees to make available more agricultural land. This practice has led to rising water tables and increasing salinity that eventually reduces the amount of useable land.
Some additional examples:
- Suspending disruptive children from school can worsen their behavior, as they are more likely to get involved in crime while out of school.
- Damage-control lawsuits can lead to negative media attention and cause more harm (as occurred in the notorious McLibel case).
- Banning alcohol has, time and time again, led to higher consumption and the formation of criminal gangs, resulting in violent deaths.
- Abstinence-based education is associated with higher teenage pregnancy rates, not lower ones.
- Many people who experience a rodent infestation will stop feeding their cats, assuming that this will encourage them to hunt more. The opposite occurs: well-fed cats are better hunters than hungry ones.
- When the British government offered financial rewards for people who killed and turned in cobras in India, people, reacting to incentives, began breeding the snakes. Once the reward program was scrapped, the population of cobras in India rose as people released the ones they had raised. The same thing occurred in Vietnam with rats.
This phenomenon, of the outcome being the opposite of the intended one, is known as “blowback” or the Cobra effect, for obvious reasons. Just as with iatrogenics, interventions often lead to worse problems.
Sometimes the consequences are mixed and take a long time to appear, as with the famous Leaning Tower of Pisa. From The Nature of Change again:
When the tower was built, it was undoubtedly intended to stand vertical. It took about 200 years to complete, but by the time the third floor was added, the poor foundations and loose subsoil had allowed it to sink on one side. Subsequent builders tried to correct this lean and the foundations have been stabilised by 20th-century engineering, but at the present time, the top of the tower is still about 15 feet (4.5 meters) from the perpendicular. Along with the unexpected failure of the foundations is the unexpected consequence of the Leaning Tower of Pisa becoming a popular tourist attraction, bringing enormous revenue to the town.
It’s important to note that unintended consequences can sometimes be positive. Someone might have a child because they think parenthood will be a fulfilling experience. If their child grows up and invents a drug that saves thousands of lives, that consequence is positive yet unplanned. Pokémon Go, strange as it seemed, encouraged players to get more exercise. The no-man’s-lands created during conflicts can preserve the habitats of local wildlife, as occurred along the former Berlin Wall. Sunken ships form coral reefs where wildlife thrives. Typically, though, when we talk about the law of unintended consequences, we’re talking about negative consequences.
“Any endeavor has unintended consequences. Any ill-conceived endeavor has more.”
— Stephen Tobolowsky, The Dangerous Animals Club
The Causes of Unintended Consequences
By their nature, unintended consequences can be a mystery. I’m not a fan of the term “unintended consequences,” though, as it’s too often a scapegoat for poor thinking. There are always consequences, whether you see them or not.
When we reflect on the roots of consequences that we failed to see but could have, we are liable to build a narrative that packages a series of chaotic events into a neat chain of cause and effect. A chain that means we don’t have to reflect on our decisions to see where we went wrong. A chain that keeps our egos intact.
Sociologist Robert K. Merton identified five potential causes of consequences we failed to see:
- Our ignorance of the precise manner in which systems work.
- Analytical errors or a failure to use Bayesian updating (not updating our beliefs in light of new information).
- Focusing on short-term gain while forgetting long-term consequences.
- The requirement for or prohibition of certain actions, despite the potential long-term results.
- The creation of self-defeating prophecies (for example, due to worry about inflation, a central bank announces that it will take drastic action, thereby accidentally causing crippling deflation amidst the panic).
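Merton’s second cause, failing to update beliefs in light of new information, has a precise antidote in Bayes’ rule. Here is a minimal sketch; all the probabilities are purely illustrative, not drawn from any real case:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Illustrative numbers: we start out 90% confident a plan will work,
# and a failed pilot trial is four times as likely if the plan is flawed.
belief = 0.9
for trial in range(3):
    belief = bayes_update(belief, p_evidence_given_h=0.2, p_evidence_given_not_h=0.8)
    print(f"After failed trial {trial + 1}: belief = {belief:.2f}")
```

Three failed trials drop a 90% belief below 13%. Someone who ignores the evidence, or weighs it without a consistent rule, keeps acting on the original 90% and walks straight into the consequences they “never saw coming.”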
Most unintended consequences are just unanticipated consequences.
Drawing on logical fallacies and mental models, and keeping Schieffelin’s starlings in mind, we can identify several more possible causes of consequences that we likely should have seen in advance but didn’t.
Over-reliance on models and predictions (mistaking the map for the territory). Schieffelin could have built a predictive model of how his starlings would breed and affect their new habitat. The issue is that models are not gospel; the outcomes they predict never fully represent the real world. All models are wrong, but that doesn’t mean they can’t be useful. You have to understand both the model and the terrain it’s based on. Schieffelin’s model might have told him that the starlings’ breeding habits would have a very minor impact on their new habitat, presumably because he based his estimates on the birds’ behavior in their native range. In reality, the factors involved were too diverse and complex to take into account: the starlings bred faster and interacted with their new environment in ways that would have been hard to predict.
Survivorship bias. Unintended consequences can also occur when we fail to take into account all of the available information. When predicting an outcome, we have an inherent tendency to search for other instances in which the desired result occurred. Nowadays, when anyone considers introducing a species to a new area, they are likely to hear about Schieffelin’s starlings. And Schieffelin was likely influenced by stories about, perhaps even personal experiences with, successfully introducing birds into new habitats, unaware of the many ecosystem-tampering experiments that had gone horribly wrong.
The compounding effect of consequences. Unintended results do not progress in a linear manner. Just as untouched money in a savings account compounds, the population of Schieffelin’s starlings compounded over the following decades. Each new bird that hatched meant more hatchlings in future generations. At some point, the population reached a critical mass beyond which no attempt to check its growth could succeed. Even as people in one area shot or poisoned starlings, those elsewhere continued to breed.
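The arithmetic of compounding can be sketched in a few lines. The 35% annual growth rate and 40-year horizon below are illustrative assumptions, not historical figures; the point is only that geometric growth runs away from linear intuition:

```python
def project_population(initial, annual_growth_rate, years):
    """Project a population growing geometrically, like compound interest."""
    population = initial
    history = [population]
    for _ in range(years):
        population *= 1 + annual_growth_rate
        history.append(population)
    return history

# Assumed, non-historical parameters: 100 birds growing 35% per year for 40 years.
history = project_population(initial=100, annual_growth_rate=0.35, years=40)
print(f"Year 0:  {history[0]:,.0f} birds")
print(f"Year 40: {history[-1]:,.0f} birds")
```

Under these toy assumptions, 100 birds become tens of millions within four decades, which is why culling in one region barely dents a population still compounding everywhere else.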
Denial. Just as we seek out confirmatory evidence, we are inclined to deny the existence of disconfirming information. We may be in denial about the true implications of our actions. Governments, in particular, tend to focus on the positive consequences of legislation while ignoring the costs. Negative unintended consequences do not always result in changes being made. Open-plan offices are another instance: they were designed to encourage collaboration and creativity, yet even though research has shown they have the opposite effect, many companies continue to opt for them. They sound like a good idea, and airy offices with beanbags and potted plants might look nice, but those who continue building them are in obvious denial.
Failure to account for base rates. When we neglect to consider how similar situations have played out in the past, we are failing to account for base rates. Schieffelin likely never asked how often past species introductions had gone as planned.
Curiosity. We sometimes perform actions out of curiosity, without any idea of the potential consequences. The problem is that our curiosity can lead us to behave in reckless, unplanned, or poorly thought-through ways. The release of Schieffelin’s starlings was in part the result of widespread curiosity about the potential for introducing European species to America.
The tendency to want to do something. We are all biased towards action. We don’t want to sit around — we want to act and make changes. The problem is that sometimes doing nothing is the best route to take. In the case of Schieffelin’s starlings, he was biased towards making alterations to the wildlife around him to bring Shakespeare’s world to life, even though leaving nature alone is usually preferable.
Mental Models for Avoiding or Minimizing Unintended Consequences
We cannot eliminate unintended consequences, but we can become more aware of them through rational thinking techniques. In this section, we will examine some ways of working with and understanding the unexpected. Note that the examples provided here are simplifications of complex issues. The observations made about them are those of armchair critics, not those involved in the actual decision making.
Inversion. When we invert our thinking, we consider what we want to avoid, not just what we want to cause. Rather than seeking perfection, we should avoid stupidity. By considering potential unintended consequences, we can work backward from them. For example, laws requiring cyclists to wear helmets at all times have been followed by a rise in fatalities, because people who feel safer behave in a riskier manner. If we use inversion, we know we do not want any change in road safety laws to cause more injuries or deaths. So we could instead consider stricter laws against risky cycling and enforce penalties for those who fail to follow them.
Another example is laws which aim to protect endangered animals by preventing new developments on land where rare species live. Imagine that you are a landowner, about to close a lucrative deal. You look out at your land and notice a smattering of endangered wildflowers. Do you cancel the sale and leave the land to the flowers? Of course not. Unless you are exceptionally honest, you grab a spade, dig up the flowers, and keep them a secret. Many people shoot, poison, remove, or otherwise harm endangered animals and plants. If lawmakers used inversion, they would recognize that they want to avoid those consequences and work backward.
We have to focus on avoiding the worst unintended consequences, rather than on controlling everything.
Looking for disconfirming evidence. Instead of looking for information that confirms that our actions will have the desired consequences, we should rigorously search for evidence that they will not. How did this go in the past? Take the example of laws regarding the minimum wage and worker rights. Every country has people pushing for a higher minimum wage and for more protection of workers. If we search for disconfirming evidence, we see that these laws can do more harm than good. The French appear to have perfected labor laws. All employees are, on the face of it, blessed with a minimum wage of 17,764 euros per year, a 35-hour work week, five weeks paid holiday, and strict protection against redundancy (layoffs). So, why don’t we all just move to France? Because these measures result in a lot of negative unintended consequences. Unemployment rates are high, as many businesses cannot afford to hire many employees. Foreign companies are reluctant to hire French workers, as they can’t fire them during tough economic times. Everyone deserves a fair minimum wage and protection from abuse of their rights, but France illustrates how taking this principle too far can have negative unintended consequences.
Understanding our circle of competence. Each of us has areas we understand well and are familiar with. When we act outside our circle of competence, we increase the risk of unintended consequences. If you decide to fix your boiler without consulting a plumber, you are acting outside of your circle of competence and have a good chance of making the problem worse. When the British government implemented bounties for dead cobras in India, their circle of competence did not include an understanding of the locals. Perhaps if they had consulted some Indian people and asked how they would react to such a law, they could have avoided causing a rise in the cobra population.
Second-order thinking. We often forget that our actions can have two layers of consequences, of which the first might be intended and the second unintended. With Schieffelin’s starlings, the first layer of consequences was positive and as intended. The birds survived and bred, and Shakespeare fans living in New York got to feel a bit closer to the iconic playwright. But the negative second layer of consequences dwarfed the first layer. For the parents of a child who grows up to invent a life-saving drug, the first layer of consequences is that those parents (presumably) have a fulfilling experience. The second layer of consequences is that lives are saved. When we use second-order thinking, we ask: what could happen? What if the opposite of what I expect happens? What might the results be a year, five years, or a decade from now?
Most unintended consequences are just unanticipated consequences. And in the world of consequences, intentions often don’t matter; after all, intentions only apply to positive anticipated consequences. Only in rare circumstances would someone intend to cause negative consequences.
So when we make decisions, we must ask: what might the consequences be? This is where having a toolbox of mental models becomes helpful.