
The Pygmalion Effect: Proving Them Right

The Pygmalion Effect is a powerful secret weapon. Without even realizing it, we can nudge others towards success. In this article, discover how expectations can influence performance for better or worse.

How Expectations Influence Performance

Many people believe that their pets or children are of unusual intelligence or can understand everything they say. Some people have stories of abnormal feats. In the late 19th century, one man made exactly that claim about his horse, and he appeared to have the evidence to back it up. Wilhelm von Osten was a teacher and horse trainer who believed that animals could learn to read or count. Von Osten’s initial attempts with dogs and a bear were unsuccessful, but when he began working with an unusual horse, he changed our understanding of psychology. Known as Clever Hans, the animal could answer questions, with 90% accuracy, by tapping his hoof. He could add, subtract, multiply, divide, and tell the time and the date.

Clever Hans could also read and understand questions written or asked in German. Crowds flocked to see the horse, and the scientific community soon grew interested. Researchers studied the horse, looking for signs of trickery. Yet they found none. The horse could answer questions asked by anyone, even if Von Osten was absent. This indicated that no signaling was at play. For a while, the world believed the horse was truly clever.

Then psychologist Oskar Pfungst turned his attention to Clever Hans. Assisted by a team of researchers, he uncovered two anomalies. When blinkered or behind a screen, the horse could not answer questions. And he could respond only if the questioner knew the answer. From these observations, Pfungst deduced that Clever Hans was not making any mental calculations, nor did he understand numbers or language in the human sense. Although Von Osten had intended no trickery, the act was false.

Instead, Clever Hans had learned to detect subtle yet consistent nonverbal cues. When someone asked a question, Clever Hans responded to their body language with a degree of accuracy many poker players would envy. For example, when someone asked Clever Hans to make a calculation, he would begin tapping his hoof. Once he reached the correct answer, the questioner would show involuntary signs; Pfungst found that many people tilted their head at this point. Clever Hans would recognize this behavior and stop. When he was blinkered, or when the questioner did not know the answer, he could not see the cues and had no answer.

The Pygmalion Effect

Von Osten died in 1909 and Clever Hans disappeared from record. But his legacy lives on in a particular branch of psychology.

The case of Clever Hans is of less interest than the research it went on to provoke. In the decades that followed, psychologists began to study how the expectations of others affect us. If someone’s expectation that Clever Hans could answer a question unwittingly supplied him with the answer, could the same thing occur elsewhere?

Could we be, at times, responding to subtle cues? Decades of research have provided consistent, robust evidence that the answer is yes. It comes down to the concepts of the self-fulfilling prophecy and the Pygmalion effect.

The Pygmalion effect is a psychological phenomenon wherein high expectations lead to improved performance in a given area. Its name comes from the story of Pygmalion, a mythical Greek sculptor. Pygmalion carved a statue of a woman and then became enamored with it. Unable to love a human, Pygmalion appealed to Aphrodite, the goddess of love. She took pity and brought the statue to life. The couple married and went on to have a daughter, Paphos.

False Beliefs Come True Over Time

In the same way Pygmalion’s fixation on the statue brought it to life, our focus on a belief or assumption can do the same. The flipside is the Golem effect, wherein low expectations lead to decreased performance. Both effects come under the category of self-fulfilling prophecies. Whether the expectation comes from us or others, the effect manifests in the same way.

The Pygmalion effect has profound ramifications in schools and organizations and with regard to social class and stereotypes. By some accounts, it results from our brains being poor at distinguishing between perception and expectation. Although many people purport to want to prove their critics wrong, we often merely end up proving our supporters right.

Understanding the Pygmalion effect is a powerful way to positively affect those around us, from our children and friends to employees and leaders. If we don’t take into account the ramifications of our expectations, we may miss out on the dramatic benefits of holding high standards.

The concept of a self-fulfilling prophecy is attributed to sociologist Robert K. Merton. In 1948, Merton published the first paper on the topic. In it, he described the phenomenon as a false belief that becomes true over time. Once this occurs, it creates a feedback loop. We assume we were always correct because it seems so in hindsight. Merton described a self-fulfilling prophecy as self-hypnosis through our own propaganda.

As with many psychological concepts, people had a vague awareness of its existence long before research confirmed anything. Renowned orator and theologian Jacques-Bénigne Bossuet declared in the 17th century that “The greatest weakness of all weaknesses is to fear too much to appear weak.”

Even Sigmund Freud was aware of self-fulfilling prophecies. In A Childhood Memory of Goethe, Freud wrote: “If a man has been his mother’s undisputed darling he retains throughout life the triumphant feeling, the confidence in success, which not seldom brings actual success with it.”

The IQ of Students

Research by Robert Rosenthal and Lenore Jacobson examined the influence of teachers’ expectations on students’ performance. Their subsequent paper is one of the most cited and discussed psychological studies ever conducted.

Rosenthal and Jacobson began by testing the IQ of elementary school students. Teachers were told that the IQ test showed around one-fifth of their students to be unusually intelligent. For ethical reasons, they did not label an alternate group as unintelligent and instead used unlabeled classmates as the control group. It will doubtless come as no surprise that the “gifted” students were chosen at random; they should not have had a significant statistical advantage over their peers.

When the study period ended, all students had their IQs retested. Both groups showed an improvement, yet those who had been described as intelligent experienced much greater gains in their IQ points. Rosenthal and Jacobson attributed this result to the Pygmalion effect: teachers paid more attention to “gifted” students, offering more support and encouragement than they would otherwise. Picked at random, those children ended up excelling. Sadly, no follow-up studies were ever conducted, so we do not know the long-term impact on the children involved.

Prior to studying the effect on children, Rosenthal performed preliminary research on animals. Students were given rats from two groups, one described as “maze dull” and the other as “maze bright.” Researchers claimed that the former group could not learn to properly negotiate a maze, but the latter could with ease. As you might expect, the groups of rats were the same. Like the gifted and nongifted children, they were chosen at random. Yet by the time the study finished, the “maze-bright” rats appeared to have learned faster. The students considered them tamer and more pleasant to work with than the “maze-dull” rats.

In general, authority figures have the power to influence how the people subordinate to them behave by holding high expectations. Whether consciously or not, leaders facilitate changes in behavior, such as by giving people more responsibility or setting stretch goals. Like the subtle cues that allowed Clever Hans to make calculations, these small changes in treatment can promote learning and growth. If a leader thinks an employee is competent, they will treat them as such. The employee then gets more opportunities to develop their competence, and their performance improves in a positive feedback loop. This works both ways. When we expect an authority figure to be competent or successful, we tend to be attentive and supportive. In the process, we bolster their performance, too. Students who act interested in lectures create interesting lecturers.

In Pygmalion in Management, J. Sterling Livingston writes,

Some managers always treat their subordinates in a way that leads to superior performance. But most … unintentionally treat their subordinates in a way that leads to lower performance than they are capable of achieving. The way managers treat their subordinates is subtly influenced by what they expect of them. If managers’ expectations are high, productivity is likely to be excellent. If their expectations are low, productivity is likely to be poor. It is as though there were a law that caused subordinates’ performance to rise or fall to meet managers’ expectations.

The Pygmalion effect shows us that our reality is negotiable and can be manipulated by others — on purpose or by accident. What we achieve, how we think, how we act, and how we perceive our capabilities can be influenced by the expectations of those around us. Those expectations may be the result of biased or irrational thinking, but they have the power to affect us and change what happens. While cognitive biases distort only what we perceive, self-fulfilling prophecies alter what happens.

Of course, the Pygmalion effect works only when we are physically capable of achieving what is expected of us. After Rosenthal and Jacobson published their initial research, many people were entranced by the implication that we are all capable of more than we think. Although that can be true, we have no indication that any of us can do anything if someone believes we can. Instead, the Pygmalion effect seems to involve us leveraging our full capabilities and avoiding the obstacles created by low expectations.

Clever Hans truly was an intelligent horse, but he was smart because he could read almost imperceptible nonverbal cues, not because he could do math. So, he did have unusual capabilities, as shown by the fact that few other animals have done what he did.

We can’t do anything just because someone expects us to. Overly high expectations can also be stressful. When someone sets the bar too high, we can get discouraged and not even bother trying. Stretch goals and high expectations are beneficial, up to the point of diminishing returns. Research by McClelland and Atkinson indicates that the Pygmalion effect drops off if we see our chance of success as being less than 50%. If an endeavor seems either certain or completely uncertain, the Pygmalion effect does not hold. When we are stretched but confident, high expectations can help us achieve more.

Check Your Assumptions

In Self-Fulfilling Prophecy: A Practical Guide to Its Use in Education, Robert T. Tauber describes an exercise in which people are asked to list their assumptions about people with certain descriptions. These included a cheerleader, “a minority woman with four kids at the market using food stamps,” and a “person standing outside smoking on a cold February day.” An anonymous survey of undergraduate students revealed mostly negative assumptions. Tauber asks the reader to consider how being exposed to these types of assumptions might affect someone’s day-to-day life.

The expectations people have of us affect us in countless subtle ways each day. Although we rarely notice it (unless we are on the receiving end of overt racism, sexism, and other forms of bias), those expectations dictate the opportunities we are offered, how we are spoken to, and the praise and criticism we receive. Individually, these knocks and nudges have minimal impact. In the long run, they might dictate whether we succeed or fail or fall somewhere on the spectrum in between.

The important point to note about the Pygmalion effect is that it creates a literal change in what occurs. There is nothing mystical about the effect. When we expect someone to perform well in any capacity, we treat them in a different way. Teachers tend to show more positive body language towards students they expect to be gifted. They may teach them more challenging material, offer more chances to ask questions, and provide personalized feedback. As Carl Sagan declared, “The visions we offer our children shape the future. It matters what those visions are. Often they become self-fulfilling prophecies. Dreams are maps.”

A perfect illustration is the case of James Sweeney and George Johnson, as described in Pygmalion in Management. Sweeney was a teacher at Tulane University, where Johnson worked as a porter. Aware of the Pygmalion effect, Sweeney had a hunch that he could teach anyone to be a competent computer operator. He began his experiment, offering Johnson lessons each afternoon. Other university staff were dubious, especially as Johnson appeared to have a low IQ. But the Pygmalion effect won out, and the former porter eventually became responsible for training new computer operators.

The Pygmalion effect is a powerful secret weapon. Who wouldn’t want to help their children get smarter, help employees and leaders be more competent, and generally push others to do well? That’s possible if we raise our standards and see others in the best possible light. It is not necessary to actively attempt to intervene. Without even realizing it, we can nudge others towards success. If that sounds too good to be true, remember that the effect holds up for everything from rats to CEOs.


The Law of Unintended Consequences: Shakespeare, Cobra Breeding, and a Tower in Pisa

“When we try to pick out anything by itself, we find it hitched to everything else in the universe”

— John Muir

In 1890, a New Yorker named Eugene Schieffelin took his intense love of Shakespeare to the next level.

Most Shakespeare fanatics channel their interest by going to see performances of the plays, meticulously analyzing them, or reading everything they can about the playwright’s life. Schieffelin wanted more; he wanted to look out his window and see the same kind of birds in the sky that Shakespeare had seen.

Inspired by a mention of starlings in Henry IV, Part 1, Schieffelin released 100 of the non-native birds in Central Park over two years. (He wasn’t acting alone – he had the support of scientists and the American Acclimatization Society.) We can imagine him watching the starlings flutter off into the park, hoping they would survive and perhaps breed. Which they did. In fact, the birds didn’t just survive; they thrived and multiplied.

Unfortunately, Schieffelin’s plan worked too well. Far, far too well. The starlings multiplied exponentially, spreading across America at an astonishing rate. Today, no one knows exactly how many live in the U.S.; official estimates range from 45 million to 200 million, and most, if not all, are descended from Schieffelin’s initial 100 birds. The problem is that, as an alien species, the starlings wreak havoc: they entered an ecosystem they were not naturally part of, and the local species had (and still have) no defense against them.

If you live in an area with a starling population, you are doubtless familiar with the hardy, fearless nature of these birds. They gather in enormous flocks, destroying crops, snatching food supplies from native birds, and scavenging in cities. Starlings now consume millions of dollars’ worth of crops each year and have caused fatal airplane crashes. They also spread diseases, including E. coli and salmonella infections.

Schieffelin’s starlings are a prime example of unintended consequences. In Best Laid Plans: The Tyranny of Unintended Consequences and How to Avoid Them, William A. Sherden writes:

Sometimes unintended consequences are catastrophic, sometimes beneficial. Occasionally their impacts are imperceptible, at other times colossal. Large events frequently have a number of unintended consequences, but even small events can trigger them. There are numerous instances of purposeful deeds completely backfiring, causing the exact opposite of what was intended.

We all know that our actions and decisions can have surprising reverberations that have no relation to our initial intentions. This is why second-order thinking is so crucial. Sometimes we can open a Pandora’s box or kick a hornet’s nest without realizing it. In a dynamic world, you can never do merely one thing.

Unintended consequences arise because of the chaotic nature of systems. When Schieffelin released the starlings, he did not know the minutiae of the ecological and social systems they would be entering. As the world becomes more complicated and interconnected, the potential for ever more serious unintended consequences grows.

All too often when we mess with complicated systems, we have no more control over the outcomes than we would if we performed shamanistic dances. The simple fact is that we cannot reliably predict how a system will behave using mathematical models, computer simulations, or basic concepts like cause and effect or supply and demand.

In The Gene: An Intimate History, Siddhartha Mukherjee writes that unintended consequences can be the result of scientists failing to appreciate the complexity of systems:

The parables of such scientific overreach are well-known: foreign animals, introduced to control pests, become pests in their own right; the raising of smokestacks, meant to alleviate urban pollution, releases particulate effluents higher in the air and exacerbates pollution; stimulating blood formation, meant to prevent heart attacks, thickens the blood and results in an increased risk of blood clots to the heart.

Mukherjee notes that unintended consequences can also be the result of people thinking that something is more complex than it actually is:

… when nonscientists overestimate complexity (“No one can possibly crack this code”), they fall into the trap of unanticipated consequences. In the early 1950s, a common trope among some biologists was that the genetic code would be so context dependent, so utterly determined by a particular cell in a particular organism and so horribly convoluted, that deciphering it would be impossible. The truth turned out to be quite the opposite: just one molecule carries the code, and just one code pervades the biological world. If we know the code, we can intentionally alter it in organisms, and ultimately in humans.

As was mentioned in the quote from Sherden above, sometimes perverse unintended consequences occur when actions have the opposite of the desired effect. From The Nature of Change or the Law of Unintended Consequences by John Mansfield:

An example of the unexpected results of change is found in the clearing of trees to make available more agricultural land. This practice has led to rising water tables and increasing salinity that eventually reduces the amount of useable land.

Some additional examples:

  • Suspending problematic children from school worsens their behavior, as they are more likely to engage in criminal behavior when outside school.
  • Damage-control lawsuits can lead to negative media attention and cause more harm (as occurred in the notorious McLibel case).
  • Banning alcohol has, time and time again, led to higher consumption and the formation of criminal gangs, resulting in violent deaths.
  • Abstinence-based education has repeatedly been linked to higher rates of teenage pregnancy.
  • Many people who experience a rodent infestation will stop feeding their cats, assuming that this will encourage them to hunt more. The opposite occurs: well-fed cats are better hunters than hungry ones.
  • When the British government offered financial rewards for people who killed and turned in cobras in India, people, reacting to incentives, began breeding the snakes. Once the reward program was scrapped, the population of cobras in India rose as people released the ones they had raised. The same thing occurred in Vietnam with rats.

This phenomenon, of the outcome being the opposite of the intended one, is known as “blowback” or the Cobra effect, for obvious reasons. Just as with iatrogenics, interventions often lead to worse problems.

Sometimes the consequences are mixed and take a long time to appear, as with the famous Leaning Tower of Pisa. From The Nature of Change again:

When the tower was built, it was undoubtedly intended to stand vertical. It took about 200 years to complete, but by the time the third floor was added, the poor foundations and loose subsoil had allowed it to sink on one side. Subsequent builders tried to correct this lean and the foundations have been stabilised by 20th-century engineering, but at the present time, the top of the tower is still about 15 feet (4.5 meters) from the perpendicular. Along with the unexpected failure of the foundations is the unexpected consequence of the Leaning Tower of Pisa becoming a popular tourist attraction, bringing enormous revenue to the town.

It’s important to note that unintended consequences can sometimes be positive. Someone might have a child because they think parenthood will be a fulfilling experience. If their child grows up and invents a drug that saves thousands of lives, that consequence is positive yet unplanned. Pokémon Go, strange as it seemed, encouraged players to get more exercise. The creation of no-man’s-lands during conflicts can preserve the habitats of local wildlife, as occurred along the Berlin Wall. Sunken ships form coral reefs where wildlife thrives. Typically, though, when we talk about the law of unintended consequences, we’re talking about negative consequences.

“Any endeavor has unintended consequences. Any ill-conceived endeavor has more.”

— Stephen Tobolowsky, The Dangerous Animals Club

The Causes of Unintended Consequences

By their nature, unintended consequences can be a mystery. I’m not a fan of the term “unintended consequences,” though, as it’s too often a scapegoat for poor thinking. There are always consequences, whether you see them or not.

When we reflect on the roots of consequences that we failed to see but could have, we are liable to build a narrative that packages a series of chaotic events into a neat chain of cause and effect. A chain that means we don’t have to reflect on our decisions to see where we went wrong. A chain that keeps our egos intact.

Sociologist Robert K. Merton identified five potential causes of consequences that we fail to see:

  1. Our ignorance of the precise manner in which systems work.
  2. Analytical errors or a failure to use Bayesian updating (not updating our beliefs in light of new information).
  3. Focusing on short-term gain while forgetting long-term consequences.
  4. The requirement for or prohibition of certain actions, despite the potential long-term results.
  5. The creation of self-defeating prophecies (for example, due to worry about inflation, a central bank announces that it will take drastic action, thereby accidentally causing crippling deflation amidst the panic).
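
The second cause, failing to update our beliefs in light of new information, can be made concrete with a toy calculation. The sketch below applies Bayes’ rule to revise a prior belief when new evidence arrives; every number in it is an illustrative assumption, not a figure from Merton’s work.

```python
# A minimal sketch of Bayesian updating: revising a belief when new
# evidence arrives. All numbers here are illustrative assumptions.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Suppose we start out 70% confident a plan will work, then observe a
# warning sign that is three times as likely if the plan is failing
# (probability 0.6) as if it is succeeding (probability 0.2).
belief = bayes_update(prior=0.7, p_evidence_if_true=0.2, p_evidence_if_false=0.6)
print(belief)  # well below the original 0.7
```

Under these made-up numbers, a warning sign three times as likely under failure pulls our confidence down from 70% to about 44%. Refusing to make that revision is precisely the analytical error Merton describes.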

Most unintended consequences are just unanticipated consequences.

Drawing on logical fallacies and mental models, and keeping Schieffelin’s starlings in mind, we can identify several more possible causes of consequences that we likely should have seen in advance but didn’t. Here they are:

Over-reliance on models and predictions—mistaking the map for the territory. Schieffelin could have made a predictive model of how his starlings would breed and would affect their new habitat. The issue is that models are not gospel and the outcomes they predict do not represent the real world. All models are wrong, but that doesn’t mean they’re not useful sometimes. You have to understand the model and the terrain it’s based on. Schieffelin’s predictive model might have told him that the starlings’ breeding habits would have a very minor impact on their new habitat. But in reality, the factors involved were too diverse and complex to take into account. Schieffelin’s starlings bred faster and interacted with their new environment in ways that would be hard to predict. We can assume that he based his estimations of the future of the starlings on their behavior in their native countries.

Survivorship bias. Unintended consequences can also occur when we fail to take into account all of the available information. When predicting an outcome, we have an inherent tendency to search for other instances in which the desired result occurred. Nowadays, when anyone considers introducing a species to a new area, they are likely to hear about Schieffelin’s starlings. And Schieffelin was likely influenced by stories about, perhaps even personal experiences with, successfully introducing birds into new habitats, unaware of the many ecosystem-tampering experiments that had gone horribly wrong.

The compounding effect of consequences. Unintended results do not progress in a linear manner. Just as untouched money in a savings account compounds, the population of Schieffelin’s starlings compounded over the following decades. Each new bird that hatched meant more hatchlings in future generations. At some point, the bird populations reached critical mass, and no attempt to check their growth could succeed. As people in one area shot or poisoned the starlings, those elsewhere continued to breed.
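
The savings-account analogy can be sketched numerically. The growth rate and number of generations below are hypothetical, chosen only to show how a fixed proportional increase compounds from a tiny starting population:

```python
# A toy model of compound growth, like interest left in a savings
# account. The 25% per-generation rate and 40 generations are
# hypothetical assumptions, not real starling data.

def compound(initial, rate, periods):
    population = initial
    for _ in range(periods):
        population += population * rate  # each period adds a fixed proportion
    return population

birds = compound(initial=100, rate=0.25, periods=40)
print(f"{birds:,.0f}")  # hundreds of thousands, from just 100
```

At a hypothetical 25% increase per generation, 100 birds become roughly three-quarters of a million after 40 generations, which is why a population that looks negligible early on can pass the point of no return before anyone reacts.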

Denial. Just as we seek out confirmatory evidence, we are inclined to deny the existence of disconfirming information. We may be in denial about the true implications of actions. Governments, in particular, tend to focus on the positive consequences of legislation while ignoring the costs. Negative unintended consequences do not always result in changes being made. Open-plan offices are another instance; they were first designed to encourage collaboration and creativity. Even though research has shown that they have the opposite effect, many companies continue to opt for open offices. They sound like a good idea, and airy offices with beanbags and pot plants might look nice, but those who continue building them are in obvious denial.

Failure to account for base rates. When we neglect to consider how the past will affect the future, we are failing to account for base rates. Schieffelin likely failed to consider the base rates of successful species introduction.

Curiosity. We sometimes perform actions out of curiosity, without any idea of the potential consequences. The problem is that our curiosity can lead us to behave in reckless, unplanned, or poorly thought-through ways. The release of Schieffelin’s starlings was in part the result of widespread curiosity about the potential for introducing European species to America.

The tendency to want to do something. We are all biased towards action. We don’t want to sit around — we want to act and make changes. The problem is that sometimes doing nothing is the best route to take. In the case of Schieffelin’s starlings, he was biased towards making alterations to the wildlife around him to bring Shakespeare’s world to life, even though leaving nature alone is usually preferable.

Mental Models for Avoiding or Minimizing Unintended Consequences

We cannot eliminate unintended consequences, but we can become more aware of them through rational thinking techniques. In this section, we will examine some ways of working with and understanding the unexpected. Note that the examples provided here are simplifications of complex issues. The observations made about them are those of armchair critics, not those involved in the actual decision making.

Inversion. When we invert our thinking, we consider what we want to avoid, not what we want to cause. Rather than seeking perfection, we should avoid stupidity. By considering potential unintended consequences, we can then work backward. For example, the implementation of laws which required cyclists to wear helmets at all times led to a rise in fatalities. (People who feel safer behave in a more risky manner.) If we use inversion, we know we do not want any change in road safety laws to cause more injuries or deaths. So, we could consider creating stricter laws surrounding risky cycling and enforcing penalties for those who fail to follow them.

Another example is laws which aim to protect endangered animals by preventing new developments on land where rare species live. Imagine that you are a landowner, about to close a lucrative deal. You look out at your land and notice a smattering of endangered wildflowers. Do you cancel the sale and leave the land to the flowers? Of course not. Unless you are exceptionally honest, you grab a spade, dig up the flowers, and keep them a secret. Many people shoot, poison, remove, or otherwise harm endangered animals and plants. If lawmakers used inversion, they would recognize that they want to avoid those consequences and work backward.

We have to focus on avoiding the worst unintended consequences, rather than on controlling everything.

Looking for disconfirming evidence. Instead of looking for information that confirms that our actions will have the desired consequences, we should rigorously search for evidence that they will not. How did this go in the past? Take the example of laws regarding the minimum wage and worker rights. Every country has people pushing for a higher minimum wage and for more protection of workers. If we search for disconfirming evidence, we see that these laws can do more harm than good. The French appear to have perfected labor laws. All employees are, on the face of it, blessed with a minimum wage of 17,764 euros per year, a 35-hour work week, five weeks paid holiday, and strict protection against redundancy (layoffs). So, why don’t we all just move to France? Because these measures result in a lot of negative unintended consequences. Unemployment rates are high, as many businesses cannot afford to hire many employees. Foreign companies are reluctant to hire French workers, as they can’t fire them during tough economic times. Everyone deserves a fair minimum wage and protection from abuse of their rights, but France illustrates how taking this principle too far can have negative unintended consequences.

Understanding our circle of competence. Each of us has areas we understand well and are familiar with. When we act outside our circle of competence, we increase the risk of unintended consequences. If you decide to fix your boiler without consulting a plumber, you are acting outside of your circle of competence and have a good chance of making the problem worse. When the British government implemented bounties for dead cobras in India, their circle of competence did not include an understanding of the locals. Perhaps if they had consulted some Indian people and asked how they would react to such a law, they could have avoided causing a rise in the cobra population.

Second-order thinking. We often forget that our actions can have two layers of consequences, of which the first might be intended and the second unintended. With Schieffelin’s starlings, the first layer of consequences was positive and as intended. The birds survived and bred, and Shakespeare fans living in New York got to feel a bit closer to the iconic playwright. But the negative second layer of consequences dwarfed the first layer. For the parents of a child who grows up to invent a life-saving drug, the first layer of consequences is that those parents (presumably) have a fulfilling experience. The second layer of consequences is that lives are saved. When we use second-order thinking, we ask: what could happen? What if the opposite of what I expect happens? What might the results be a year, five years, or a decade from now?

***

Most unintended consequences are just unanticipated consequences. And in the world of consequences, intentions often don’t matter. Intentions, after all, apply only to positive anticipated consequences. Only in rare circumstances would someone intend to cause negative consequences.

So when we make decisions, we must ask: what will the consequences be? This is where having a toolbox of mental models becomes helpful.

***
