Category: Self-Improvement

Bad Arguments and How to Avoid Them

Productive arguments serve two purposes: to open our minds to truths we couldn’t see, and to help others do the same. Here’s how to avoid common pitfalls and argue like a master.

***

We’re often faced with situations in which we need to argue a point, whether we’re pitching an investor or competing for a contract. When being powerfully persuasive matters, it’s important that we don’t use bad arguments that prevent useful debate instead of furthering it. To do this, it’s useful to know some common ways people remove the possibility of a meaningful discussion. While it can be a challenge to keep our cool and not sink to using bad arguments when responding to a Twitter troll or during a heated confrontation over Thanksgiving dinner, we can benefit from knowing what to avoid when the stakes are high.

“If the defendant be a man of straw, who is to pay the costs?” 

— Charles Dickens

 

To start, let’s define three common types of bad arguments, or logical fallacies: “straw man,” “hollow man,” and “iron man.”

Straw man arguments

A straw man argument is a misrepresentation of an opinion or viewpoint, designed to be as easy as possible to refute. Just as a person made of straw would be easier to fight than a real human, a straw man argument is easy to knock to the ground. And just as it might look a bit like a real person from a distance, a straw man argument has the rough outline of the actual position; to an outside observer, it might even seem similar. But it lacks any semblance of substance or strength. Its sole purpose is to be easy to refute. A straw man isn’t simply an argument you happen to find inconvenient or challenging, and it may not even be invalid in itself; its flaw is that it doesn’t represent what the other person actually said.

It’s important not to confuse a straw man argument with a simplified summary of a complex argument. When we’re having a debate, we may sometimes need to explain an opponent’s position back to them to ensure we understand it. That explanation will, by necessity, be a briefer version. But it is only a straw man if the simplification is used to make the position easier to attack, rather than to facilitate clearer understanding.

There are a number of common tactics used to construct straw man arguments. One is per fas et nefas (Latin for “through right and wrong”), which involves refuting one of the reasons for an opponent’s argument and then claiming that this discredits everything they’ve said. Often, this type of straw man argument focuses on an irrelevant or unimportant detail, selecting the weakest part of the argument. Even though the arguer has no response to the rest of the case, they purport to have disproven it in its entirety. As Doug Walton, professor of philosophy at the University of Winnipeg, puts it, “The straw man tactic is essentially to take some small part of an arguer’s position and then treat it as if that represented his larger position, even though it is not really representative of that larger position. It is a form of generalizing from one aspect to a larger, broader position, but not in a representative way.”

Oversimplifying an argument makes it easier to attack by removing any important nuance. An example is the “peanut butter argument,” which states that life cannot have evolved through natural selection because we do not see new life forms spontaneously appearing inside sealed jars of peanut butter. The argument claims evolutionary theory asserts life emerged through a simple combination of matter and heat, both of which are present in a jar of peanut butter. It is a straw man because it treats an incorrect statement about evolution as representative of the whole theory. The defender of evolution gets trapped into explaining a position they never held: why life doesn’t spontaneously develop inside a jar of peanut butter.

Another tactic is to exaggerate a line of reasoning to the point of absurdity, thus making it easier to refute. An example would be someone claiming that a politician who is not opposed to immigration must therefore favor open borders, with no restrictions on who can enter the country. Since that is an extreme view few people hold, the politician then feels obligated to defend border controls and risks losing control of the debate and being branded a hypocrite.

“The light obtained by setting straw men on fire is not what we mean by illumination.”

— Adam Gopnik

 

Straw man arguments can also respond to a tangential point instead of the actual claim—for example, responding to the point that wind turbines are a more environmentally friendly means of generating energy than fossil fuels by saying, “But wind turbines are ugly.” The objection has a loose connection to the topic, yet the way wind turbines look doesn’t discredit their benefits for power generation. A person who raises a point like that is likely doing so because they know they have no rebuttal for the actual assertion.

Quoting an argument out of context is another straw man tactic. “Quote mining” is the practice of stripping out any part of a source that proves inconvenient, often using ellipses to mark the omissions. For instance, film posters and book blurbs will sometimes take quotes from bad reviews out of context to make them seem positive. So, “It’s amazing how bad this film is” becomes “Amazing,” and “The perfect book for people who wish to be bored to tears” becomes “The perfect book.” Reviewers face an uphill battle in trying not to write anything that could be taken out of context in this manner.

Hollow man arguments

A hollow man argument is similar to a straw man one. The difference is that it is a weak case attributed to a non-existent group: someone fabricates a viewpoint that is easy to refute, then claims it is held by a group they disagree with. Arguing against an opponent who doesn’t exist is a pretty easy way to win any debate. People who use hollow man arguments will often favor vague, non-specific language, without explicitly citing any sources or stating who their opponent is.

Hollow man arguments slip into debate because they’re a lazy way of making a seemingly strong point without risking refutation or being accountable for the actual strength of a line of reasoning. In Why We Argue (And How We Should): A Guide to Political Disagreement, Scott F. Aikin and Robert B. Talisse write that “speakers commit the hollow man when they respond critically to arguments that nobody on the opposing side has ever made. The act of erecting a hollow man is an argumentative failure because it distracts attention away from the actual reasons and argument given by one’s opposition. . . . It is a full-bore fabrication of the opposition.”

An example of a hollow man argument would be the claim that animal rights activists want humans and non-human animals to have a perfectly equal legal standing, meaning that dogs would have to start wearing clothes to avoid being arrested for public indecency. This is a hollow man because no one has said that all laws applying to humans should also apply to dogs.

“The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum.”

— Noam Chomsky

 

Iron man arguments

An iron man argument is one constructed in such a way that it is resistant to attacks by a challenger. Iron man arguments are difficult to avoid because they have a lot of overlap with legitimate debate techniques. The distinction is whether the person using them is doing so to prevent opposition altogether or if they are open to changing their minds and listening to an opposer. Being proven wrong is painful, which is why we often unthinkingly resort to shielding ourselves from it using iron man arguments.

Someone using an iron man argument often makes their own stance so vague that nothing anyone says about it can weaken it. They’ll make liberal use of caveats, jargon, and imprecise terms. This means they can claim anyone who disagrees didn’t understand them, or they’ll rephrase their contention repeatedly. You could compare this to the language used in the average horoscope or in a fortune cookie. It’s so vague that it’s hard to disagree with or label it as incorrect because it can’t be incorrect. It’s like boxing with a wisp of steam.

An example would be a politician who answers a difficult question about their policies by saying, “I think it’s important that we take the best possible actions to benefit the people of this country. Our priority in this situation is to implement policies that have a positive impact on everyone in society.” They’ve responded to the question without actually saying anything that anyone could disagree with.

Why bad arguments are harmful

What is the purpose of debate? Most of us, if asked, would say it’s about helping someone with an incorrect, harmful idea see the light. It’s an act of kindness. It’s about getting to the truth.

But the way we tend to engage in debate contradicts our supposed intentions.

Much of the time, we’re really debating because we want to prove we’re right and our opponent is wrong. Our interest is not in getting to the truth. We don’t even consider the possibility that our opponent might be correct or that we could learn something from them.

As decades of psychological research indicate, our brains are always trying to save energy, and one way they do so is by resisting changes to our existing beliefs. It’s much easier to cling to what we already believe through whatever means possible and ignore anything that challenges it. Bad arguments enable us to engage in what looks like a debate without any risk of being forced to question what we stand for.

We debate for other reasons, too. Sometimes we’re out to entertain ourselves. Or we want to prove we’re smarter than someone else. Or we’re secretly addicted to the shot of adrenaline we get from picking a fight. And that’s what we’re doing—fighting, not arguing. In these cases, it’s no surprise that shoddy tactics like using straw man or hollow man arguments emerge.

It’s never fun to admit we’re wrong or to have to change our minds. But it is essential if we want to get smarter and see the world as it is, not as we want it to be. Any time we engage in debate, we need to be honest about our intentions. What are we trying to achieve? Are we open to changing our minds? Are we listening to our opponent? Only when we’re aiming for a balanced discussion, with the possibility of changing our minds, can a debate be productive and free of logical fallacies.

Bad arguments are harmful to everyone involved in a debate. They don’t get us anywhere because we’re not tackling an opponent’s actual viewpoint. This means we have no hope of convincing them. Worse, this sort of underhand tactic is likely to make an opponent feel frustrated and annoyed by the deliberate misrepresentation of their beliefs. They’re forced to listen to a refutation of something they don’t even believe in the first place, which insults their intelligence. Feeling attacked like this only makes them hold on tighter to their actual belief. It may even make them less willing to engage in any sort of debate in the future.

And if you’re a chronic constructor of bad arguments, as many of us are, it leads people to avoid challenging you or starting discussions. Which means you don’t get to learn from them or have your views questioned. In formal situations, using bad arguments makes it look like you don’t really have a strong point in the first place.

How to avoid using bad arguments

If you want to have useful, productive debates, it’s vital to avoid using bad arguments.

The first thing we need to do to avoid constructing bad arguments is to accept it’s something we’re all susceptible to. It’s easy to look at a logical fallacy and think of all the people we know who use it. It’s much harder to recognize it in ourselves. We don’t always realize when the point we’re making isn’t that strong.

Bad arguments are almost unavoidable if we haven’t taken the time to research both sides of the debate. The map is not the territory: our perception of an opinion is not the opinion itself. The most useful thing we can do is attempt to see the territory. That brings us to steel man arguments and the ideological Turing test.

Steel man arguments

The most powerful way to avoid using bad arguments, and to discourage their use by others, is to follow the principle of charity: argue against the strongest and most persuasive version of an opponent’s position. In this case, we suspend disbelief and set aside our own opinions for long enough to understand where they’re coming from. We recognize the good sides of their case and engage with its strengths. Ask questions to clarify anything you don’t understand. Be curious about the other person’s perspective. You might not change their mind, but you will at least learn something and hopefully reduce any conflict in the process.

“It is better to debate a question without settling it than to settle a question without debating it.”

— Joseph Joubert

 

In Intuition Pumps and Other Tools for Thinking, the philosopher Daniel Dennett offers some general guidelines for using the principle of charity, formulated by social psychologist and game theorist Anatol Rapoport:

  1. You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”
  2. You should list any points of agreement (especially if they are not matters of general or widespread agreement).
  3. You should mention anything you have learned from your target.
  4. Only then are you permitted to say so much as a word of rebuttal or criticism.

An argument that is the strongest version of an opponent’s viewpoint is known as a steel man. It’s purposefully constructed to be as difficult as possible to attack. The idea is that we can only say we’ve won a debate when we’ve fought with a steel man, not a straw one. Seeing as we’re biased towards tackling weaker versions of an argument, often without realizing it, steel manning lets us err on the side of caution.

As challenging as this might be, it serves a bigger-picture purpose. Steel man arguments help us understand a new perspective, however ludicrous it might seem in our eyes, so we’re better positioned to respond to it and to connect with the person holding it in the future. It shows a challenger we are empathetic and willing to listen, regardless of personal opinion. The point is to see the strengths, not the weaknesses. If we’re open-minded, not combative, we can learn a lot.

“He who knows only his side of the case knows little of that.”

— John Stuart Mill

 

An exercise in steel manning, the ideological Turing test, proposes that we cannot say we understand an opponent’s position unless we would be able to argue in favor of it so well that an observer would not be able to tell which opinion we actually hold. In other words, we shouldn’t hold opinions we can’t argue against. The ideological Turing test is a great thought experiment to establish whether you understand where an opponent is coming from.

We don’t have the option to do this for every single thing we disagree with. But when a debate is extremely important to us, the ideological Turing test can be a helpful tool for ensuring we’re fully prepared for a high-stakes discussion.

How to handle other people using bad arguments

“You could not fence with an antagonist who met rapier thrust with blow of battle axe.”

— L.M. Montgomery

 

Let’s say you’re in the middle of a debate with someone with a different opinion than yours. You’re responding to the steel man version of their explanation, staying calm and measured. But what do you do if your opponent starts using bad arguments against you? What if they’re not listening to you?

The first thing you can do when someone uses a bad argument against you is the simplest: point it out. Explain what they’re doing and why it isn’t helpful. There’s not much point in merely telling them they’re using a straw man argument or some other logical fallacy; if they’re not familiar with the concept, it may just seem like alienating jargon. There’s also little value in wielding it as a “gotcha!”, which will only heighten tensions. It’s best to define the concept, then reiterate your actual beliefs and how they differ from the bad argument your opponent is attacking.

T. Edward Damer writes in Attacking Faulty Reasoning, “It is not always possible to know whether an opponent has deliberately distorted your argument or has simply failed to understand or interpret it in the way that you intended. For this reason, it might be helpful to recapitulate the basic outline . . . or [ask] your opponent to summarize it for you.”

If this doesn’t work, you can continue to repeat your original point and make no attempt to engage with the distorted version. Should your opponent prove unwilling to recognize their use of a bad argument (and you’re 100% certain that’s what they’re doing), it’s worth considering whether there is any point in continuing the debate. The reality is that most of the debates we have are not rationally thought out; they’re emotionally driven. This is even more pertinent when we’re arguing with people we have a complex relationship with. Sometimes, it’s better to walk away.

Conclusion

The bad arguments discussed here are incredibly common logical fallacies in debates. We often use them without realizing it or experience them without recognizing it. But these types of debates are unproductive and unlikely to help anyone learn. If we want our arguments to create buy-in and not animosity, we need to avoid making bad ones.

Why We Focus on Trivial Things: The Bikeshed Effect

Bikeshedding is a metaphor to illustrate the strange tendency we have to spend excessive time on trivial matters, often glossing over important ones. Here’s why we do it, and how to stop.

***

How can we stop wasting time on unimportant details? From meetings at work that drag on forever without achieving anything to weeks-long email chains that don’t solve the problem at hand, we seem to spend an inordinate amount of time on the inconsequential. Then, when an important decision needs to be made, we hardly have any time to devote to it.

To answer this question, we first have to recognize why we get bogged down in the trivial. Then we must look at strategies for changing group dynamics so that we generate both useful input and the time to consider it.

The Law of Triviality

You’ve likely heard of Parkinson’s Law, which states that tasks expand to fill the amount of time allocated to them. But you might not have heard of the lesser-known Parkinson’s Law of Triviality, also coined by British naval historian and author Cyril Northcote Parkinson in the 1950s.

The Law of Triviality states that the amount of time spent discussing an issue in an organization is inversely correlated to its actual importance in the scheme of things. Major, complex issues get the least discussion while simple, minor ones get the most discussion.

Parkinson’s Law of Triviality is also known as “bike-shedding,” after the story Parkinson uses to illustrate it. He asks readers to imagine a financial committee meeting to discuss a three-point agenda. The points are as follows:

  1. A proposal for a £10 million nuclear power plant
  2. A proposal for a £350 bike shed
  3. A proposal for a £21 annual coffee budget

What happens? The committee ends up running through the nuclear power plant proposal in little time. It’s too advanced for anyone to really dig into the details, and most of the members don’t know much about the topic in the first place. The one member who does is unsure how to explain it to the others. Another member suggests a redesign, but producing one seems like such a huge task that the rest of the committee declines to consider it.

The discussion soon moves to the bike shed. Here, the committee members feel much more comfortable voicing their opinions. They all know what a bike shed is and what it looks like. Several members begin an animated debate over the best possible material for the roof, weighing out options that might enable modest savings. They discuss the bike shed for far longer than the power plant.

At last, the committee moves on to item three: the coffee budget. Suddenly, everyone’s an expert. They all know about coffee and have a strong sense of its cost and value. Before anyone realizes what is happening, they spend longer discussing the £21 coffee budget than the power plant and the bike shed combined! In the end, the committee runs out of time and decides to meet again to complete their analysis. Everyone walks away feeling satisfied, having contributed to the conversation.

Why this happens

Bike-shedding happens because the simpler a topic is, the more people will have an opinion on it and thus more to say about it. When something is outside of our circle of competence, like a nuclear power plant, we don’t even try to articulate an opinion.

But when something is just about comprehensible to us, even if we don’t have anything of genuine value to add, we feel compelled to say something, lest we look stupid. What idiot doesn’t have anything to say about a bike shed? Everyone wants to show that they know about the topic at hand and have something to contribute.

With any issue, we shouldn’t be according equal importance to every opinion anyone adds. We should emphasize the inputs from those who have done the work to have an opinion. And when we decide to contribute, we should be putting our energy into the areas where we have something valuable to add that will improve the outcome of the decision.

Strategies for avoiding bike-shedding

The main thing you can do to avoid bike-shedding is to give your meeting a clear purpose. In The Art of Gathering: How We Meet and Why It Matters, Priya Parker, who has decades of experience designing high-stakes gatherings, says that any successful gathering (including a business meeting) needs to have a focused and particular purpose. “Specificity,” she says, “is a crucial ingredient.”

Why is having a clear purpose so critical? Because you use it as the lens to filter all other decisions about your meeting, including who to have in the room.

With that in mind, we can see that it’s probably not a great idea to discuss building a nuclear power plant and a bike shed in the same meeting. There’s not enough specificity there.

The key is to recognize that not all the available input on an issue needs to be considered. The most informed opinions are the most relevant. This is one reason why big meetings with lots of people present, most of whom don’t need to be there, are such a waste of time in organizations. Everyone wants to participate, but not everyone has anything meaningful to contribute.

When it comes to choosing your list of invitees, Parker writes, “if the purpose of your meeting is to make a decision, you may want to consider having fewer cooks in the kitchen.” If you don’t want bike-shedding to occur, avoid inviting contributions from those who are unlikely to have relevant knowledge and experience. Getting the result you want—a thoughtful, educated discussion about that power plant—depends on having the right people in the room.

It also helps to have a designated individual in charge of making the final judgment. When we make decisions by committee with no one in charge, reaching a consensus can be almost impossible. The discussion drags on and on. The individual can decide in advance how much importance to accord to the issue (for instance, by estimating how much its success or failure could help or harm the company’s bottom line). They can set a time limit for the discussion to create urgency. And they can end the meeting by verifying that it has indeed achieved its purpose.

Any issue that invites a lot of discussion from different people might not be the most important one at hand. Avoid descending into unproductive triviality by having clear goals for your meeting and getting the best people to the table to have a productive, constructive discussion.

What You Truly Value

Our devotion to our values gets tested in the face of a true crisis. But it’s also an opportunity to reconnect, recommit, and sometimes, bake some bread.

***

The recent outbreak of the coronavirus is impacting people all over the world — not just in terms of physical health, but financially, emotionally, and even socially. As we struggle to adapt to our new circumstances, it can be tempting to bury our heads and wait for it all to blow over so we can just get back to normal. Or we can see this as an incredible opportunity to figure out who we are.

What many of us are discovering right now is that the things we valued a few months ago don’t actually matter: our cars, the titles on our business cards, our privileged neighborhoods. Rather, what is coming to the forefront is what we find intrinsically rewarding.

When everything is easy, it can seem like you have life figured out. When things change and you’re called to put your principles into practice, it’s a different matter. It’s one thing to say you are stoic when your coffee spills and another entirely when you’re watching your community collapse. When life changes and gets hard, you realize you’ve never had to put into practice what you thought you knew about coping with disaster.

But when a crisis hits, everything is put to the real test.

The challenge then becomes connecting our struggles to our values, because what we value only has meaning if it still matters when life is hard. To know if they have worth, your values need to help you move forward when you can barely crawl and the obstacles in your way seem insurmountable.

In the face of a crisis, what is important to us becomes evident when we give ourselves the space to reflect on what is going to get us through the hard times. And so we find renewed commitment to get back to core priorities. What seemed important before falls apart to reveal what really matters: family, love, community, health.

“I was 32 when I started cooking; up until then, I just ate.” 

— Julia Child

One unexpected activity that many people are turning to now that they have time and are more introspective is baking. In fact, this week Google searches for bread recipes hit a noticeable high.


Baking is a very physical experience: kneading dough, tasting batter, smelling the results of the ingredients coming together. It’s an activity that requires patience. Bread has to rise. Pies have to cook. Cakes have to cool before they can be covered with icing. And, as prescriptive as baking seems on its surface, it’s something that facilitates creativity as we improvise our ingredients based on what we have in the cupboard. We discover new flavors, and we comfort ourselves and others with the results. Baked goods are often something we share, and in doing so we are providing for those we care about.

Why might baking be useful in times of stress? In Overcoming Anxiety, Dennis Tirch explains “research has demonstrated that when people engage more fully in behaviors that give them a sense of pleasure and mastery, they can begin to overcome negative emotions.”

At home with their loved ones, people can reconsider what they value, one muffin at a time. Creating with the people we love instead of consuming on our own allows us to focus on what we value as the world changes around us. With more time, slow, seemingly unproductive pursuits have new appeal because they help us reorient to the qualities in life that matter most.

Giving yourself the space to tune in to your values doesn’t have to come through baking. What’s important is that you find an activity that lets you move past fear and panic, to reconnect with what gives your life meaning. When you engage with an activity that gives you pleasure and releases negative emotions, it allows you to rediscover what is important to you.

Change is stressful. But neither stress nor change has to be scary. If you think about it, you undergo moments of change every day because nothing in life is ever static. Our lives are a constant adaptation to a world that is always in motion.

All change brings opportunity. Some change gives us the opportunity to pause and ask what we can do better. How can we better connect to what has proven to be important? Connection is not an abstract intellectual exercise, but an experience that orients us to the values that provide us direction. If you look for opportunities in line with your values, you will be able to see a path through the fear and uncertainty guided by the light that is hope.

The Illusory Truth Effect: Why We Believe Fake News, Conspiracy Theories and Propaganda

When a “fact” tastes good and is repeated enough, we tend to believe it, no matter how false it may be. Understanding the illusory truth effect can keep us from being bamboozled.

***

A recent Verge article looked at some of the unsavory aspects of the work of Facebook content moderators—the people who spend their days cleaning up the social network’s most toxic content. One strange detail stands out. The moderators the Verge spoke to reported that they and their coworkers often found themselves believing fringe, often hatemongering conspiracy theories they would have dismissed under normal circumstances. Others described experiencing paranoid thoughts and intense fears for their safety.

An overnight switch from skepticism to fervent belief in conspiracy theories is not unique to content moderators. In a Nieman Lab article, Laura Hazard Owen explains that researchers who study the spread of disinformation online can find themselves struggling to be sure of their own beliefs and needing to make an active effort to counteract what they see. Some of the most fervent, passionate conspiracy theorists admit that they first fell into the rabbit hole when they tried to debunk the beliefs they now hold. There’s an explanation for why this happens: the illusory truth effect.

The illusory truth effect

“Facts do not cease to exist because they are ignored.”

— Aldous Huxley

Not everything we believe is true. We may act like it is and it may be uncomfortable to think otherwise, but it’s inevitable that we all hold a substantial number of beliefs that aren’t objectively true. It’s not about opinions or different perspectives. We can pick up false beliefs for the simple reason that we’ve heard them a lot.

If I say that the moon is made of cheese, no one reading this is going to believe that, no matter how many times I repeat it. That statement is too ludicrous. But what about something a little more plausible? What if I said that moon rock has the same density as cheddar cheese? And what if I wasn’t the only one saying it? What if you’d also seen a tweet touting this amazing factoid, perhaps also heard it from a friend at some point, and read it in a blog post?

Unless you’re a geologist, a lunar fanatic, or otherwise in possession of an unusually good radar for moon rock-related misinformation, there is a not insignificant chance you would end up believing a made-up fact like that, without thinking to verify it. You might repeat it to others or share it online. This is how the illusory truth effect works: we all have a tendency to believe something is true after being exposed to it multiple times. The more times we’ve heard something, the truer it seems. The effect is so powerful that repetition can persuade us to believe information we knew was false in the first place. Ever thought a product was stupid, but somehow you ended up buying it on a regular basis? Or thought that new manager was okay, but now find yourself gossiping about her?

The illusory truth effect is the reason why advertising works and why propaganda is one of the most powerful tools for controlling how people think. It’s why politicians can get away with repeating the same dubious claims and why the wrong answers on multiple-choice tests can cause students problems later on. It’s why fake news spreads and retractions of misinformation don’t work. In this post, we’re going to look at how the illusory truth effect works, how it shapes our perception of the world, and how we can avoid it.

The discovery of the illusory truth effect

“Rather than love, than money, than fame, give me truth.”

— Henry David Thoreau

The illusory truth effect was first described in a 1977 paper entitled “Frequency and the Conference of Referential Validity,” by Lynn Hasher and David Goldstein of Temple University and Thomas Toppino of Villanova University. In the study, the researchers presented a group of students with 60 statements and asked them to rate how certain they were that each was either true or false. The statements came from a range of subjects and were all intended to be not too obscure, but unlikely to be familiar to study participants. Each statement was objective—it could be verified as either correct or incorrect and was not a matter of opinion. For example, “the largest museum in the world is the Louvre in Paris” was true.

Students rated their certainty three times, with two weeks between evaluations. Some of the statements were repeated each time, while others were not. With each repetition, students became more certain about the statements they labeled as true. It seemed that they were using familiarity as a gauge for how confident they should be in their beliefs.

An important detail is that the researchers did not repeat the first and last 10 items on each list. They felt students would be most likely to remember these and be able to research them before the next round of the study. While the study was not conclusive evidence of the existence of the illusory truth effect, subsequent research has confirmed its findings.

Why the illusory truth effect happens

“The sad truth is the truth is sad.”

— Lemony Snicket

Why does repetition of a fact make us more likely to believe it, and to be more certain of that belief? As with other cognitive shortcuts, the typical explanation is that it’s a way our brains save energy. Thinking is hard work—remember that the human brain uses up about 20% of an individual’s energy, despite accounting for just 2% of their body weight.

The illusory truth effect comes down to processing fluency. When a thought is easier to process, it requires our brains to use less energy, which leads us to prefer it. The students in Hasher’s original study recognized the repeated statements, even if not consciously. That means that processing them was easier for their brains.

Processing fluency seems to have a wide impact on our perception of truthfulness. Rolf Reber and Norbert Schwarz, in their article “Effects of Perceptual Fluency on Judgments of Truth,” found that statements presented in an easy-to-read color are judged as more likely to be true than ones presented in a less legible way. In their article “Birds of a Feather Flock Conjointly (?): Rhyme as Reason in Aphorisms,” Matthew S. McGlone and Jessica Tofighbakhsh found that aphorisms that rhyme (like “what sobriety conceals, alcohol reveals”), even if someone hasn’t heard them before, seem more accurate than non-rhyming versions. Once again, they’re easier to process.

Fake news

“One of the saddest lessons of history is this: If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken.”

— Carl Sagan

The illusory truth effect is one factor in why fabricated news stories sometimes gain traction and have a wide impact. When this happens, our knee-jerk reaction can be to assume that anyone who believes fake news must be unusually gullible or outright stupid. Evan Davis writes in Post Truth, “Never before has there been a stronger sense that fellow citizens have been duped and that we are all suffering the consequences of their intellectual vulnerability.” As Davis goes on to write, this assumption isn’t helpful for anyone. We can’t begin to understand why people believe seemingly ludicrous news stories until we consider some of the psychological reasons why this might happen.

Fake news falls under the umbrella of “information pollution,” which also includes news items that misrepresent information, take it out of context, parody it, fail to check facts or do background research, or take claims from unreliable sources at face value. Some of this news gets published on otherwise credible, well-respected news sites due to simple oversight. Some goes on parody sites that never purport to tell the truth, yet are occasionally mistaken for serious reporting. Some shows up on sites that replicate the look and feel of credible sources, using similar web design and web addresses. And some fake news comes from sites dedicated entirely to spreading misinformation, without any pretense of being anything else.

A lot of information pollution falls somewhere in between the extremes that tend to get the most attention. It’s the result of people being overworked or in a hurry and unable to do the due diligence that reliable journalism requires. It’s what happens when we hastily tweet something or mention it in a blog post and don’t realize it’s not quite true. It extends to miscited quotes, doctored photographs, fiction books masquerading as memoirs, or misleading statistics.

The signal-to-noise ratio is so skewed that we have a hard time figuring out what to pay attention to and what to ignore. No one has time to verify everything they read online. No one. (And no, offline media certainly isn’t perfect either.) Our information processing capabilities are not infinite, and the more we consume, the harder it becomes to assess its value.

Moreover, we’re often far outside our circle of competence, reading about topics in which we don’t have the expertise to assess accuracy in any meaningful way. This drip-drip of information pollution is not harmless. Like air pollution, it builds up over time, and the more we’re exposed to it, the more likely we are to end up picking up false beliefs which are then hard to shift. For instance, a lot of people believe that crime, especially the violent kind, is on an upward trend year by year—in a 2016 study by Pew Research, 57% of Americans believed crime had worsened since 2008, despite violent crime having actually fallen by nearly a fifth during that time. This false belief may stem from the fact that violent crime receives a disproportionate amount of media coverage, giving it wide and repeated exposure.

When people are asked to rate the apparent truthfulness of news stories, they score ones they have read multiple times as more truthful than those they haven’t. Danielle C. Polage, in her article “Making Up History: False Memories of Fake News Stories,” explains that a false story someone has been exposed to more than once can seem more credible than a true one they’re seeing for the first time. In experimental settings, people also misattribute their previous exposure to stories, believing they read a news item from another source when they actually saw it in an earlier part of the study. Even when people know the story is part of the experiment, they sometimes think they’ve also read it elsewhere. The repetition is all that matters.

Given enough exposure to contradictory information, there is almost no knowledge that we won’t question.

Propaganda

“If a lie is only printed often enough, it becomes a quasi-truth, and if such a truth is repeated often enough, it becomes an article of belief, a dogma, and men will die for it.”

— Isa Blagden

Propaganda and fake news are similar. By relying on repetition, disseminators of propaganda can change the beliefs and values of people.

Propaganda has a lot in common with advertising, except instead of selling a product or service, it’s about convincing people of the validity of a particular cause. Propaganda isn’t necessarily malicious; sometimes the cause is improved public health or boosting patriotism to encourage military enrollment. But often propaganda is used to undermine political processes to further narrow, radical, and aggressive agendas.

During World War II, the graphic designer Abram Games served as the official war artist for the British government. Games’s work is iconic and era-defining for its punchy, brightly colored visual style. His army recruitment posters would often feature a single figure rendered in a proud, strong, admirable pose with a mere few words of text. They conveyed to anyone who saw them the sorts of positive qualities they would supposedly gain through military service. Whether this was true or not was another matter. Through repeated exposure to the posters, Games instilled the image the army wanted to create in the minds of viewers, affecting their beliefs and behaviors.

Today, propaganda is more likely to be a matter of quantity over quality. It’s not about a few artistic posters. It’s about saturating the intellectual landscape with content that supports a group’s agenda. With so many demands on our attention, old techniques are too weak.

Researchers Christopher Paul and Miriam Matthews at the RAND Corporation refer to this method of bombarding people with fabricated information as the “firehose of falsehood” model. While their report focuses on modern Russian propaganda, the techniques it describes are not confined to Russia. They make use of the illusory truth effect, alongside other cognitive shortcuts. Firehose propaganda has four distinct features:

  • High-volume and multi-channel
  • Rapid, continuous and repetitive
  • Makes no commitment to objective reality
  • Makes no commitment to consistency

Firehose propaganda is predicated on exposing people to the same messages as frequently as possible. It involves a large volume of content, repeated again and again across numerous channels: news sites, videos, radio, social media, television and so on. These days, as the report describes, this can also include internet users who are paid to repeatedly post in forums, chat rooms, comment sections and on social media disputing legitimate information and spreading misinformation. It is the sheer volume that succeeds in obliterating the truth. Research into the illusory truth effect suggests that we are further persuaded by information heard from multiple sources, hence the efficacy of funneling propaganda through a range of channels.

Seeing as repetition leads to belief in many cases, firehose propaganda doesn’t need to pay attention to the truth or even to be consistent. A source doesn’t need to be credible for us to end up believing its messages. Fact-checking is of little help because it adds to the repetition, yet we feel compelled to respond to obviously untrue propagandistic material rather than ignore it.

Firehose propaganda does more than spread fake news. It nudges us towards feelings like paranoia, mistrust, suspicion, and contempt for expertise. All of this makes future propaganda more effective. Unlike those espousing the truth, propagandists can move fast because they’re making up some or all of what they claim, meaning they gain a foothold in our minds first. First impressions are powerful. Familiarity breeds trust.

How to combat the illusory truth effect

So how can we protect ourselves from believing false news and being manipulated by propaganda due to the illusory truth effect? The best route is to be far more selective. The information we consume is like the food we eat. If it’s junk, our thinking will reflect that.

We don’t need to spend as much time reading the news as most of us do. As with many other things in life, more can be less. The vast majority of the news we read is just information pollution. It doesn’t do us any good.

One of the best solutions is to quit the news. This frees up time and energy to engage with timeless wisdom that will improve your life. Try it for a couple of weeks. And if you aren’t convinced, read a few days’ worth of newspapers from 1978. You’ll see how little the news really matters in the long run.

If you can’t quit the news habit, stick to reliable, well-known news sources that have a reputation to uphold. Steer clear of dubious sources whenever you can—even if you treat them as entertainment, you might still end up absorbing what they say. Research unfamiliar sources before trusting them. Be cautious of sites that are funded entirely by advertising (or that pay their journalists based on views), and seek to support reader-funded news sources you get value from if possible. Prioritize sites that treat their journalists well and don’t expect them to churn out dozens of thoughtless articles per day. Don’t rely on news from social media posts that lack sources or come from people speaking outside their circle of competence.

Avoid treating the news as entertainment to passively consume on the bus or while waiting in line. Be mindful about it—if you want to inform yourself on a topic, set aside designated time to learn about it from multiple trustworthy sources. Don’t assume breaking news is better, as it can take some time for the full details of a story to come out and people may be quick to fill in the gaps with misinformation. Accept that you can’t be informed about everything and most of it isn’t important. Pay attention to when news items make you feel outrage or other strong emotions, because this may be a sign of manipulation. Be aware that correcting false information can further fuel the illusory truth effect by adding to the repetition.

We can’t stop the illusory truth effect from existing. But we can recognize that it is a reality and seek to prevent ourselves from succumbing to it in the first place.

Conclusion

Our memories are imperfect. We are easily led astray by the illusory truth effect, which can direct what we believe and even change our understanding of the past. It’s not about intelligence—this happens to all of us. This effect is too powerful for us to override it simply by learning the truth. Cognitively, there is no distinction between a genuine memory and a false one. Our brains are designed to save energy and it’s crucial we accept that.

We can’t just step back and assume the illusory truth effect only applies to other people. It applies to everyone. We’re all responsible for our own beliefs. We can’t pin the blame on the media or social media algorithms or whatever else. When we put effort into thinking about and questioning the information we’re exposed to, we’re less vulnerable to the illusory truth effect. Knowing about the effect is the best way to identify when it’s distorting our worldview. Before we use information as the basis for important decisions, it’s a good idea to verify whether it’s true or whether it’s simply something we’ve heard a lot.

Truth is a precarious thing, not because it doesn’t objectively exist, but because the incentives to warp it can be so strong. It’s up to each of us to seek it out.

The Positive Side of Shame

Recently, shame has gotten a bad rap. It’s been branded as toxic and destructive. But shame can be used as a tool to effect positive change.

***

A computer science PhD candidate uncovers significant privacy-violating security flaws in large companies, then shares them with the media to attract negative coverage. Google begins marking unencrypted websites as unsafe, showing a red cross in the URL bar. A nine-year-old girl posts pictures of her school’s abysmal lunches on a blog, leading the local council to step in.

What do all of these stories have in common? They’re all examples of shame serving as a tool to encourage structural change.

Shame, like all emotions, exists because it conferred a meaningful survival advantage for our ancestors. It is a universal experience. The body language associated with shame — inverted shoulders, averted eyes, pursed lips, bowed head, and so on — occurs across cultures. Even blind people exhibit the same body language, indicating it is innate, not learned. We would not waste our time and energy on shame if it wasn’t necessary for survival.

Shame enforces social norms. For our ancestors, the ability to maintain social cohesion was a matter of life or death. Take the almost ubiquitous social rule that states stealing is wrong. If a person is caught stealing, they are likely to feel some degree of shame. While this behavior may not threaten anyone’s survival today, in the past it could have been a sign that a group’s ability to cooperate was in jeopardy. Living in small groups in a harsh environment meant full cooperation was essential.

Through the lens of evolutionary biology, shame evolved to encourage adherence to beneficial social norms. This is backed up by the fact that shame is more prevalent in collectivist societies where people spend little to no time alone than it is in individualistic societies where people live more isolated lives.

Jennifer Jacquet argues in Is Shame Necessary?: New Uses For An Old Tool that we’re not quite through with shame yet. In fact, if we adapt it for the current era, it can help us to solve some of the most pressing problems we face. Shame gives the weak greater power. The difference is that we must shift shame from individuals to institutions, organizations, and powerful individuals. Jacquet states that her book “explores the origins and future of shame. It aims to examine how shaming—exposing a transgressor to public disapproval—a tool many of us find discomforting, might be retrofitted to serve us in new ways.”

Guilt vs. shame

Jacquet begins the book with the story of Sam LaBudde, a young man who in the 1980s became determined to target practices in the tuna-fishing industry that led to the deaths of dolphins. Tuna is often caught with purse seines, a type of large net that closes around a shoal of fish. Seeing as dolphins tend to swim alongside tuna, they are easily caught in the nets. There, they either die or suffer serious injuries.

LaBudde got a job on a tuna-fishing boat and covertly filmed dolphins dying from their injuries. For months, he hid his true intentions from the crew, spending each day both dreading and hoping for the death of a dolphin. The footage went the 1980s equivalent of viral, showing up in the media all over the world and attracting the attention of major tuna companies.

Still a child at the time, Jacquet was horrified to learn of the consequences of the tuna her family ate. She recalls it as one of her first experiences of shame related to consumption habits. Jacquet persuaded her family to boycott canned tuna altogether. So many others did the same that companies launched the “dolphin-safe” label, which ostensibly indicated compliance with guidelines intended to reduce dolphin deaths. Jacquet returned to eating tuna and thought no more of it.

The campaign to end dolphin deaths in the tuna-fishing industry was futile, however, because it was built upon guilt rather than shame. Jacquet writes, “Guilt is a feeling whose audience and instigator is oneself, and its discomfort leads to self-regulation.” Hearing about dolphin deaths made consumers feel guilty about their fish-buying habits, which conflicted with their ethical values. Those who felt guilty could deal with it by purchasing supposedly dolphin-safe tuna—provided they had the means to potentially pay more and the time to research their choices. A better approach might have been for the videos to focus on tuna companies, giving the names of the largest offenders and calling for specific change in their policies.

But individuals changing their consumption habits did not stop dolphins from dying. It failed to bring about a structural change in the industry. This, Jacquet later realized, was part of a wider shift in environmental action. She explains that it became more about consumers’ choices:

As the focus shifted from supply to demand, shame on the part of corporations began to be overshadowed by guilt on the part of consumers—as the vehicle for solving social and environmental problems. Certification became more and more popular and its rise quietly suggested that responsibility should fall more to the individual consumer rather than to political society. . . . The goal became not to reform entire industries but to alleviate the consciences of a certain sector of consumers.

Shaming, as Jacquet defines it, is about the threat of exposure, whereas guilt is personal. Shame is about the possibility of an audience. Imagine someone were to send a print-out of your internet search history from the last month to your best friend, mother-in-law, partner, or boss. You might not have experienced any guilt making the searches, but even the idea of them being exposed is likely shame-inducing.

Switching the focus of the environmental movement from shame to guilt was, at best, a distraction. It put the responsibility on individuals, even though small actions like turning off the lights count for little. Guilt is a more private emotion, one that arises regardless of exposure. It’s what you feel when you’re not happy about something you did, whereas shame is what you feel when someone finds out. Jacquet writes, “A 2013 research paper showed that just ninety corporations (some of them state-owned) are responsible for nearly two-thirds of historic carbon dioxide and methane emissions; this reminds us that we don’t all share the blame for greenhouse gas emissions.” Guilt doesn’t work because it doesn’t change the system. Taking this into account, Jacquet believes it is time for us to bring back shame, “a tool that can work more quickly and at larger scales.”

The seven habits of effective shaming

So, if you want to use shame as a force for good, as an individual or as part of a group, how can you do so in an effective manner? Jacquet offers seven pointers.

Firstly, “The audience responsible for the shaming should be concerned with the transgression.” It should be something that impacts them so they are incentivized to use shaming to change it. If it has no effect on their lives, they will have little reason to shame. The audience must be the victim. For instance, smoking rates are shrinking in many countries. Part of this may relate to the tendency of non-smokers to shame smokers. The more the former group grows, the greater their power to shame. This works because second-hand smoke impacts their health too, as do indirect tolls like strain on healthcare resources and having to care for ill family members. As Jacquet says, “Shaming must remain relevant to the audience’s norms and moral framework.”

Second, “There should be a big gap between the desired and actual behavior.” The smaller the gap, the less effective the shaming will be. A mugger stealing a handbag from an elderly lady is one thing. A fraudster defrauding thousands of retirees out of their savings is quite another. We are predisposed to fairness in general and become quite riled up when unfairness is significant. In particular, Jacquet observes, we take greater offense when it is the fault of a small group, such as a handful of corporations being responsible for the majority of greenhouse gas emissions. It’s also a matter of contrast. Jacquet cites her own research, which finds that “the degree of ‘bad’ relative to the group matters when it comes to bad apples.” The greater the contrast between the behavior of those being shamed and the rest of the group, the stronger the annoyance will be. For instance, the worse the level of pollution for a corporation is, the more people will shame it.

Third, “Formal punishment should be missing.” Shaming is most effective when it is the sole possible avenue for punishment and the transgression would otherwise go ignored. This ignites our sense of fury at injustice. Jacquet points out that the reason shaming works so well in international politics is that it is often a replacement for formal methods of punishment. If a nation commits major human rights abuses, it is difficult for another nation to use the law to punish them, as they likely have different laws. But revealing and drawing attention to the abuses may shame the nation into stopping, as they do not want to look bad to the rest of the world. When shame is the sole tool we have, we use it best.

Fourth, “The transgressor should be sensitive to the source of shaming.” The shamee must consider themselves subject to the same social norms as the shamer. Shaming an organic grocery chain for stocking unethically produced meat would be far more effective than shaming a fast-food chain for the same thing. If the transgressor sees themselves as subject to different norms, they are unlikely to be concerned.

Fifth, “The audience should trust the source of the shaming.” The shaming must come from a respectable, trustworthy, non-hypocritical source. If it does not, its impact is likely to be minimal. A news outlet that only shames one side of the political spectrum on a cross-spectrum issue isn’t going to have much impact.

Sixth, “Shaming should be directed where possible benefits are greatest.” We all have a limited amount of attention and interest in shaming. It should only be applied where it can have the greatest possible benefits and used sparingly, on the most serious transgressions. Otherwise, people will become desensitized, and the shaming will be ineffective. Wherever possible, we should target shaming at institutions, not individuals. Effective shaming focuses on the powerful, not the weak.

Seventh, “Shaming should be scrupulously implemented.” Shaming needs to be carried out consistently. The threat can be more useful than the act itself, which is why it may need to be applied on a regular basis. For instance, an annual report on the companies guilty of the most pollution is more meaningful than a one-off report. Companies learn to anticipate it and preemptively change their behavior. Jacquet explains that “shame’s performance is optimized when people reform their behavior in response to its threat and remain part of the group. . . . Ideally, shaming creates some friction but ultimately heals without leaving a scar.”

To summarize, Jacquet writes: “When shame works without destroying anyone’s life, when it leads to reform and reintegration rather than fight or flight, or, even better, when it acts as a deterrent against bad behavior, shaming is performing optimally.”

***

Due to our negative experiences with shame on a personal level, we may be averse to viewing it in the light Jacquet describes: as an important and powerful tool. But “shaming, like any tool, is on its own amoral and can be used to any end, good or evil.” The way we use it is what matters.

According to Jacquet, we should not use shame to target transgressions that have minimal impact or are the fault of individuals with little power. We should use it when the outcome will be a broader benefit for society and when formal means of punishment have been exhausted. It’s important that the shaming be proportional and done deliberately, not as an act of personal vindictiveness.

Is Shame Necessary? is a thought-provoking read and a reminder of the power we have as individuals to contribute to meaningful change in the world. One way is to rethink how we view shame.

The Inner Game: Why Trying Too Hard Can Be Counterproductive

The standard way of learning is far from being the fastest or most enjoyable. It’s slow, makes us second-guess ourselves, and interferes with our natural learning process. Here we explore a better way to learn and enjoy the process.

***

It’s the final moment before an important endeavor—a speech, a performance, a presentation, an interview, a date, or perhaps a sports match. Up until now, you’ve felt good and confident about your abilities. But suddenly, something shifts. You feel a wave of self-doubt. You start questioning how well you prepared. The urge to run away and sabotage the whole thing starts bubbling to the surface.

As hard as you try to overcome your inexplicable insecurity, something tells you that you’ve already lost. And indeed, things don’t go well. You choke up, forget what you meant to say, long to just walk out, or make silly mistakes. None of this comes as a surprise—you knew beforehand that something had gone wrong in your mind. You just don’t know why.

Conversely, perhaps you’ve been in a situation where you knew you’d succeeded before you even began. You felt confident and in control. Your mind could focus with ease, impervious to self-doubt or distraction. Obstacles melted away, and abilities you never knew you possessed materialized.

This phenomenon—winning or losing something in your mind before you win or lose it in reality—is what tennis player and coach W. Timothy Gallwey first called “the Inner Game” in his book The Inner Game of Tennis. Gallwey wrote the book in the 1970s when people viewed sport as a purely physical matter. Athletes focused on their muscles, not their mindsets. Today, we know that psychology is in fact of the utmost importance.

Gallwey recognized that physical ability was not the full picture in any sport. In tennis, success is as much mental as physical because there are really two games going on: the Inner Game and the Outer Game. If a player doesn’t pay attention to how they play the Inner Game—against their insecurities, their wandering mind, their self-doubt and uncertainty—they will never be as good as they have the potential to be. The Inner Game is fought against your own self-defeating tendencies, not against your actual opponent. Gallwey writes in the introduction:

Every game is composed of two parts, an outer game, and an inner game. . . . It is the thesis of this book that neither mastery nor satisfaction can be found in the playing of any game without giving some attention to the relatively neglected skills of the inner game. This is the game that takes place in the mind of the player, and it is played against such obstacles as lapses in concentration, nervousness, self-doubt, and self-condemnation. In short, it is played to overcome all habits of mind which inhibit excellence in performance. . . . Victories in the inner game may provide no additions to the trophy case, but they bring valuable rewards which are more permanent and which can contribute significantly to one’s success, off the court as well as on.

Ostensibly, The Inner Game of Tennis is a book about tennis. But dig beneath the surface, and it teems with techniques and insights we can apply to any challenge. The book is really about overcoming the internal obstacles we create that prevent us from succeeding. You don’t need to be interested in tennis or even know anything about it to benefit from this book.

One of the most important insights Gallwey shares is that a major reason we lose the Inner Game is that we try too hard and interfere with our own natural learning capabilities. Let’s take a look at how we can win the Inner Game in our own lives by learning not to force things.

The Two Sides of You

Gallwey was not a psychologist. But his experience as both a tennis player and a coach for other players gave him a deep understanding of how human psychology influences playing. The tennis court was his laboratory. As is evident throughout The Inner Game of Tennis, he studied himself, his students, and opponents with care. He experimented and tested out theories until he uncovered the best teaching techniques.

When we’re learning something new, we often talk to ourselves internally. We give ourselves instructions. When Gallwey noticed this in his students, he wondered who was talking to whom. From his observations, he drew his key insight: the idea of Self 1 and Self 2.

Self 1 is the conscious self. Self 2 is the subconscious. The two are always in dialogue.

If both selves can communicate in harmony, the game will go well. More often, this isn’t what happens. Self 1 gets judgmental and critical, trying to instruct Self 2 in what to do. The trick is to quiet Self 1 and let Self 2 follow the natural learning process we are all born with, the same process that enables us to learn as small children. This capacity is within us—we just need to avoid impeding it. As Gallwey explains:

Now we are ready for the first major postulate of the Inner Game: within each player the kind of relationship that exists between Self 1 and Self 2 is the prime factor in determining one’s ability to translate his knowledge of technique into effective action. In other words, the key to better tennis—or better anything—lies in improving the relationship between the conscious teller, Self 1, and the natural capabilities of Self 2.

Self 1 tries to instruct Self 2 using words. But Self 2 responds best to images and to internalizing the physical experience of carrying out the desired action.

In short, if we rely too heavily on instructions and lose touch with our ability to feel our actions, we seriously compromise our natural learning process and our potential to perform.

Stop Trying So Hard

Gallwey writes that “great music and art are said to arise from the quiet depths of the unconscious, and true expressions of love are said to come from a source which lies beneath words and thoughts. So it is with the greatest efforts in sports; they come when the mind is as still as a glass lake.”

What’s the most common piece of advice you’re likely to receive for getting better at something? Try harder. Work harder. Put more effort in. Pay more attention to what you’re doing. Do more.

Yet what do we experience when we are performing at our best? The exact opposite. Everything becomes effortless. We act without thinking or even giving ourselves time to think. We stop judging our actions as good or bad and observe them as they are. Colloquially, we call this being in the zone. In psychology, it’s known as “flow” or a “peak experience.”

Compare this to the typical tennis lesson. As Gallwey describes it, the teacher wants the student to feel that the cost of the lesson was worthwhile. So they give detailed, continuous feedback. Every time they spot the slightest flaw, they highlight it. The result is that the student does indeed feel the lesson fee is justifiable. They’re now aware of dozens of errors they need to fix—so they book more classes.

In his early days as a tennis coach, Gallwey took this approach. Over time, he saw that when he stepped back and gave his students less feedback, not more, they improved faster. Players would correct obvious mistakes without any guidance. On some deeper level, they knew the correct way to play tennis. They just needed to overcome the habits of mind getting in the way. What impeded them was not a lack of information. Gallwey writes:

I was beginning to learn what all good pros and students of tennis must learn: that images are better than words, showing better than telling, too much instruction worse than none, and that trying too hard often produces negative results.

There are numerous instances outside of sports where trying too hard backfires. Consider a manager who feels the need to micromanage their employees and direct every detail of their work, allowing no autonomy or flexibility. As a result, the employees lose interest in taking initiative or directing their own work. Instead of getting the perfect work they want, the manager receives lackluster efforts.

Or consider a parent who wants their child to do well at school, so they control their studying schedule, limit their non-academic activities, and offer enticing rewards for good grades. It may work in the short term, but in the long run, the child doesn’t learn to motivate themselves or develop an intrinsic desire to study. Once their parent is no longer breathing down their neck, they don’t know how to learn.

Positive Thinking Backfires

Not only are we often advised to try harder to improve our skills, we’re also encouraged to think positively. According to Gallwey, when it comes to winning the Inner Game, this is the wrong approach altogether.

To quiet Self 1, we need to stop attaching judgments to our performance, either positive or negative. Thinking of, say, a tennis serve as “good” or “bad” shuts down Self 2’s intuitive sense of what to do. Gallwey noticed that “judgment results in tightness and tightness interferes with the fluidity required for accurate and quick movement. Relaxation produces smooth strokes and results from accepting your strokes as they are, even if erratic.”

In order to let Self 2’s sense of the correct action take over, we need to learn to see our actions as they are. We must focus on what is happening, not what is right or wrong. Once we can see clearly, we can tap into our inbuilt learning process, as Gallwey explains:

But to see things as they are, we must take off our judgmental glasses, whether they’re dark or rose-tinted. This action unlocks a process of natural development, which is as surprising as it is beautiful. . . . The first step is to see your strokes as they are. They must be perceived clearly. This can be done only when personal judgment is absent. As soon as a stroke is seen clearly and accepted as it is, a natural and speedy process of change begins.

It’s hard to let go of judgments when we can’t or won’t trust ourselves. Gallwey noticed early on that negative assessments—telling his students what they had done wrong—didn’t seem to help them. He tried only making positive assessments—telling them what they were doing well. Eventually, Gallwey recognized that attaching any sort of judgment to how his students played tennis was detrimental.

Positive and negative evaluations are two sides of the same coin. To say something is good is to imply that its opposite is bad. When Self 1 hears praise, Self 2 picks up on the underlying criticism.

Clearly, positive and negative evaluations are relative to each other. It is impossible to judge one event as positive without seeing other events as not positive or negative. There is no way to stop just the negative side of the judgmental process.

The trick may be to get out of the binary of good and bad entirely by doing more showing and by asking questions like “Why did the ball go that way?” or “What are you doing differently now than you did last time?” Sometimes, getting people to describe what they observe in their own performance removes the judgments and shifts the focus to developmental possibilities. When we have the right image in mind, we move toward it naturally. Value judgments get in the way of that process.

The Inner Game Way of Learning

We’re all constantly learning and picking up new skills. But few of us pay much attention to how we learn and whether we’re doing it in the best possible way. Often, what we think of as “learning” primarily involves berating ourselves for our failures and mistakes, arguing with ourselves, and not using the most effective techniques. In short, we try to brute-force ourselves into acquiring a capability. Gallwey describes the standard way of learning as follows:

Step 1: Criticize or judge past behavior.

Step 2: Tell yourself to change, instructing with word commands repeatedly.

Step 3: Try hard; make yourself do it right.

Step 4: Critical judgment about results leading to Self 1 vicious cycle.

The standard way of learning is far from being the fastest or most enjoyable. It’s slow, it makes us feel awful about ourselves, and it interferes with our natural learning process. Instead, Gallwey advocates following the Inner Game way of learning.

First, we must observe our existing behavior without attaching any judgment to it. We must see what is, not what we think it should be. Once we are aware of what we are doing, we can move on to the next step: picturing the desired outcome. Gallwey advocates images over outright commands because he believes visualizing actions is the best way to engage Self 2’s natural learning capabilities. The next step is to trust Self 2 and “let it happen!” Once we have the right image in mind, Self 2 can take over—provided we do not interfere by trying too hard to force our actions. The final step is to continue “nonjudgmental, calm observation of the results” in order to repeat the cycle and keep learning. It takes nonjudgmental observation to unlearn bad habits.

Conclusion

Towards the end of the book, Gallwey writes:

Clearly, almost every human activity involves both the outer and inner games. There are always external obstacles between us and our external goals, whether we are seeking wealth, education, reputation, friendship, peace on earth or simply something to eat for dinner. And the inner obstacles are always there; the very mind we use in obtaining our external goals is easily distracted by its tendency to worry, regret, or generally muddle the situation, thereby causing needless difficulties within.

Whatever we’re trying to achieve, it would serve us well to pay more attention to the internal, not just the external. If we can overcome the instinct to get in our own way and be more comfortable trusting in our innate abilities, the results may well be surprising.