The Illusory Truth Effect: Why We Believe Fake News, Conspiracy Theories and Propaganda

When a “fact” tastes good and is repeated enough, we tend to believe it, no matter how false it may be. Understanding the illusory truth effect can keep us from being bamboozled.

***

A recent Verge article looked at some of the unsavory aspects of the work of Facebook content moderators—the people who spend their days cleaning up the social network’s most toxic content. One strange detail stands out. The moderators The Verge spoke to reported that they and their coworkers frequently found themselves believing fringe, often hatemongering conspiracy theories they would have dismissed under normal circumstances. Others described experiencing paranoid thoughts and intense fears for their safety.

An overnight switch from skepticism to fervent belief in conspiracy theories is not unique to content moderators. In a Nieman Lab article, Laura Hazard Owen explains that researchers who study the spread of disinformation online can find themselves struggling to stay sure of their own beliefs and needing to make an active effort to counteract what they see. Some of the most fervent, passionate conspiracy theorists admit that they first fell into the rabbit hole when they tried to debunk the beliefs they now hold. There’s an explanation for why this happens: the illusory truth effect.

The illusory truth effect

Facts do not cease to exist because they are ignored.

Aldous Huxley

Not everything we believe is true. We may act like it is and it may be uncomfortable to think otherwise, but it’s inevitable that we all hold a substantial number of beliefs that aren’t objectively true. It’s not about opinions or different perspectives. We can pick up false beliefs for the simple reason that we’ve heard them a lot.

If I say that the moon is made of cheese, no one reading this is going to believe that, no matter how many times I repeat it. That statement is too ludicrous. But what about something a little more plausible? What if I said that moon rock has the same density as cheddar cheese? And what if I wasn’t the only one saying it? What if you’d also seen a tweet touting this amazing factoid, perhaps also heard it from a friend at some point, and read it in a blog post?

Unless you’re a geologist, a lunar fanatic, or otherwise in possession of an unusually good radar for moon rock-related misinformation, there is a not insignificant chance you would end up believing a made-up fact like that without thinking to verify it. You might repeat it to others or share it online. This is how the illusory truth effect works: we all have a tendency to believe something is true after being exposed to it multiple times. The more times we’ve heard something, the truer it seems. The effect is so powerful that repetition can persuade us to believe information we knew was false in the first place. Ever thought a product was stupid, but somehow ended up buying it on a regular basis? Or thought the new manager was okay, yet now find yourself joining in the gossip about her?

The illusory truth effect is the reason why advertising works and why propaganda is one of the most powerful tools for controlling how people think. It’s why politicians’ speeches can be bizarre, and why multiple-choice tests can cause students problems later on. It’s why fake news spreads and retractions of misinformation don’t work. In this post, we’re going to look at how the illusory truth effect works, how it shapes our perception of the world, and how we can avoid it.

The discovery of the illusory truth effect

Rather than love, than money, than fame, give me truth.

Henry David Thoreau

The illusory truth effect was first described in a 1977 paper entitled “Frequency and the Conference of Referential Validity,” by Lynn Hasher and David Goldstein of Temple University and Thomas Toppino of Villanova University. In the study, the researchers presented a group of students with 60 statements and asked them to rate how certain they were that each was either true or false. The statements came from a range of subjects and were all intended to be not too obscure, but unlikely to be familiar to study participants. Each statement was objective—it could be verified as either correct or incorrect and was not a matter of opinion. For example, “the largest museum in the world is the Louvre in Paris” was true.

Students rated their certainty three times, with two weeks between evaluations. Some of the statements were repeated each time, while others were not. With each repetition, students grew more certain about the statements they labelled as true. It seemed they were using familiarity as a gauge of how confident to be in their beliefs.

An important detail is that the researchers did not repeat the first and last 10 items on each list. They felt students would be most likely to remember these and be able to research them before the next round of the study. While the study was not conclusive evidence of the existence of the illusory truth effect, subsequent research has confirmed its findings.

Why the illusory truth effect happens

The sad truth is the truth is sad.

Lemony Snicket

Why does repetition of a fact make us more likely to believe it, and to be more certain of that belief? As with other cognitive shortcuts, the typical explanation is that it’s a way our brains save energy. Thinking is hard work—remember that the human brain uses up about 20% of an individual’s energy, despite accounting for just 2% of their body weight.

The illusory truth effect comes down to processing fluency. When a thought is easier to process, it requires our brains to use less energy, which leads us to prefer it. The students in Hasher’s original study recognized the repeated statements, even if not consciously. That means that processing them was easier for their brains.

Processing fluency seems to have a wide impact on our perception of truthfulness. Rolf Reber and Norbert Schwarz, in their article “Effects of Perceptual Fluency on Judgments of Truth,” found that statements presented in an easy-to-read color are judged as more likely to be true than ones presented in a less legible way. In their article “Birds of a Feather Flock Conjointly (?): Rhyme as Reason in Aphorisms,” Matthew S. McGlone and Jessica Tofighbakhsh found that aphorisms that rhyme (like “what sobriety conceals, alcohol reveals”), even if someone hasn’t heard them before, seem more accurate than non-rhyming versions. Once again, they’re easier to process.

Fake news

One of the saddest lessons of history is this: If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken.

Carl Sagan

The illusory truth effect is one factor in why fabricated news stories sometimes gain traction and have a wide impact. When this happens, our knee-jerk reaction can be to assume that anyone who believes fake news must be unusually gullible or outright stupid. Evan Davis writes in Post Truth, “Never before has there been a stronger sense that fellow citizens have been duped and that we are all suffering the consequences of their intellectual vulnerability.” As Davis goes on to write, this assumption isn’t helpful for anyone. We can’t begin to understand why people believe seemingly ludicrous news stories until we consider some of the psychological reasons why this might happen.

Fake news falls under the umbrella of “information pollution,” which also includes news items that misrepresent information, take it out of context, parody it, fail to check facts or do background research, or take claims from unreliable sources at face value. Some of this news gets published on otherwise credible, well-respected news sites due to simple oversight. Some goes on parody sites that never purport to tell the truth, yet are occasionally mistaken for serious reporting. Some shows up on sites that replicate the look and feel of credible sources, using similar web design and web addresses. And some fake news comes from sites dedicated entirely to spreading misinformation, without any pretense of being anything else.

A lot of information pollution falls somewhere in between the extremes that tend to get the most attention. It’s the result of people being overworked or in a hurry and unable to do the due diligence that reliable journalism requires. It’s what happens when we hastily tweet something or mention it in a blog post and don’t realize it’s not quite true. It extends to miscited quotes, doctored photographs, fiction books masquerading as memoirs, or misleading statistics.

The signal-to-noise ratio is so skewed that we have a hard time figuring out what to pay attention to and what to ignore. No one has time to verify everything they read online. No one. (And no, offline media certainly isn’t perfect either.) Our information-processing capabilities are not infinite, and the more we consume, the harder it becomes to assess the value of any of it.

Moreover, we’re often far outside our circle of competence, reading about topics in which we lack the expertise to assess accuracy in any meaningful way. This drip-drip of information pollution is not harmless. Like air pollution, it builds up over time, and the more we’re exposed to it, the more likely we are to pick up false beliefs that are then hard to shift. For instance, many people believe that crime, especially violent crime, rises year after year—in a 2016 Pew Research study, 57% of Americans believed crime had worsened since 2008, despite violent crime actually having fallen by nearly a fifth during that time. This false belief may stem from the fact that violent crime receives a disproportionate amount of media coverage, giving it wide and repeated exposure.

When people are asked to rate the apparent truthfulness of news stories, they rate ones they have read multiple times as more truthful than those they haven’t. Danielle C. Polage, in her article “Making Up History: False Memories of Fake News Stories,” explains that a false story someone has been exposed to more than once can seem more credible than a true one they’re seeing for the first time. In experimental settings, people also misattribute their previous exposure to stories, believing they read a news item from another source when they actually saw it in an earlier part of the study. Even when people know the story is part of the experiment, they sometimes think they’ve also read it elsewhere. The repetition is all that matters.

Given enough exposure to contradictory information, there is almost no knowledge that we won’t question.

Propaganda

If a lie is only printed often enough, it becomes a quasi-truth, and if such a truth is repeated often enough, it becomes an article of belief, a dogma, and men will die for it.

Isa Blagden

Propaganda and fake news are similar. By relying on repetition, disseminators of propaganda can change the beliefs and values of people.

Propaganda has a lot in common with advertising, except instead of selling a product or service, it’s about convincing people of the validity of a particular cause. Propaganda isn’t necessarily malicious; sometimes the cause is improved public health or boosting patriotism to encourage military enrollment. But often propaganda is used to undermine political processes to further narrow, radical, and aggressive agendas.

During World War II, the graphic designer Abram Games served as the official war artist for the British government. Games’s work is iconic and era-defining for its punchy, brightly colored visual style. His army recruitment posters would often feature a single figure rendered in a proud, strong, admirable pose with only a few words of text. They conveyed to anyone who saw them the sorts of positive qualities they would supposedly gain through military service. Whether this was true or not was another matter. Through repeated exposure to the posters, Games instilled the image the army wanted to create in the minds of viewers, affecting their beliefs and behaviors.

Today, propaganda is more likely to be a matter of quantity over quality. It’s not about a few artistic posters. It’s about saturating the intellectual landscape with content that supports a group’s agenda. With so many demands on our attention, old techniques are too weak.

Researchers Christopher Paul and Miriam Matthews at the RAND Corporation refer to the method of bombarding people with fabricated information as the “firehose of falsehood” propaganda model. While their report focuses on modern Russian propaganda, the techniques it describes are not confined to Russia. These techniques make use of the illusory truth effect, alongside other cognitive shortcuts. Firehose propaganda has four distinct features:

  • High-volume and multi-channel
  • Rapid, continuous, and repetitive
  • Makes no commitment to objective reality
  • Makes no commitment to consistency

Firehose propaganda is predicated on exposing people to the same messages as frequently as possible. It involves a large volume of content, repeated again and again across numerous channels: news sites, videos, radio, social media, television and so on. These days, as the report describes, this can also include internet users who are paid to repeatedly post in forums, chat rooms, comment sections and on social media disputing legitimate information and spreading misinformation. It is the sheer volume that succeeds in obliterating the truth. Research into the illusory truth effect suggests that we are further persuaded by information heard from multiple sources, hence the efficacy of funneling propaganda through a range of channels.

Seeing as repetition leads to belief in many cases, firehose propaganda doesn’t need to pay attention to the truth or even to be consistent. A source doesn’t need to be credible for us to end up believing its messages. Fact-checking is of little help because it further adds to the repetition, yet we feel compelled to respond to obviously untrue propagandistic material rather than ignore it.

Firehose propaganda does more than spread fake news. It nudges us towards feelings like paranoia, mistrust, suspicion, and contempt for expertise. All of this makes future propaganda more effective. Unlike those espousing the truth, propagandists can move fast because they’re making up some or all of what they claim, meaning they gain a foothold in our minds first. First impressions are powerful. Familiarity breeds trust.

How to combat the illusory truth effect

So how can we protect ourselves from believing false news and being manipulated by propaganda due to the illusory truth effect? The best route is to be far more selective. The information we consume is like the food we eat. If it’s junk, our thinking will reflect that.

We don’t need to spend as much time reading the news as most of us do. As with many other things in life, more can be less. The vast majority of the news we read is just information pollution. It doesn’t do us any good.

One of the best solutions is to quit the news. This frees up time and energy to engage with timeless wisdom that will improve your life. Try it for a couple of weeks. And if you aren’t convinced, read a few days’ worth of newspapers from 1978. You’ll see how much the news doesn’t really matter at all.

If you can’t quit the news habit, stick to reliable, well-known news sources that have a reputation to uphold. Steer clear of dubious sources whenever you can—even if you treat them as entertainment, you might still end up absorbing what they say. Research unfamiliar sources before trusting them. Be cautious of sites that are funded entirely by advertising (or that pay their journalists based on views), and seek to support reader-funded news sources you get value from if possible. Prioritize sites that treat their journalists well and don’t expect them to churn out dozens of thoughtless articles per day. Don’t rely on news from social media posts that lack sources or come from people speaking outside their circle of competence.

Avoid treating the news as entertainment to passively consume on the bus or while waiting in line. Be mindful about it—if you want to inform yourself on a topic, set aside designated time to learn about it from multiple trustworthy sources. Don’t assume breaking news is better, as it can take some time for the full details of a story to come out and people may be quick to fill in the gaps with misinformation. Accept that you can’t be informed about everything and most of it isn’t important. Pay attention to when news items make you feel outrage or other strong emotions, because this may be a sign of manipulation. Be aware that correcting false information can further fuel the illusory truth effect by adding to the repetition.

We can’t stop the illusory truth effect from existing. But we can recognize that it is a reality and seek to prevent ourselves from succumbing to it in the first place.

Conclusion

Our memories are imperfect. We are easily led astray by the illusory truth effect, which can direct what we believe and even change our understanding of the past. It’s not about intelligence—this happens to all of us. This effect is too powerful for us to override it simply by learning the truth. Cognitively, there is no distinction between a genuine memory and a false one. Our brains are designed to save energy and it’s crucial we accept that.

We can’t just pull back and think the illusory truth effect only applies to other people. It applies to everyone. We’re all responsible for our own beliefs. We can’t pin the blame on the media or social media algorithms or whatever else. When we put effort into thinking about and questioning the information we’re exposed to, we’re less vulnerable to the illusory truth effect. Knowing about the effect is the best way to identify when it’s distorting our worldview. Before we use information as the basis for important decisions, it’s a good idea to verify whether it’s true, or whether it’s just something we’ve heard a lot.

Truth is a precarious thing, not because it doesn’t objectively exist, but because the incentives to warp it can be so strong. It’s up to each of us to seek it out.

B.H. Liddell Hart and the Study of Truth and History

B.H. Liddell Hart (1895-1970) was many things, but above all, he was a military historian. He wrote tracts on Sherman, Scipio, Rommel, and on military strategy itself. His work influenced Neville Chamberlain and may have even (accidentally) influenced the German army’s blitzkrieg tactic in WWII.

What’s beautiful about Hart’s writing is his insight into human nature as seen through the lens of war. Hart’s experience both studying wars and participating in them — he was a British officer in World War I and present for both World War II and a large portion of the Cold War — gave him wide perspective on the ultimate human folly.

Hart summed up much of his wisdom in a short treatise called Why Don’t We Learn from History?, which he unfortunately left unfinished at his death. In the preface to the book, Hart’s son Adrian sums up his father’s approach to life:

He believed in the importance of the truth that man could, by rational process, discover the truth about himself—and about life; that this discovery was without value unless it was expressed and unless its expression resulted in action as well as education. To this end he valued accuracy and lucidity. He valued, perhaps even more, the moral courage to pursue and propagate truths which might be unpopular or detrimental to one’s own or other people’s immediate interests. He recognized that this discovery could best be fostered under certain political and social conditions—which therefore became to him of paramount importance.

Why study history at all? Hart asks us this rhetorically, early on in the book, and replies with a simple answer: Because it teaches us what not to do. How to avoid being stupid:

What is the object of history? I would answer, quite simply—“truth.” It is a word and an idea that has gone out of fashion. But the results of discounting the possibility of reaching the truth are worse than those of cherishing it. The object might be more cautiously expressed thus: to find out what happened while trying to find out why it happened. In other words, to seek the causal relations between events. History has limitations as guiding signpost, however, for although it can show us the right direction, it does not give detailed information about the road conditions.

But its negative value as a warning sign is more definite. History can show us what to avoid, even if it does not teach us what to do—by showing the most common mistakes that mankind is apt to make and to repeat. A second object lies in the practical value of history. “Fools,” said Bismarck, “say they learn by experience. I prefer to profit by other people’s experience.”

The study of history offers that opportunity in the widest possible measure. It is universal experience—infinitely longer, wider, and more varied than any individual’s experience. How often do people claim superior wisdom on the score of their age and experience. The Chinese especially regard age with veneration, and hold that a man of eighty years or more must be wiser than others. But eighty is nothing for a student of history. There is no excuse for anyone who is not illiterate if he is less than three thousand years old in mind.

[…]

History is the record of man’s steps and slips. It shows us that the steps have been slow and slight; the slips, quick and abounding. It provides us with the opportunity to profit by the stumbles and tumbles of our forerunners. Awareness of our limitations should make us chary of condemning those who made mistakes, but we condemn ourselves if we fail to recognize mistakes.

There is a too common tendency to regard history as a specialist subject— that is the primary mistake. For, on the contrary, history is the essential corrective to all specialization. Viewed aright, it is the broadest of studies, embracing every aspect of life. It lays the foundation of education by showing how mankind repeats its errors and what those errors are.

Later, Hart expounds further on the value of truth, the value of finding out what’s actually going on as opposed to what one wishes were the case. Hart agrees with the idea that one should recognize reality, especially when it makes one uncomfortable, as Darwin was so effectively able to do. If we forget or mask our mistakes, we are doomed to continue making them.

We learn from history that men have constantly echoed the remark ascribed to Pontius Pilate—“What is truth?” And often in circumstances that make us wonder why. It is repeatedly used as a smoke screen to mask a maneuver, personal or political, and to cover an evasion of the issue. It may be a justifiable question in the deepest sense. Yet the longer I watch current events, the more I have come to see how many of our troubles arise from the habit, on all sides, of suppressing or distorting what we know quite well is the truth, out of devotion to a cause, an ambition, or an institution—at bottom, this devotion being inspired by our own interest.

[…]

We learn from history that in every age and every clime the majority of people have resented what seems in retrospect to have been purely matter-of-fact comment on their institutions. We learn too that nothing has aided the persistence of falsehood, and the evils resulting from it, more than the unwillingness of good people to admit the truth when it was disturbing to their comfortable assurance. Always the tendency continues to be shocked by natural comment and to hold certain things too “sacred” to think about.

I can conceive of no finer ideal of a man’s life than to face life with clear eyes instead of stumbling through it like a blind man, an imbecile, or a drunkard—which, in a thinking sense, is the common preference. How rarely does one meet anyone whose first reaction to anything is to ask “Is it true?” Yet unless that is a man’s natural reaction it shows that truth is not uppermost in his mind, and, unless it is, true progress is unlikely.

Indeed, in the 125 short pages of the book, Hart demonstrates the above to be true, with his particular historical focus on accuracy, truth, and freedom, explaining the intertwined nature of the three. A society that squashes freedom of thought and opinion is one that typically distorts truth, and for that reason, Hart was a supporter of free democracy, with all of its problems in full force:

We learn from history that democracy has commonly put a premium on conventionality. By its nature, it prefers those who keep step with the slowest march of thought and frowns on those who may disturb the “conspiracy for mutual inefficiency.” Thereby, this system of government tends to result in the triumph of mediocrity—and entails the exclusion of first-rate ability, if this is combined with honesty. But the alternative to it, despotism, almost inevitably means the triumph of stupidity. And of the two evils, the former is the less. Hence it is better that ability should consent to its own sacrifice, and subordination to the regime of mediocrity, rather than assist in establishing a regime where, in the light of past experience, brute stupidity will be enthroned and ability may preserve its footing only at the price of dishonesty.

Hart’s clear-eyed view of the world as an examiner of human nature and the repetition of folly led him to conclude that even if authoritarianism and coercion were occasionally drivers of efficiency in the short run, thanks to the quick and determined decision-making of a dictator, in the long term they would always cause stagnation. Calling to mind Karl Popper, Hart recognizes that freedom of thought and the resulting spread of ideas is the real engine of human progress over time, and that it should never be squashed:

Only second to the futility of pursuing ends reckless of the means is that of attempting progress by compulsion. History shows how often it leads to reaction. It also shows that the surer way is to generate and diffuse the idea of progress—providing a light to guide men, not a whip to drive them. Influence on thought has been the most influential factor in history, though, being less obvious than the effects of action, it has received less attention— even from the writers of history. There is a general recognition that man’s capacity for thought has been responsible for all human progress, but not yet an adequate appreciation of the historical effect of contributions to thought in comparison with that of spectacular action. Seen with a sense of proportion, the smallest permanent enlargement of men’s thought is a greater achievement, and ambition, than the construction of something material that crumbles, the conquest of a kingdom that collapses, or the leadership of a movement that ends in a rebound.

Once the collective importance of each individual in helping or hindering progress is appreciated, the experience contained in history is seen to have a personal, not merely a political, significance. What can the individual learn from history—as a guide to living? Not what to do but what to strive for. And what to avoid in striving. The importance and intrinsic value of behaving decently. The importance of seeing clearly—not least of seeing himself clearly.

Hart’s final statement there calls to mind Richard Feynman: “The first principle is you must not fool yourself, and you are the easiest person to fool.”

Finally, Hart admits that the path of studying history and studying truth is not an easy one. Truth is frequently cloaked, and it takes work to peel away the layers. But if we are to see things clearly, and we must do so if we’re to have a peaceful world, we must persevere in the hunt:

It is strange how people assume that no training is needed in the pursuit of truth. It is stranger still that this assumption is often manifest in the very man who talks of the difficulty of determining what is true. We should recognize that for this pursuit anyone requires at least as much care and training as a boxer for a fight or a runner for a marathon. He has to learn how to detach his thinking from every desire and interest, from every sympathy and antipathy—like ridding oneself of superfluous tissue, the “tissue” of untruth which all human beings tend to accumulate for their own comfort and protection. And he must keep fit, to become fitter. In other words, he must be true to the light he has seen.

Still Interested? Check out the short book in its entirety.