
How Filter Bubbles Distort Reality: Everything You Need to Know

We find ourselves in a filter bubble any time we’re only surrounded by views and opinions we agree with, while being sheltered from opposing perspectives. Filter bubbles distort our understanding of the world and hamper our ability to make balanced decisions. Here’s how to pop the bubble.

***

The Basics

Read the headline, tap, scroll, tap, tap, scroll.

It is a typical day and you are browsing your usual news site. The New Yorker, BuzzFeed, The New York Times, BBC, The Globe and Mail, take your pick. As you skim through articles, you share the best ones with like-minded friends and followers. Perhaps you add a comment.

Few of us sit down and decide to inform ourselves on a particular topic. For the most part, we pick up our smartphones or open a new tab, scroll through a favored site and click on whatever looks interesting. Or we look at Facebook or Twitter feeds to see what people are sharing. Chances are high that we are not doing this intending to become educated on a certain topic. No, we are probably waiting in line, reading on the bus or at the gym, procrastinating, or grappling with insomnia, looking for some form of entertainment.

We all do this skimming and sharing and clicking, and it seems so innocent. But many of us are uninformed about or uninterested in the forces affecting what we see online and how content affects us in return — and that ignorance has consequences.

The term “filter bubble” refers to the results of the algorithms that dictate what we encounter online. According to Eli Pariser, those algorithms create “a unique universe of information for each of us … which fundamentally alters the way we encounter ideas and information.”

Many sites offer personalized content selections, based on our browsing history, age, gender, location, and other data. The result is a flood of articles and posts that support our current opinions and perspectives to ensure that we enjoy what we see. Even when a site is not offering specifically targeted content, we all tend to follow people whose views align with ours. When those people share a piece of content, we can be sure it will be something we are also interested in.

That might not sound so bad, but filter bubbles create echo chambers. We assume that everyone thinks like us, and we forget that other perspectives exist.

Filter bubbles transcend web surfing. In important ways, your social circle is a filter bubble; so is your neighborhood. If you’re living in a gated community, for example, you might think that reality is only BMWs, Teslas, and Mercedes. Your work circle acts as a filter bubble, too, depending on whom you know and at what level you operate.

One of the great problems with filters is our human tendency to think that what we see is all there is, without realizing that what we see is being filtered.

Eli Pariser on Filter Bubbles

The concept of the filter bubble was first articulated by Eli Pariser, the activist, author, and co-founder of Upworthy. In his book The Filter Bubble, Pariser explains how Google searches bring up vastly different results depending on the user’s history. He cites an example in which two people searched for “BP” (British Petroleum). One user saw news related to investing in the company; the other received information about a recent oil spill.

Pariser describes how the internet tends to give us what we want:

Your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.

Pariser terms this reflection a filter bubble, a “personal ecosystem of information.” It insulates us from any sort of cognitive dissonance by limiting what we see. At the same time, virtually everything we do online is being monitored — for someone else’s benefit.

Each time we click, watch, share, or comment, search engines and social platforms harvest information, much of which is used to generate targeted advertisements. Most of us have experienced the odd sensation of déjà vu when a product we glanced at once begins to follow us around the web, and into our email inboxes, often until we succumb and buy it.

Targeted advertisements can help us find what we need with ease, but they come at a cost:

Personalization is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life — much of which you might not trust your friends with.

The internet has changed a great deal from the early days, when people worried about strangers finding out who they were. Anonymity was once king. Now, our privacy has been sacrificed for the sake of advertising revenue:

What was once an anonymous medium where anyone could be anyone—where, in the words of the famous New Yorker cartoon, nobody knows you’re a dog—is now a tool for soliciting and analyzing our personal data. According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like “depression” on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open—even for an instant—a page listing signs that your spouse may be cheating and prepare to be haunted with DNA paternity-test ads. The new Internet doesn’t just know you’re a dog; it knows your breed and wants to sell you a bowl of premium kibble.

The sources of this information can be unexpected. Companies gather it from places we might not even consider:

When you read books on your Kindle, the data about which phrases you highlight, which pages you turn, and whether you read straight through or skip around are all fed back into Amazon’s servers and can be used to indicate what books you might like next. When you log in after a day reading Kindle e-books at the beach, Amazon can subtly customize its site to appeal to what you’ve read: If you’ve spent a lot of time with the latest James Patterson, but only glanced at that new diet guide, you might see more commercial thrillers and fewer health books.

One thing is certain: the personalization process is neither crude nor random. It follows a defined model that is refined every day, both across whole user populations and for each individual:

Most personalized filters are based on a three-step model. First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media. There’s just one flaw in this logic: Media also shape identity. And as a result, these services may end up creating a good fit between you and your media by changing … you.
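
To make this three-step loop concrete, here is a minimal sketch in Python. Nothing in it comes from Pariser’s book: the item catalogue, the topic tags, and the scoring function are all invented for illustration, and real systems use far richer signals. The shape of the loop, though, is the one he describes: build a profile, serve the best fit, then tune on every new click.

```python
from collections import Counter

# Hypothetical catalogue: each item is tagged with the topics it covers.
CATALOGUE = {
    "Markets rally as BP posts record profits": {"investing", "energy"},
    "BP oil spill: cleanup continues": {"environment", "energy"},
    "Five stretches for lower-back pain": {"health"},
    "New thriller tops the bestseller list": {"books"},
}

def build_profile(click_history):
    """Step 1: infer who the user is and what they like from past clicks."""
    profile = Counter()
    for item in click_history:
        profile.update(CATALOGUE[item])
    return profile

def rank(profile):
    """Step 2: serve the content that best fits the inferred profile."""
    score = lambda item: sum(profile[tag] for tag in CATALOGUE[item])
    return sorted(CATALOGUE, key=score, reverse=True)

# Step 3: tune. Every new click feeds back into the profile, so the
# ranking drifts further toward what the user has already clicked.
clicks = ["Markets rally as BP posts record profits"]
for _ in range(3):
    profile = build_profile(clicks)
    top_story = rank(profile)[0]
    print(top_story)
    clicks.append(top_story)  # the user clicks the top result again
```

Run as written, the loop prints the same investing-slanted headline three times: because past clicks are the only signal, the “tuning” step narrows rather than broadens what the user sees, which is exactly the identity-shaping feedback Pariser warns about.
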

In The Shallows, Nicholas Carr also covers online information collection. Carr notes that the more time we spend online, the richer the information we provide:

The faster we surf across the surface of the Web—the more links we click and pages we view—the more opportunities Google gains to collect information about us and to feed us advertisements. Its advertising system, moreover, is explicitly designed to figure out which messages are most likely to grab our attention and then to place those messages in our field of view. Every click we make on the Web marks a break in our concentration, a bottom-up disruption of our attention—and it’s in Google’s economic interest to make sure we click as often as possible.

Anyone who has spent time on the web knows how addictive the flow of stimulating information can be. No matter how disciplined we otherwise are, we find it hard to resist clicking related articles or scrolling through newsfeeds. There is a reason for this, as Pariser writes:

Personalized filters play to the most compulsive parts of you, creating “compulsive media” to get you to click things more.

In an attention economy, filter bubbles assist search engines, websites, and platforms in their goal to command the maximum possible share of our online time.

The Impact of Filter Bubbles

Each new technology brings with it a host of costs and benefits, many of which are realized only as time passes. The invention of books led people to worry that memory and oral tradition would erode. Paper caused panic as young people switched from slates to the newfangled medium. Typewriters prompted debates about morality as female typists entered the workforce and “distracted” men. The internet has been no exception; if anything, the issues it presents are unique only in their scale and complexity.

In particular, the existence of filter bubbles has led to widespread concern. Pariser writes:

Democracy requires citizens to see things from one another’s point of view, but instead we’re more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we’re being offered parallel but separate universes.

… Personalization filters serve a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.

Pariser quotes Jon Chait as saying:

Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated.

Many people have debated the impact of filter bubbles on the 2016 US presidential election and the Brexit vote. In both cases, large numbers of people were shocked by the outcome; even those within the political and journalistic worlds expected the opposite result.

“We become, neurologically, what we think.”

— Nicholas Carr

In the case of the Brexit vote, a large percentage of those who voted to leave the European Union were older people who are less active online, so their views were less visible. Those who voted to remain tended to be younger and more active online, which placed them in an echo chamber of similar attitudes.

Democracy requires everyone to be equally informed, yet filter bubbles are distorting our picture of the world. In a paper describing what they call the search engine manipulation effect, researchers Robert Epstein and Ronald E. Robertson revealed the extent of this influence on voting:

The results of these experiments demonstrate that (i) biased search rankings can shift the voting preferences of undecided voters by 20% or more, (ii) the shift can be much higher in some demographic groups, and (iii) search ranking bias can be masked so that people show no awareness of the manipulation. We call this type of influence, which might be applicable to a variety of attitudes and beliefs, the search engine manipulation effect. Given that many elections are won by small margins, our results suggest that a search engine company has the power to influence the results of a substantial number of elections with impunity. The impact of such manipulations would be especially large in countries dominated by a single search engine company.

Filter bubbles do not just occur on the internet. The same researchers cite an example of television shifting the results of an election a decade earlier:

It is already well established that biased media sources such as newspapers, political polls, and television sway voters. A 2007 study by DellaVigna and Kaplan found, for example, that whenever the conservative-leaning Fox television network moved into a new market in the United States, conservative votes increased, a phenomenon they labeled the Fox News Effect. These researchers estimated that biased coverage by Fox News was sufficient to shift 10,757 votes in Florida during the 2000 US Presidential election: more than enough to flip the deciding state in the election, which was carried by the Republican presidential candidate by only 537 votes. The Fox News Effect was also found to be smaller in television markets that were more competitive.

However, they argue that the internet has a more dramatic effect than other forms of media:

Search rankings are controlled in most countries today by a single company. If, with or without intervention by company employees, the algorithm that ranked election-related information favored one candidate over another, competing candidates would have no way of compensating for the bias. It would be as if Fox News were the only television network in the country. Biased search rankings would, in effect, be an entirely new type of social influence, and it would be occurring on an unprecedented scale. Massive experiments conducted recently by social media giant Facebook have already introduced other unprecedented types of influence made possible by the Internet. Notably, an experiment reported recently suggested that flashing “VOTE” advertisements to 61 million Facebook users caused more than 340,000 people to vote that day who otherwise would not have done so.

In both the US election and the Brexit vote, filter bubbles insulated people from alternative views. Some critics have theorized that widespread derision of Trump and Leave voters led them to be less vocal, keeping their opinions within smaller communities to avoid confrontation. Those who supported Clinton or Remain expressed themselves loudly within filtered communities. Everyone, it seemed, agreed with everyone else. Except they didn’t, and no one noticed until it was too late.

A further issue with filter bubbles is that they are something we can only opt out of, not something we consent to. As of March 2017, an estimated 1.94 billion people had a Facebook account, of whom 1.28 billion logged on every day. It is safe to assume that only a small percentage are informed about the algorithms shaping their feeds. Considering that roughly 40% of people regard Facebook as their main news source, this is worrying. As with cognitive biases, a lack of awareness amplifies the impact of filter bubbles.

We have minimal concrete evidence of exactly what information search engines and social platforms collect. Even SEO (search engine optimization) experts do not know for certain how search rankings are determined. Nor do we know whether sites collect information about users who do not have accounts.

Scandals are becoming increasingly common as sites and services are found to be harvesting details without consent. Evernote came under fire when a privacy policy change revealed that employees could access users’ notes, and Unroll.me drew criticism for selling data about its users’ email habits. Even when this information is listed in user agreements or disclaimers, the confusing jargon can make it difficult for users to ascertain how their data are being used, by whom, and why.

In his farewell speech, President Obama aired his personal concerns:

[We] retreat into our own bubbles, … especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions. … And increasingly, we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there.

Filter bubbles give cognitive biases and mental shortcuts room to operate, amplifying their negative impact on our ability to think logically and critically. Social proof, availability bias, confirmation bias, and bias from liking/disliking all come into play. As Pariser writes:

The filter bubble tends to dramatically amplify confirmation bias—in a way, it’s designed to. Consuming information that conforms to our ideas of the world is easy and pleasurable; consuming information that challenges us to think in new ways or question our assumptions is frustrating and difficult. This is why partisans of one political stripe tend not to consume the media of another. As a result, an information environment built on click signals will favor content that supports our existing notions about the world over content that challenges them.

Pariser sums up the result of extensive filtration: “A world constructed from the familiar is the world in which there’s nothing to learn.”
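
Pariser’s point about click signals can be made concrete with a toy simulation. Nothing below comes from his book; the two story categories, the click probabilities, and the small exploration rate are invented for illustration. The feed simply keeps showing whichever kind of story earns the higher click-through rate, and a slight preference for agreeable content is enough to tilt the whole stream.

```python
import random

random.seed(7)

# Toy assumption (not from the article): the reader is somewhat more likely
# to click stories that confirm their views than stories that challenge them.
CLICK_PROB = {"confirming": 0.6, "challenging": 0.4}

shown = {"confirming": 0, "challenging": 0}    # impressions per story type
clicked = {"confirming": 0, "challenging": 0}  # clicks per story type

def ctr(kind):
    """Observed click-through rate for one kind of story."""
    return clicked[kind] / shown[kind] if shown[kind] else 1.0

def pick_story():
    """Mostly show whatever earns the most clicks; occasionally explore."""
    if random.random() < 0.1:
        return random.choice(list(CLICK_PROB))
    return max(CLICK_PROB, key=ctr)

for _ in range(10_000):
    kind = pick_story()
    shown[kind] += 1
    if random.random() < CLICK_PROB[kind]:
        clicked[kind] += 1

print(shown)  # the feed ends up dominated by 'confirming' stories
```

Even a modest gap in click probability (0.6 versus 0.4) leaves challenging stories with only a sliver of impressions. That is the amplification Pariser describes: the filter is not arguing with us, it is simply optimizing for our clicks.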

Filter Bubbles and Group Psychology

We have an inherent desire to be around those who are like us and reinforce our worldview. Our online behavior is no different. People form tribes based on interests, location, employment, affiliation, and other details. These groups (subreddits, Tumblr fandoms, Facebook groups, Google+ circles, and so on) have their own rules, conventions, in-jokes, and even vocabulary. Within groups, even if members never meet each other, beliefs intensify. Anyone who disagrees may be ousted from the community. Sociologists call this behaviour “communal reinforcement” and stress that the ideas perpetuated may bear no relation to reality or empirical evidence.

“When you’re asked to fight a war that’s over nothing, it’s best to join the side that’s going to win.”

— Conor Oberst

Communal reinforcement can be positive. Groups geared towards people with mental health problems, chronic illnesses, addictions, and other issues are often supportive and assist many people who might not have another outlet.

However, when a group is encased within a filter bubble, it can lead to groupthink. This is a psychological phenomenon wherein groups of people experience a temporary loss of the ability to think in a rational, moral and realistic manner. When the members of a group are all exposed to the same confirmatory information, the results can be extreme. Symptoms include being excessively optimistic, taking risks, ignoring legal and social conventions, regarding those outside the group as enemies, censoring opposing ideas, and pressuring members to conform. As occurred with the US election and the Brexit vote, those experiencing groupthink within a filter bubble see themselves as in the right and struggle to consider alternative perspectives.

For example, imagine a Facebook group for Trump supporters in the months before the election. Members share pro-Trump news items, discuss policies, and circulate information that reinforces their views. Groupthink sets in as members selectively process information, fail to evaluate alternative viewpoints, ignore risks, turn on anyone who disagrees, and discount the possibility of a negative outcome. From the outside, the combination of filter bubble and groupthink is easy to see; from the inside, it is almost invisible.

How Can We Avoid Filter Bubbles?

Thankfully, it is not difficult to pop the filter bubble if we make an effort to do so. Methods for doing this include:

  • Using ad-blocking browser extensions. These remove the majority of advertisements from the websites we visit. The downside is that most sites rely on advertising revenue to support their work, and some (such as Forbes and Business Insider) require users to disable ad blockers before viewing a page.
  • Reading news sites and blogs that aim to provide a wide range of perspectives. Pariser’s own site, Upworthy, aims to do this. Others, including The Wall Street Journal, The New Yorker, the BBC, and the Associated Press, claim to offer a balanced view of the world. Whatever sources we frequent, a brief analysis of the front page will give a good idea of any biases. In the wake of the US election, a number of newsletters, sites, apps, and podcasts have set out to pop the filter bubble. An excellent example is Colin Wright’s podcast, Let’s Know Things (http://letsknowthings.com/), which examines a news story in context each week.
  • Switching our focus from entertainment to education. As Nicholas Carr writes in The Shallows: “The Net’s interactivity gives us powerful new tools for finding information, expressing ourselves, and conversing with others. It also turns us into lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.”
  • Using private (incognito) browsing, deleting our search histories, and doing what we need to do online without logging in to our accounts.
  • Deleting or blocking browser cookies. For the uninitiated: many websites plant “cookies” (small text files) each time we visit them, and those cookies are then used to determine what content to show us. Cookies can be deleted manually, and browser extensions are available that remove them. In some instances cookies are useful, so removal should be done with discretion. The sketch after this list shows what a typical tracking cookie contains.
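
For readers who have never looked inside one, here is a minimal sketch using Python’s standard http.cookies module. The cookie name, domain, and lifetime below are invented for illustration; a real page may set dozens of these, as the Wall Street Journal study quoted earlier found.

```python
from http.cookies import SimpleCookie

# A hypothetical Set-Cookie header of the kind an ad network's tracking
# script might send back with a page. All values are invented.
raw_header = "uid=7f3a9c21; Max-Age=31536000; Domain=.example-ads.net; Path=/"

cookie = SimpleCookie()
cookie.load(raw_header)

for name, morsel in cookie.items():
    days = int(morsel["max-age"]) // 86400  # lifetime in days
    print(f"{name}={morsel.value} set for {morsel['domain']}, expires in ~{days} days")
```

Clearing a cookie like this one, or blocking it from being set in the first place, deprives the tracker of its identifier, which is why it helps deflate the bubble.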

Fish don’t know they’re in water, and we don’t know we’re in a filter bubble unless we make the effort to (as David Bowie put it) leave the capsule, if we dare.

In shaping what we see, filter bubbles show us a distorted map rather than the terrain, and trick our brains into believing that the map is reality. As technology improves and a publisher like The New York Times gains the ability to show the same story to 100 different people in 100 different ways, the bubble deepens. We lose track of what is filtered and what is not as the news becomes tailored to cement our existing opinions. After all, everyone wants to read a newspaper that agrees with them.

Systems, whether people, cultures, or web platforms, naturally have to filter information, and in doing so they reduce our options. Sometimes people make the choices, sometimes cultures do, and increasingly algorithms do. As the speed of information flowing through these systems increases, filters will play an even more important role.

Understanding that what we see is not all there is will help us recognize that we are living in a filtered, distorted version of the world, and remind us to look beyond it.

For more information on filter bubbles, consider reading The Filter Bubble by Eli Pariser, So You’ve Been Publicly Shamed by Jon Ronson, The Shallows by Nicholas Carr, or The Net Delusion by Evgeny Morozov.
