
The Filter Bubble — What the Internet is Hiding From You

In case you’re interested, we have another article on The Filter Bubble here.

Just “googling it” might not be such a great idea after all

The Filter Bubble, by Eli Pariser, argues that we are increasingly trapped inside an algorithm that filters our news based on what it thinks is best for us. The "it" doing the thinking here is not a person but a piece of code.

Computers and the algorithms they run are increasingly aware of what we seem to like. They learn from what we click on and tailor results so that we get more of what we like and less of what we don't. It's the equivalent of parents feeding their kids only sugar because that's what the kids seem to want. Parents, of course, know better: they serve what their children need first, with a sprinkling of what they want. Algorithms don't. The result is that two people googling the same thing are likely to see different results.
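To make the mechanism concrete, here is a minimal sketch of click-based personalization. It is not Google's actual ranking system, just a toy model assuming each click nudges a topic's weight up, so familiar topics float to the top of future results:

```python
from collections import Counter

class NaivePersonalizer:
    """Toy illustration of click-based filtering (hypothetical, not any real engine's algorithm)."""

    def __init__(self):
        self.clicks = Counter()  # topic -> number of clicks by this user

    def record_click(self, topic):
        # Every click makes that topic a little "stickier" for this user.
        self.clicks[topic] += 1

    def rank(self, articles):
        # articles: list of (title, topic). Sort by how often the user has
        # clicked the topic before, so the feed drifts toward past behavior.
        return sorted(articles, key=lambda a: self.clicks[a[1]], reverse=True)


# Two users issue the same query but see different orderings,
# because their click histories differ.
alice, bob = NaivePersonalizer(), NaivePersonalizer()
alice.record_click("tax cuts")
bob.record_click("public spending")

results = [("Tax cut debate", "tax cuts"), ("Spending bill passes", "public spending")]
print([title for title, _ in alice.rank(results)])  # tax-cuts story first
print([title for title, _ in bob.rank(results)])    # spending story first
```

Nothing in the loop asks whether a story is important, only whether it resembles what the user clicked before; that is the filter bubble in miniature.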

The problem, Pariser argues, is that you're not making a conscious choice to have your results filtered; it happens without your knowledge or consent. That raises a whole host of issues, and Pariser is primarily concerned with the social and political implications.

When technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens.

“If we want to know what the world really looks like,” Pariser writes, “we have to understand how filters shape and skew our view of it.” It’s useful to borrow from nutrition to illustrate this point:

Our bodies are programmed to crave fat and sugar because they're rare in nature, so when they come around, we grab them. In the same way, we're biologically programmed to attend to whatever stimulates us: content that is gross, violent, or sexual, and gossip that is humiliating, embarrassing, or offensive. If we're not careful, we'll develop the psychological equivalent of obesity, consuming the content that is least beneficial for us or for society as a whole.

Consider for a moment where we are headed. If Google knows I'm a Democrat or a Republican, it could filter my news to show me only the stories I'm predisposed to agree with. Based on its guess at my education level, it could then tailor an article's words and language to maximize their impact on me. In this world, I see only things I agree with, written in language I easily comprehend, and that's a problem. Google might know that I never read anything about Republican tax cuts or Democratic spending, so it might simply filter those articles out. Reality, as I see it, becomes whatever the lens shows me.

“If you’re not paying for something, you’re not the customer; you’re the product being sold.”

— Andrew Lewis

When asked about the prospects for important but unpopular news, the MIT Media Lab's Nicholas Negroponte smiled. On one end of the spectrum, he said, is sycophantic personalization: "you're so great and wonderful, and I'm going to tell you exactly what you want to hear." On the other end is the parental approach: "I'm going to tell you this whether you want to hear it or not, because you need to know." Currently, he argues, we're headed in the sycophantic direction.

Whether or not you find the book's conclusions wholly convincing, they are worth thinking about. If nothing else, it is a thought-provoking read.

(If you want a search engine that won't track you, try DuckDuckGo.com.)
