
Forget Facebook and Google, burst your own filter bubble

Eli Pariser, co-founder of Avaaz. Kris Krüg/Flickr
It’s been half a decade since the co-founder of Avaaz, Eli Pariser, first coined the phrase “filter bubble,” but his prophetic TED Talk — and his concerns and warnings — are even more applicable now than they were then. In an era of fake news, curated content, personalized experiences, and deep ideological divisions, it’s time we all take responsibility for bursting our own filter bubbles.

When I search for something on Google, the results I see are quite different from yours, based on our individual search histories and whatever other data Google has collected over the years. We see this all the time on our Facebook timelines, as the social network uses its vats of data to offer us what it thinks we want to see and hear. This is your bubble.
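
Neither company publishes its ranking code, but the basic mechanics are easy to sketch. The toy Python below is purely illustrative (the stories, topics, and scoring function are all invented; this is not Google's or Facebook's actual algorithm), but it shows how a single pool of stories becomes two different feeds once ranking depends on a user's click history:

```python
from collections import Counter

# A hypothetical pool of stories, each tagged with a topic.
stories = [
    {"title": "Markets rally on jobs report", "topic": "economy"},
    {"title": "New climate study released",   "topic": "climate"},
    {"title": "Local team wins championship", "topic": "sports"},
    {"title": "Candidate unveils tax plan",   "topic": "politics"},
]

def personalized_feed(stories, click_history):
    """Rank stories by how often the user clicked that topic before.

    This is the essence of a filter bubble: past behavior
    decides future exposure.
    """
    counts = Counter(click_history)
    return sorted(stories, key=lambda s: counts[s["topic"]], reverse=True)

# Two users, the same story pool, different histories -> different feeds.
alice = ["politics", "politics", "economy"]
bob = ["sports", "sports", "climate"]

for user, history in [("alice", alice), ("bob", bob)]:
    top = personalized_feed(stories, history)[0]
    print(f"{user}'s top story: {top['title']}")
```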

Numerous companies have been building these bubbles for years. Facebook founder and CEO Mark Zuckerberg is believed to have once told colleagues that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” The entirety of Facebook is geared toward making sure you know everything there is to know about that squirrel.

Contentious as it sounds, Zuckerberg is arguably at least partly right. People couldn’t function in their day-to-day lives if they spent every second worrying about the problems of the world. But curating our news to show us what we want to see, rather than what we perhaps need to see, creates real, long-term problems.

The dangers of filter bubbles

Filter bubbles may not seem too threatening a prospect, but they lead to two distinct yet connected problems. The first is that when you only ever see things you agree with, confirmation bias snowballs steadily over time.
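
That snowballing is a feedback loop, and a purely illustrative simulation makes the dynamic concrete: if every story you engage with slightly boosts the odds of seeing more of the same, even a mild initial lean compounds. The starting split and boost size below are invented numbers, not measurements of any real feed.

```python
import random

random.seed(42)

# Start with a mild lean: 60/40 between two viewpoints.
weights = {"viewpoint_a": 0.6, "viewpoint_b": 0.4}

for day in range(50):
    # The feed serves whichever viewpoint you're statistically
    # likelier to engage with...
    shown = random.choices(list(weights), weights=list(weights.values()))[0]
    # ...and each engagement nudges that viewpoint's weight upward.
    weights[shown] += 0.05
    total = sum(weights.values())
    weights = {k: v / total for k, v in weights.items()}

print({k: round(v, 2) for k, v in weights.items()})
# The mild 60/40 lean compounds into a heavily one-sided feed.
```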

A wider problem is that when people draw on such different sources of information, a real disconnect can develop, as they become unable to understand how anyone could think differently from themselves.

A look at any of the left- or right-leaning mainstream TV stations during the buildup to the recent election would have left you in no doubt over which candidate they backed. The same can be said of newspapers and other media, many of which published formal endorsements.

But we’re all aware of that bias. It’s easy to simply switch off or switch over to another station, to see the other side of the coin.

Online, the bias is more covert. Google searches, social network feeds, and even some news publications all curate what they show you. Worse, it all happens behind the scenes. They don’t overtly take a stance; they invisibly paint the digital landscape with things that are likely to align with your point of view.

If your Facebook feed is full of pro-Hillary and anti-Trump stories and posts, you may wonder how on Earth anyone could vote for the man. If your feed is the complete opposite, highlighting only Hillary’s negatives and championing Trump and his strengths, you may hold the exact opposite opinion.

Like Wittgenstein’s lion (if a lion could talk, he argued, we could not understand it, because its frame of reference would be too alien to ours), if our frames of reference from news and social feeds are so different from one another, could we ever hope to understand each other’s positions?

Fake news, a historic problem, persists today

This becomes even more of a problem when you factor in faux news. This latest election was one of the most contentious in history, with low-approval candidates on both sides and salacious headlines thrown out by every source imaginable. With so much mud being slung, it was hard to keep track of what was going on, and that was doubly so online, where fake news was abundant.

This is something that Facebook CEO Mark Zuckerberg has tried to play down, claiming that it accounted for only 1 percent of the news on Facebook. Considering Facebook has nearly 2 billion users, though, that’s potentially a lot of faux stories parroted as the truth. Studies suggest many people have difficulty telling fake news from real news, and it has proved enough of an issue that in the weeks since the election, both Google and Facebook have pledged to deal with the problem.
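
One percent sounds trivial until you scale it. A rough back-of-envelope calculation shows why it isn't; the stories-per-user figure here is an assumption for illustration, not a Facebook statistic:

```python
# Back-of-envelope: what does "1 percent" mean at Facebook's scale?
users = 1.8e9            # "nearly 2 billion" users, per the figure above
stories_per_month = 100  # assumed feed items seen per user per month
fake_share = 0.01        # Zuckerberg's "1 percent" claim

fake_impressions = users * stories_per_month * fake_share
print(f"{fake_impressions:,.0f} fake-story impressions per month")
# 1,800,000,000 -- nearly two billion monthly encounters with fake
# stories, even taking the 1 percent figure at face value.
```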

Also consider that 61 percent of millennials use Facebook as their main source of news, and you can see how this issue could worsen if it isn’t stemmed soon. But this isn’t the first time the young have been taken in by the right sort of lies.

Fake news, fake knowledge, and fake wisdom have plagued humanity throughout history. Sophistry was once the practice of teaching rhetoric and public speaking in ancient Greece, but it is thought to have been co-opted by charlatans who used the power of rhetoric and philosophy not only to make money from their paying students, but also to popularize ridiculous arguments.

Plato described such a person in one of his later dialogues, the Sophist, and drew a comparison between their brand of implied wisdom and that of a true philosopher or statesman. In it, he concludes that sophistry is the near indistinguishable imitation of a true art, much as fake news today imitates the craft of journalistic investigation and reporting.

The second president of the United States, John Adams, knew its dangers too. In response to a letter from a friend in 1819 inquiring about the definition of certain words like “liberty” and “republic,” he praised the search for such clarity, highlighting the importance of being acutely aware of the meaning behind words and phrases.

“Abuse of words has been the great instrument of sophistry and chicanery, of party, faction, and division of society,” he wrote, before admitting his own weariness with the pursuit of such clarification.

In much the same way that sophists and fraudsters of the past could use the techniques of their peers to make money, raise their own stature, and in some ways subvert the functioning of society, fake news sites and authors use the styles and techniques of online journalism to create content that seems plausible. Combine that with a salacious headline and the ability to share content online before checking its authenticity, and you have a recipe for the proliferation of phony stories that can have a real cultural impact.

While Zuckerberg may not think fake news and memes made a difference to the election, Facebook employee and Oculus VR founder Palmer Luckey certainly did. He was outed earlier this year for investing more than $100,000 in a company that helped promote Donald Trump online through the proliferation of memes and inflammatory attack advertisements. He wouldn’t have put in the effort if he thought it worthless.

Stories drive emotions

BuzzFeed’s analysis of the most shared stories on Facebook shows that while fake news underperformed compared to its real counterparts in early 2016, by the time Election Day rolled around at the start of November, it had a 1.5 million engagement lead over true stories.

That same analysis piece highlighted some of the biggest fake election stories, and all of them contained classic click-baiting tactics. They used scandalous wording, capitalization, and sensationalist claims to draw in the clickers, sharers, and commenters.

That’s because these sorts of words help to draw an emotional reaction from us. Marketing firm CoSchedule discovered this back in 2014, but it’s likely something that many people would agree with even without the hard numbers. We’ve all been tempted by clickbait headlines before, and they’re usually ones that appeal to fear, anger, arousal, or some other part of us that isn’t related to critical thinking and political analysis. Everyone’s slinging mud from within their own filter bubbles, secure in the knowledge that they are right and that everyone who thinks differently is an idiot.

Bursting what you cannot see

And therein lies the difficulty. The only way to really understand why someone may hold a different viewpoint is through empathy. But how can you empathize when you don’t have control over how the world appears to you, and your filter serves as a buffer to stories that might help you connect with the other side?

Reaching out to us from the past, Pariser has some thoughts for those of us now living through the future he warned of. Even if Facebook is stripping the humanity from its news curation, there are still human minds and fingertips behind the algorithms that feed us content. He called on those programmers to instill a sense of journalistic integrity in the AI behind the scenes.


“We need the gatekeepers [of information] to encode [journalistic] responsibility into the code that they’re writing. […] We need to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. They need to be transparent enough that we can see what the rules are and […] we need [to be] given some control.”
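
What might those three demands look like in code? Here is a minimal, entirely hypothetical sketch of a feed ranker in which civic diversity is a term in the score, every result carries a plain-language reason, and the mix is a dial the user controls. No platform exposes anything like this today:

```python
def rank(stories, topic_affinity, diversity=0.5):
    """Score = (1 - diversity) * familiarity + diversity * unfamiliarity.

    `diversity` is the user-controlled dial Pariser calls for:
    0.0 is the pure filter bubble, 1.0 is its inverse.
    The attached reason string is the transparency he asks for.
    """
    ranked = []
    for story in stories:
        familiarity = topic_affinity.get(story["topic"], 0.0)
        score = (1 - diversity) * familiarity + diversity * (1 - familiarity)
        reason = (f"you often read {story['topic']}"
                  if familiarity >= 0.5
                  else f"you rarely read {story['topic']}")
        ranked.append((score, story["title"], reason))
    return sorted(ranked, reverse=True)

affinity = {"politics": 0.9, "sports": 0.1}
stories = [{"title": "Tax plan unveiled", "topic": "politics"},
           {"title": "Cup final recap",   "topic": "sports"}]

# With the dial at 0.7, the unfamiliar sports story outranks politics.
for score, title, reason in rank(stories, affinity, diversity=0.7):
    print(f"{score:.2f}  {title}  (shown because {reason})")
```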

That sort of suggestion seems particularly pertinent, since it was only at the end of August that Facebook laid off its entire editorial team, relying instead on automated algorithms to curate content. The algorithms didn’t do a great job, though: within weeks they were found to have let a bevy of faux content through the screening process.

While it may seem a tall order for megacorporations to open up their platforms like this, so much of a stink has been raised about fake news in the wake of the election that Facebook and Google, at least, seem likely to do something about it. They could do more, though, and it could start with raising awareness of the differences in the content each of us is shown.

Certainly there are times when we don’t need content catered to us. If you are researching a topic to write about, you want the raw data, not Google’s beautified version of it. When it comes to news, offering some manual control over the curation wouldn’t go amiss, either.

How about a button that lets us see the complete opposite of what our data-driven, personalized feeds show? I’d certainly click that now and again.
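
No such button exists, but it is easy to express. A hypothetical sketch: simply rank stories so the topics you read least come first.

```python
def show_me_the_opposite(stories, topic_affinity):
    """The hypothetical 'burst my bubble' button: rank stories with
    the user's least-read topics first."""
    return sorted(stories,
                  key=lambda s: topic_affinity.get(s["topic"], 0.0))

affinity = {"politics": 0.9, "climate": 0.4}
stories = [{"title": "Tax plan unveiled",     "topic": "politics"},
           {"title": "New climate study out", "topic": "climate"},
           {"title": "A beat you never read", "topic": "culture"}]

for story in show_me_the_opposite(stories, affinity):
    print(story["title"])
# Prints the never-read culture story first and politics last --
# the personalized feed, inverted.
```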


But that puts the onus on other people to make the change for us, and it’s important to remember that these services feed us relatively narrow content because of our own searches and clicks. If we all made a point of reading outside our comfort zones, approaching content with a clear mind and demanding stories from beyond our own bubbles, we would get them, and the algorithms would gradually respond.

That has the double benefit of giving us immediate access to new information and teaching our digital curators to be a little more open-minded themselves.

And perhaps us too. At least enough to listen without shouting down and demanding a safe space for our own thoughts. Whether you believe that the opposing viewpoint is misguided, wrong, or disgusting, the best way to combat it is with reasonable debate. No terrible idea can survive the harsh light of day and intelligent opposition.

For his side of things, Pariser continues to highlight the problems filter bubbles pose, but has taken it upon himself to bring together people to help fight fake news and other online nonsense. If you’d like to help him out, you can contribute yourself.

It seems increasingly clear, though, that as much as many large institutions need to make changes to help strive for truth online, the best step we can all take is to burst our own bubbles and see what’s beyond. It just might make things a little clearer at a time when it’s increasingly hard to keep on top of what’s what.
