Forget Facebook and Google, burst your own filter bubble

Eli Pariser, co-founder of Avaaz. (Kris Krüg/Flickr)

It’s been half a decade since the co-founder of Avaaz, Eli Pariser, first coined the phrase “filter bubble,” but his prophetic TED Talk — and his concerns and warnings — are even more applicable now than they were then. In an era of fake news, curated content, personalized experiences, and deep ideological divisions, it’s time we all take responsibility for bursting our own filter bubbles.

When I search for something on Google, the results I see are quite different from yours, based on our individual search histories and whatever other data Google has collected over the years. We see this all the time on our Facebook timelines, as the social network uses its vats of data to offer us what it thinks we want to see and hear. This is your bubble.

Numerous companies have been building these bubbles for years. Facebook founder and CEO Mark Zuckerberg is believed to have once told colleagues that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” The entirety of Facebook is geared toward making sure you know everything there is to know about that squirrel.

Contentious as it sounds, Zuckerberg is arguably at least partly right. People couldn’t function in their day-to-day lives if they spent every second worrying about the problems of the world. But curating our news to show us what we want to see, rather than what we perhaps need to see, causes real, long-term problems.

The dangers of filter bubbles

Filter bubbles may not seem too threatening a prospect, but they raise two distinct yet connected issues. The first is that when you only ever see things you agree with, confirmation bias can snowball over time.

The second, wider problem is that when people draw on such different sources of information, a real disconnect can form, with each side unable to understand how anyone could think differently.

A look at any of the left- or right-leaning mainstream TV stations during the buildup to the recent election would have left you in no doubt over which candidate they backed. The same can be said of newspapers and other media, many of which went as far as publishing formal endorsements.

But we’re all aware of that bias. It’s easy to simply switch off or switch over to another station, to see the other side of the coin.

Online, the bias is more covert. Google searches, social network feeds, and even some news publications all curate what they show you. Worse, it all happens behind the scenes. They don’t overtly take a stance; they invisibly paint the digital landscape with things that are likely to align with your point of view.
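
To make that concrete, here is a deliberately toy sketch of how engagement-driven curation can take a stance without ever declaring one. Nothing here is Facebook’s or Google’s actual code; the profile, the stories, and the similarity measure are all invented for illustration.

```python
# A toy model of engagement-driven feed ranking. Purely illustrative, not any
# platform's real algorithm: users and stories are reduced to topic tags.

def similarity(user_profile, story_topics):
    """Fraction of a story's topics that overlap the user's inferred interests."""
    shared = set(user_profile) & set(story_topics)
    return len(shared) / max(len(story_topics), 1)

def rank_feed(user_profile, stories):
    """Order the feed so profile-aligned stories quietly float to the top."""
    return sorted(stories,
                  key=lambda story: similarity(user_profile, story["topics"]),
                  reverse=True)

# A user whose past clicks have skewed their inferred interests one way...
profile = ["pro_candidate_a", "anti_candidate_b"]
stories = [
    {"headline": "Candidate B lays out economic plan", "topics": ["pro_candidate_b"]},
    {"headline": "Candidate A praised at rally", "topics": ["pro_candidate_a"]},
    {"headline": "New questions swirl around Candidate B", "topics": ["anti_candidate_b"]},
]

# ...sees the agreeable stories first. No stance was declared; one emerged anyway.
for story in rank_feed(profile, stories):
    print(story["headline"])
```

The ranking never says “we back Candidate A.” The preference simply falls out of the engagement data.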

If your Facebook feed is full of pro-Hillary and anti-Trump stories and posts, you may wonder how on Earth anyone could vote for the man. If your feed is the complete opposite, highlighting only Hillary’s negatives and championing Trump’s strengths, you may hold the exact opposite opinion.

Think of Wittgenstein’s lion: if it could speak, we could not understand it. If our frames of reference from news and social feeds are so different from one another, can we ever hope to understand each other’s position?

Fake news, a historic problem, persists today

This becomes even more of a problem when you factor in faux news. This latest election was one of the most contentious in history, with low-approval candidates on both sides and salacious headlines thrown out by every source imaginable. With so much mud being slung, it was hard to keep track of what was going on, and that was doubly so online, where fake news was abundant.

This is something Zuckerberg has tried to play down, claiming that fake news accounted for only 1 percent of overall Facebook content. Considering Facebook has nearly 2 billion users, though, that’s potentially a lot of faux stories parroted as truth. Studies suggest many people have difficulty telling fake news from real news, and the issue has proved serious enough that, in the weeks since the election, both Google and Facebook have pledged to deal with it.

Consider also that 61 percent of millennials use Facebook as their main source of news, and you can see how the issue could worsen if it isn’t stopped soon. But this isn’t the first time young people have been taken in by the right sort of lies.

Fake news, fake knowledge, and fake wisdom are things humans have struggled with in perpetuity. Sophistry was once the practice of teaching rhetoric and public speaking in ancient Greece, but it is thought to have been co-opted by charlatans who used the power of rhetoric and philosophy not only to make money from their paying students, but also to popularize ridiculous arguments.

Plato described such a person in one of his later dialogues, the Sophist, contrasting the sophist’s brand of implied wisdom with that of a true philosopher or statesman. He concludes that sophistry is the near-indistinguishable imitation of a true art, much as fake news today imitates the art of journalistic investigation and reporting.

The second president of the United States, John Adams, knew its dangers, too. Responding in 1819 to a friend’s letter inquiring about the definition of words like “liberty” and “republic,” he praised the search for such clarity, highlighting the importance of being acutely aware of the meaning behind words and phrases.

“Abuse of words has been the great instrument of sophistry and chicanery, of party, faction, and division of society,” he wrote, before admitting his own weariness with the pursuit of such clarification.

In much the same way that sophists and fraudsters of the past could use the techniques of their peers to make money, raise their own stature, and in some ways subvert the functioning of society, fake news sites and authors use the styles and techniques of online journalism to create content that seems plausible. Combine that with a salacious headline and the ease of sharing content online before checking its authenticity, and you have a recipe for the proliferation of phony stories with real cultural impact.

While Zuckerberg may not think fake news and memes made a difference to the election, Facebook employee and Oculus VR founder Palmer Luckey certainly did. He was outed earlier this year for putting more than $100,000 into a company that promoted Donald Trump online through memes and inflammatory attack advertisements. He wouldn’t have spent that money if he thought the effort worthless.

Stories drive emotions

BuzzFeed’s analysis of the most-shared stories on Facebook shows that while fake news underperformed its real counterparts in early 2016, by the time Election Day rolled around at the start of November, fake stories had built a 1.5 million engagement lead over true ones.

That same analysis highlighted some of the biggest fake election stories, and all of them leaned on classic clickbait tactics: scandalous wording, heavy capitalization, and sensationalist claims designed to draw in clickers, sharers, and commenters.

That’s because these sorts of words help to draw an emotional reaction from us. Marketing firm Co-Schedule discovered this back in 2014, but it’s likely something that many people would agree with even without the hard numbers. We’ve all been tempted by clickbait headlines before, and they’re usually ones that appeal to fear, anger, arousal, or some other part of us that isn’t related to critical thinking and political analysis. Everyone’s slinging mud from within their own filter bubbles, secure in the knowledge that they are right, and that everyone who thinks differently is an idiot.

Bursting what you cannot see

And therein lies the difficulty. The only way to really understand why someone may hold a different viewpoint is through empathy. But how can you empathize when you don’t have control over how the world appears to you, and your filter serves as a buffer to stories that might help you connect with the other side?

Reaching out to us from the past, Pariser has some thoughts for those of us now living through the future he warned about. Even if Facebook is stripping the humanity from its news curation, there are still human minds and fingertips behind the algorithms that feed us content. He called on those programmers to instill a sense of journalistic integrity in the AI behind the scenes.

“We need the gatekeepers [of information] to encode [journalistic] responsibility into the code that they’re writing. […] We need to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. They need to be transparent enough that we can see what the rules are and […] we need [to be] given some control.”

That suggestion seems particularly pertinent, since it was only at the end of August that Facebook laid off its entire editorial team, relying instead on automated algorithms to curate content. Those algorithms didn’t do a great job, though: within weeks, they were found to have let a bevy of faux stories through the screening process.

While it may seem like a tall order for megacorporations to push for such an open platform, so much of a stink has been raised about fake news in the wake of the election that Facebook and Google, at least, seem likely to do something about that problematic aspect of social networking. They can do more, though, starting with raising awareness of the differences in the content each of us is shown.

Certainly there are times we don’t need content catered to us. If you’re researching a topic to write about, you want the raw data, not Google’s beautified version of it. When it comes to news, offering some manual control over the curation wouldn’t go amiss, either.

How about a button that lets us see the complete opposite of what our data-driven, personalized feeds show? I’d certainly click that now and again.
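
Such a button is easy enough to imagine. Reusing the toy similarity measure from the earlier sketch (again, purely hypothetical, not a real platform feature), showing the other side is just the same sort with the order flipped:

```python
def burst_bubble(user_profile, stories):
    """Hypothetical 'opposite feed' button: surface the stories that match the
    user's inferred interests least, instead of most."""
    return sorted(stories,
                  key=lambda story: similarity(user_profile, story["topics"]))
```

The point is how small the change is: the data needed to show us the other side already exists inside these platforms, whether or not anyone exposes a button for it.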

But that puts the onus on others to make the change for us, and it’s important to remember that these services feed us relatively narrow content because of our own searches and clicks. If we all made a point of reading outside our comfort zone, of going in with a clear mind and demanding content beyond our own bubble, we would get it, and the algorithms would gradually respond.
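
That feedback loop is simple to picture. In the same toy model (invented purely for illustration), every click folds a story’s topics back into your profile, so deliberately broad reading really would retrain the curator over time:

```python
def record_click(user_profile, story):
    """Each click teaches the curator: the story's topics join your profile,
    steering future rankings toward (or, if you read widely, away from)
    the same narrow slice of content."""
    for topic in story["topics"]:
        if topic not in user_profile:
            user_profile.append(topic)
```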

Reading that widely has the double benefit of giving us immediate access to new information and teaching our digital curators to be a little more open-minded themselves.

And perhaps us too. At least enough to listen without shouting down and demanding a safe space for our own thoughts. Whether you believe that the opposing viewpoint is misguided, wrong, or disgusting, the best way to combat it is with reasonable debate. No terrible idea can survive the harsh light of day and intelligent opposition.

For his part, Pariser continues to highlight the problems filter bubbles pose, and he has taken it upon himself to bring people together to fight fake news and other online nonsense. If you’d like to help him out, you can contribute yourself.

It seems increasingly clear, though, that as much as many large institutions need to make changes in the pursuit of truth online, the best step we can all take is to burst our own bubbles and see what lies beyond. It just might make things a little clearer at a time when it’s increasingly hard to keep on top of what’s what.
