Everything you need to know about the Facebook Trending Topics bias controversy

Another year, another Facebook controversy. The company is in the spotlight once again, but this time it has nothing to do with user data collection or conducting tests on users without their knowledge — this time it has to do with Facebook as a news service.

According to recent accusations, Facebook has been suppressing conservative news topics from its Trending Topics section, showing bias on a news service that many people turn to specifically to avoid bias. But what does that mean? And why is it such a big deal? Here’s everything you need to know about Trending Topics and the debate around it.

What are Trending Topics?


If you’re a regular Facebook user, you’ve probably seen Trending Topics on your Home page, whether or not you use it as a source of news. The section lives in the upper right-hand corner of the Home page and shows news stories that are popular at the moment. The stories can relate to celebrities, current events, or really anything else newsworthy.

The section is a great way to get a quick look at the day’s news, especially news you’re likely to be interested in. That’s one of the main ways Trending Topics differs from a traditional news outlet: the topics are tailored to you.

A story goes through a number of stages before it becomes a Trending Topic. First, an algorithm surfaces potential stories by tracking how often they are mentioned across the network over a given period of time. The algorithm also uses an RSS reader to identify breaking stories.

Next up, those stories are reviewed by an editorial team, who confirm that the topic is noteworthy, write up a description for the topic, give it a category (sports, science, etc.), and check to see whether the story is national or international news, giving it an importance level based on what they find.
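The two-stage flow described above (algorithmic surfacing, then human review) can be sketched roughly as follows. This is a simplified illustration, not Facebook’s actual code; the spike threshold, field names, and review steps are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    mentions_last_hour: int
    mentions_baseline: int  # typical hourly mention volume
    description: str = ""
    category: str = ""
    importance: str = ""
    approved: bool = False

def surface_candidates(topics, spike_factor=3.0):
    """Stage 1 (algorithmic): keep topics whose current mention
    volume spikes well above their usual baseline."""
    return [t for t in topics
            if t.mentions_last_hour >= spike_factor * max(t.mentions_baseline, 1)]

def editorial_review(topic, description, category, national=True):
    """Stage 2 (human): confirm the topic, write a short description,
    assign a category, and set an importance level."""
    topic.description = description
    topic.category = category
    topic.importance = "national" if national else "international"
    topic.approved = True
    return topic
```

In this sketch, a topic mentioned 900 times against a usual baseline of 50 would be surfaced for review, while steady everyday chatter would not.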

For an in-depth rundown of how Trending Topics works, take a look at this blog post from Facebook itself.

What exactly is Facebook accused of and why is it a big deal?

As you can see, there’s a lot more human intervention in Trending Topics than you might have thought. That’s the problem. According to a report from Gizmodo, Facebook workers were routinely told to suppress news stories that might have been of interest to conservative readers, surfacing other stories in their place, and thus showing bias in how Facebook presents news.

“We have a series of checks and balances in place to help surface the most important popular stories.”

According to an unnamed source for the Gizmodo article, workers were routinely told to artificially “inject” stories into Trending Topics, even if those stories weren’t popular enough, or in some cases trending at all.

According to leaked documents obtained by The Guardian, in many cases it is the intervention of the small editorial team that determines whether or not a topic is “trending,” not the topic’s actual popularity. Not only that, but the documents say the company has a preferred list of news outlets to check for stories — including the BBC, CNN, The New York Times, and Fox News.

Why does it matter?

“A news outlet showing bias? That’s unheard of!” I know what you’re saying: arguably, every news outlet on the planet has some kind of bias, whether conservative, liberal, or otherwise. The issue, however, is this: Facebook has grown to become one of the largest news outlets in the world, and people turn to it for news because of its perceived neutrality.

The whole point is that the social network tailors itself to individual users, whether those users are conservative or liberal. Allegations such as these therefore threaten to alienate a significant portion of Facebook’s 1.65 billion users (more than 160 million of whom reside in the U.S.). The timing of the controversy may be the most damaging part of all, as it comes in the midst of an important news cycle: the run-up to the presidential election.

It could also lead to broader accusations of the politicization of a supposedly impartial social network. In that regard, the wheels may already be in motion. Several conservative news outlets have said they believe this is proof that Facebook is “blacklisting” them. As evidence of the alleged bias, they point to CEO Mark Zuckerberg’s anti-Trump sentiments at the company’s F8 developer conference, and to subsequent suggestions by Facebook employees that the company could derail Trump’s campaign.

It’s important to note, however, that Facebook says the accusations are false and that it does not suppress conservative news. Several public statements have been shared on Facebook by various representatives, most recently by Zuckerberg. The fact that numerous clarifications have had to be issued is evidence of the growing scale of the debate.

Even the U.S. Senate has taken note, pressing the social network for more transparency about its Trending Topics. In a letter to Facebook, U.S. Senate Commerce Committee Chairman John Thune (R-S.D.) writes: “How many stories have curators excluded that represented conservative viewpoints or topics of interest to conservatives? How many stories did curators inject that were not, in fact, trending?”

Facebook has addressed those questions, and remains adamant that no managerial decisions dictate the editorial stance of its contracted curators.

Why did Facebook hire human curators in the first place?


There are a number of reasons that humans need to be a part of the process, at least for now. Humans write things like descriptions for the Trending Topics, and they suppress topics that aren’t newsworthy, something that apparently can’t be left up to machines at this time.

This is standard procedure for an AI or algorithm: the system needs to be fed a lot of data before it can operate unsupervised. In its infancy, humans are required to oversee the data it receives and the results it surfaces. As recently as March of this year, we were reminded of the dangers of letting an AI operate freely on a public forum when Microsoft’s Tay bot, which emulated millennial modes of expression, ran amok on Twitter.

We can only imagine the sheer amount of data that passes through a globe-spanning network such as Facebook. As explained above, it is therefore crucial to have human moderators overseeing the results its algorithm surfaces, in order to separate the newsworthy from the merely personal.

What has generally been excluded from the current debate is Facebook’s hiring process for its editorial team, and the working practices in place at the company. This is important if we are to gain a better understanding of what went wrong internally.

The first Gizmodo report was as much about an environment of exclusion as about editorial bias: many of the journalists Facebook hired were recent graduates with little or no experience in actual newsrooms or at major media organizations. The unnamed sources quoted in the article felt as if they weren’t part of the wider workplace and Facebook community. Overall, the report reads as a scathing account of life as a contractor at Facebook, and it merely skims the surface of the alleged editorial guidelines, which were later expanded upon and ended up becoming the “real” issue.

Most importantly, the journalists hired by Facebook claim that, at the outset, they worked independently with barely any supervision. They describe themselves as “slaves” to an algorithm that was being groomed to one day replace them altogether. In a media landscape where the threat social media poses to traditional and digital newsrooms is a real one, this bred an atmosphere of resentment, according to the workers interviewed in the report. In other words, the human side of the Trending Topics venture was mired in controversy and contradiction from the start.

What does Facebook have to say about it?

Facebook has denied claims that it intentionally shows bias in the Trending Topics section, saying that there are “rigorous guidelines in place for the review team to ensure consistency and neutrality.” The company also issued a blog post about how Trending Topics works, later adding an FAQ section to the blog post.

“We have found no evidence to date that Trending Topics was successfully manipulated.”

“We have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum, as well as to eliminate noise that does not relate to a current newsworthy event but might otherwise be surfaced through our algorithm,” said the company in its post. “Facebook does not allow or advise our reviewers to discriminate against sources of any political origin, period.”

Facebook also said that it did not allow the suppression of particular political perspectives; however, it does permit the rejection of topics it classifies as “noise,” or words that are used often across a variety of unrelated discussions. For example, the word “lunch” appears every day, but Facebook wouldn’t promote it as a Trending Topic.
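To illustrate that “noise” distinction, here is one simple heuristic — purely hypothetical, not Facebook’s actual logic — that rejects terms whose latest mention count doesn’t spike well above their running average:

```python
def is_noise(hourly_counts, spike_factor=3.0):
    """Treat a term as noise if its most recent hourly mention count
    is not well above its average over the preceding hours."""
    baseline = sum(hourly_counts[:-1]) / max(len(hourly_counts) - 1, 1)
    return hourly_counts[-1] < spike_factor * baseline

# A word like "lunch" is mentioned at a steady rate all day: noise, rejected.
print(is_noise([100, 110, 95, 105, 102]))  # True
# A breaking story spikes sharply: kept as a trending candidate.
print(is_noise([5, 4, 6, 5, 400]))         # False
```

The point of the sketch is that raw popularity isn’t enough; what matters is a sudden deviation from a term’s normal chatter, tied to a current event.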

Last but not least, the company said it was investigating the claims made in the Gizmodo story.

“We have found no evidence to date that Trending Topics was successfully manipulated, but will continue the review of all our practices,” it continued.

Mark Zuckerberg himself also weighed in on the discussion, essentially reaffirming what had already been said in the blog post, saying that the company took the reports seriously and that it was investigating the issue. He also said he would be inviting leading conservatives to join the discussion.

Isn’t bias unavoidable?


Arguably, yes. Critics suggest that even if Trending Topics was entirely based on algorithms, it would still be biased because those algorithms would have been built by humans. Still, there’s a way to cut down on bias as much as possible, and suppressing specific topics is not it.

If we look elsewhere, we can see that trending algorithms managed differently produce different results. Although there hasn’t been a similar deep dive into Twitter’s approach to trends, its feed is evidently home to a broad range of topics. Twitter, which pioneered the trending feed, often surfaces what may be viewed as conservative topics and hashtags. Sometimes the feed even surfaces far-right views under the guise of hashtags — as was the case upon the recent election of Sadiq Khan as Mayor of London — for all to see and participate in, or counteract.

What will happen next?

Unless something else leaks or more people come forward claiming foul play on the part of Facebook, it’s likely that the controversy is more or less over. Mark Zuckerberg himself says that the company will be inviting well-respected conservatives and conservative media outlets to weigh in on the topic. Hopefully initiatives like this will help eliminate any bias that exists.

In the long run, it’s unlikely that this will have any severe implications for Facebook as a news outlet. Most people who currently use Trending Topics to get their news will continue to do so simply because it’s a feature on Facebook.

Facebook has a number of media partners for things like Instant Articles, which are articles hosted on Facebook’s servers that can be quickly and easily accessed by users. Facebook wants there to be more Instant Articles because they ensure that users stay on the Facebook website, where they can be advertised to. If issues surrounding bias balloon, it’s very possible that we’ll see conservative media outlets pull out of any agreements with Facebook — it’s really in Facebook’s best interests to ensure that there isn’t bias for this reason.

As for the Trending Topics feed itself, it’s unlikely that we’ll see any huge changes to how it operates on the surface, but ideally it will continue to offer the stories users want — without showing any bias, whether it does now or not.

We’ve reached out to Facebook for comments on the story, and will update this article if we hear back.
