Facebook’s comment-ranking system aimed at taming the dumpster fire

Facebook is once again trying to address the dumpster fire that is public comments on its site.

As anyone who has moderated or even just browsed a Facebook page knows, the comments section on public posts is usually a cesspool of hatred, bigotry, spam, and irrelevance. Even by the (low) standards of the internet, Facebook comments are famously awful. And bad comments aren't only unpleasant to read; they can actually reduce the credibility of the content they accompany.

Now Facebook is introducing a new comment ranking system to attempt to tackle this problem. Comments on public posts made by Pages or people with many followers will be ranked, with the aim of showing the most relevant and highest quality comments at the top. To determine the quality of comments, Facebook will use data from four metrics:

  • Integrity signals, so comments that violate Facebook's community standards, or that are click-baity or deliberately provocative, are ranked lower,
  • User surveys, in which users say what kinds of comments they find useful,
  • Comment interactions, so comments that are liked, reacted to, and replied to are ranked higher, and
  • Poster controls, so the original poster can hide or delete bad comments to rank them lower and engage with good comments to rank them higher.
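Facebook has not published how these four signals are combined, but the general shape is a weighted scoring function. The sketch below is purely illustrative: the signal names, weights, and formula are assumptions, not Facebook's actual model.

```python
# Hypothetical sketch of combining the four ranking signals into one score.
# All names, weights, and the linear formula are illustrative assumptions --
# Facebook has not disclosed its real ranking model.

def score_comment(integrity, survey, interactions, poster,
                  weights=(0.4, 0.2, 0.2, 0.2)):
    """Combine four normalized signals (each in [0, 1]) into one score.

    integrity:    1.0 = fully within community standards, lower if click-baity
    survey:       how useful users say comments like this one are
    interactions: normalized likes, reactions, and replies
    poster:       1.0 if the poster engaged with it, 0.0 if hidden or deleted
    """
    w_i, w_s, w_x, w_p = weights
    return w_i * integrity + w_s * survey + w_x * interactions + w_p * poster

def rank_comments(comments):
    """Sort comments (dicts holding the four signal values) best-first."""
    return sorted(
        comments,
        key=lambda c: score_comment(c["integrity"], c["survey"],
                                    c["interactions"], c["poster"]),
        reverse=True,
    )
```

Under this toy model, a policy-violating comment with lots of replies can still sink below a civil, well-surveyed one, which matches the stated intent of weighting integrity signals alongside raw engagement.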

This is far from the first time Facebook has tried to address the quality of comments. From new visual designs for comments to emoji comment reactions, the company has tried to improve the appearance of comment sections before. And an experiment with downvoting was an attempt to raise the quality of comments as well as their look.

But the problem with comments may run deeper than something a few cosmetic improvements or ranking algorithms can fix. Facebook has shown itself to be woefully inadequate to the task of moderating content on its platform, with hate speech allowed to proliferate and fake news spreading like wildfire. The company has been hiring more human moderators but still relies on A.I. for the majority of its moderation, and there are many types of harmful content that A.I. can't catch because it lacks an understanding of social context.

The new comment ranking system may help to some extent, but until Facebook tackles the site-wide issues with its platform it will only be a band-aid over a deeper problem.

Editors' Recommendations

Georgina Torbet
Georgina is the Digital Trends space writer, covering human space exploration, planetary science, and cosmology. She…
Mark Zuckerberg says fixing abuse on Facebook is his goal for 2018

Mark Zuckerberg has been tackling a personal goal every year for nearly a decade, but this year the Facebook CEO is going to focus on something a bit different -- fixing Facebook. In a post on Thursday, January 4, Zuckerberg listed several of the challenges the platform faced in 2017 as his focus for 2018, admitting that the network makes too many mistakes, specifically errors in preventing abuse and enforcing policies.

“The world feels anxious and divided, and Facebook has a lot of work to do -- whether it’s protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent,” he wrote. “My personal challenge for 2018 is to focus on fixing these important issues. We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools. If we’re successful this year then we’ll end 2018 on a much better trajectory.”

Facebook apologizes after report shows inconsistencies in removing hate speech

Facebook has taken plenty of criticism over the algorithms designed to keep content within its community guidelines, but a new round of investigative reporting suggests the company's team of human reviewers could use improvement too. In a study of 900 posts, ProPublica reports that Facebook's review staff was inconsistent in handling posts containing hate speech, removing some but leaving up others with similar content.

Facebook apologized for some of those posts, saying that in the 49 posts highlighted by the non-profit investigative organization, reviewers made the wrong choice on 22 of those posts. The social media platform defended 19 other instances, while eight were excluded because of incorrect flags, user deletions or a lack of information. The study was crowd-sourced, with Facebook users sharing the posts with the organization.

Your Facebook News Feed will soon rank faster loading websites higher

To make browsing your News Feed more enjoyable, Facebook announced it is rolling out an update over the coming months. The mobile app will show users more stories that load quickly, pushing ones that may take longer further down the feed -- surfacing relevant content you will actually spend time reading.

This update will have Facebook weigh a variety of factors when someone taps a link in the News Feed on a mobile device, including the estimated load time and general speed of the website, along with the person's current network connection. If those signals show the website will load quickly, the link will appear higher in the feed, while slower-loading ones are buried.
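The described behavior amounts to applying a ranking penalty to slow-loading links, scaled by connection quality. The sketch below is an assumption-laden illustration of that idea; the function name, threshold, and penalty values are invented, and Facebook's real News Feed model is not public.

```python
# Illustrative only: the threshold, penalty values, and function names are
# assumptions -- Facebook has not published its News Feed ranking details.

def demote_slow_links(stories, network_fast, slow_threshold_ms=2000):
    """Re-rank feed stories so links estimated to load slowly sink lower.

    Each story is a (base_score, estimated_load_ms) pair. On a slow
    connection the penalty for slow pages is applied more aggressively,
    mirroring the idea that load time matters more when bandwidth is poor.
    """
    def adjusted(story):
        score, load_ms = story
        penalty = 0.0
        if load_ms > slow_threshold_ms:
            penalty = 0.5 if network_fast else 1.0
        return score - penalty

    return sorted(stories, key=adjusted, reverse=True)
```

In this toy version, a highly relevant but slow page can still be outranked by a slightly less relevant fast one when the user is on a poor connection.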
