
Facebook is making its News Feed smarter in its ongoing bid to eliminate clickbait

Facebook is updating its News Feed algorithm in an effort to rid its site of spam and clickbait.

The social network’s content cleanup process began in earnest in 2014 and was renewed in August of last year, when the company developed its own take on a clickbait-targeting system akin to an email spam filter. That move preceded Facebook’s broader strategy to tackle fake news, which kicked off in late 2016.

On Tuesday, Facebook announced it is adding signals to its News Feed algorithm to identify “authentic” posts from people and Pages. To build the system, Facebook categorized Pages it identified as posting spam or directly asking users for likes, comments, and shares in order to boost content. It then used this data to train a model that continuously detects whether posts from other Pages are likely to be authentic. According to Facebook, one indication that a publisher may be posting misleading items is that people keep hiding its posts.
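Facebook has not published how its model actually works, but the description above — score posts lower when readers keep hiding a Page’s content or when posts explicitly beg for likes and shares — can be illustrated with a toy scorer. Everything here (the phrase list, the weights, the function name) is invented for illustration:

```python
# Purely illustrative sketch -- Facebook's real model and features are not
# public. This toy scorer uses the two signals the article describes:
# how often readers hide a Page's posts, and explicit asks for engagement.

ENGAGEMENT_BAIT = ("like this", "share this", "comment below", "tag a friend")

def authenticity_score(post_text: str, hide_rate: float) -> float:
    """Return a score in [0, 1]; lower means more likely spam/clickbait."""
    score = 1.0
    # Signal 1: readers hiding the Page's posts (hide_rate clamped to [0, 1]).
    score -= 0.6 * min(max(hide_rate, 0.0), 1.0)
    # Signal 2: posts that directly ask users for likes, comments, or shares.
    text = post_text.lower()
    if any(phrase in text for phrase in ENGAGEMENT_BAIT):
        score -= 0.3
    return max(score, 0.0)

print(authenticity_score("Breaking: storm warning issued for the coast", 0.02))
print(authenticity_score("LIKE THIS and share this post to win!", 0.4))
```

A bait-laden post from a frequently hidden Page scores well below an ordinary news post, which is the sort of separation a trained classifier would learn at far greater scale.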

“We anticipate that most Pages won’t see any significant changes to their distribution in News Feed,” writes Facebook in its blog post. “Some Pages might see a small increase in referral traffic or outbound clicks, and some Pages might see minor decreases. Pages should continue to post stories that are relevant to their audiences.”

As a result of the changes, “authentic” content could rank higher in your feed. The social network already ranks items according to whether they are “newsworthy” and of interest to users, based on personal activity such as likes, reactions, comments, and shares.

Facebook is also adding real-time signals to its ranking system to help it spot relevant trends. Now, if a topic or post from a Page is gaining a lot of engagement from users in your network, the News Feed will be able to understand that in real time and temporarily prioritize it. This could result in posts about major breaking news and events being shown higher up in the News Feed because of the amount of chatter around them. The change builds on Facebook’s recent updates to its Trending feed, which saw it move away from individual posts to focus on improved topic identification.
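The key property of a temporary boost like this is that recent engagement counts for more and the effect fades as chatter dies down. One common way to model that — purely a sketch, since Facebook has not disclosed its method, and the one-hour half-life here is an invented parameter — is an exponentially decayed engagement sum:

```python
import math
import time

# Hypothetical sketch: Facebook's real-time signal is not public. This models
# a temporary boost in which each engagement event is discounted by its age,
# so a story's boost decays once the chatter fades.

HALF_LIFE_S = 3600.0  # assumed: engagement loses half its weight every hour

def trending_boost(engagement_times: list[float], now: float) -> float:
    """Sum engagement events, each discounted by how long ago it happened."""
    decay = math.log(2) / HALF_LIFE_S
    return sum(math.exp(-decay * (now - t)) for t in engagement_times)

now = time.time()
fresh = [now - 60] * 100          # 100 interactions in the last minute
stale = [now - 6 * 3600] * 100    # 100 interactions six hours ago
print(trending_boost(fresh, now) > trending_boost(stale, now))  # fresh wins
```

With these numbers, a burst of fresh engagement outweighs the same volume from six hours earlier by a wide margin, which is why a breaking story can jump the feed and then settle back down.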

Saqib Shah
Former Digital Trends Contributor
Study: Facebook is skimping on moderation, and it’s harming the public

A new report from the New York University Stern Center for Business and Human Rights alleges that Facebook and other social media companies (Twitter and YouTube are also mentioned specifically) are outsourcing too much of their moderation to third-party companies, resulting in a workforce of moderators who are treated as “second-class citizens,” doing psychologically damaging work without adequate counseling or care.

Most disturbingly, the report points out how a lax attitude toward moderation has led to “Other harms -- in some cases, lethal in nature ... as a result of Facebook’s failure to ensure adequate moderation for non-Western countries that are in varying degrees of turmoil. In these countries, the platform, and/or its affiliated messaging service WhatsApp, have become important means of communication and advocacy but also vehicles to incite hatred and in some instances, violence.”

Facebook-backed Libra cryptocurrency is scaling back its plans

After months without any updates, Facebook’s planned cryptocurrency project, Libra, has announced significant changes that scale back its global payment system.

The Libra Association, which comprises Facebook and a diverse lineup of businesses and nonprofit organizations, announced several updates to the project. Most notably, the association is shifting from creating an open global financial network to instead offering both single-currency and multi-currency coins that would be tied to local real-world currencies.

Ahead of the 2020 presidential election, Facebook says it’s banning deepfakes
Facebook Chairman and CEO Mark Zuckerberg testifies before the House Financial Services Committee on "An Examination of Facebook and Its Impact on the Financial Services and Housing Sectors" in the Rayburn House Office Building in Washington, DC on October 23, 2019.

Two days before Facebook is set to appear in front of lawmakers at a House Energy and Commerce hearing on manipulated media, the social network has announced it’s banning all forms of deepfakes. The announcement represents a significant step forward for Facebook, which has been struggling to mend its ailing image with the 2020 presidential election right around the corner.

In a blog post, Monika Bickert, Facebook’s vice president of global policy management, said the company will take down videos that have been “edited or synthesized in ways that aren’t apparent to an average person” or are the “product of artificial intelligence or machine learning that merges, replaces or superimposes content onto a video, making it appear to be authentic.”
