At the beginning of 2017, Facebook began rolling out its fake news identification tools, starting in Germany. Faced with two lawsuits in the country, the social network’s decision to clean up its bogus content outside of the U.S. came as little surprise. In September, Facebook provided an update on its efforts and claimed the program was a success. But a couple of months later, journalists who have worked with the company are claiming that it’s all “way too little, way too late.”
Facebook originally unveiled its fake news-oriented updates in December, following a mounting backlash that blamed the spread of misleading and hyper-partisan content on its site for “swaying” the U.S. election. The company’s new tools rely on its algorithms, user reporting, and third-party fact-checking organizations that are signatories of Poynter’s International Fact-Checking Code of Principles.
The social network reported that it “worked closely with German officials on a number of initiatives to fight disinformation and make Facebook a safer and more secure environment for genuine civic engagement” throughout the German election. For example, in the months leading up to the election, the social network said that it removed “tens of thousands of fake accounts in Germany.” Moreover, the platform tested Related Articles in order to combat fake news while promoting a “healthy civic discourse.” Facebook also allowed political parties to spell out their platforms and housed all this information in a dedicated tab.
Although these efforts may be commendable, a number of fact checkers have now told the Guardian that Facebook’s unwillingness to disclose quantitative data on the effectiveness of its campaign has been not just a challenge, but a real problem. As Alexios Mantzarlis, director of the International Fact-Checking Network at Poynter, noted, “We’re sort of in the dark. We don’t know what is actually happening.” He added that “the level of information that is being handed out is entirely insufficient … This is potentially the largest real-life experiment in countering misinformation in history. We could have been having an enormous amount of information and data.”
Based on the information that is available, fact checkers are skeptical about the results. “I don’t feel like it’s working at all. The fake information is still going viral and spreading rapidly,” said one journalist who spoke anonymously for fear of repercussions. “It’s really difficult to hold [Facebook] accountable. They think of us as doing their work for them. They have a big problem, and they are leaning on other organizations to clean up after them.”
This perspective falls in line with that of an earlier Guardian report, which noted that fact-checks seemed to be mostly ineffective.
Another issue may be the conflict of interest that arises when Facebook pays third-party fact checkers to call out fake news. “The relationship they have with fact-checking organizations is way too little and way too late. They should really be handling this internally. They should be hiring armies of moderators and their own fact-checkers,” one such fact checker said. “By offering this money, which journalistic outlets desperately need, it’s weakening our ability to do any fact-checking of these disinformation purveyors like Facebook.”
But perhaps most concerning is the fact that, sometimes, even a debunked Facebook post sees plenty of traffic. As the BBC recently reported, Facebook came under fire for a test that caused comments containing the word “fake” to land at the top of user news feeds. Indeed, despite fact-checkers labeling as false a report claiming that the recent violence in Texas was connected to anti-fascist groups, the post was still shared more than 260,000 times.
Of course, Facebook has admitted that it has not found a panacea to remedy the problem of fake news. “People want to see accurate information on Facebook and so do we,” the company noted. “As long as there are people seeking to disrupt the democratic process, we will continue working closely with our partners — in government and in civil society — to defend our platform from malicious interference.”