Facebook says it won’t accept new political advertising in the week leading up to the U.S. presidential election on November 3. The move is part of a broader set of measures Facebook is announcing today to tackle election interference and voter misinformation.
“The U.S. elections are just two months away, and with COVID-19 affecting communities across the country, I’m concerned about the challenges people could face when voting,” CEO Mark Zuckerberg wrote in a Facebook post. “I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.”
Starting on October 27, political candidates and committees won’t be allowed to place new ads on Facebook or Instagram. However, political ads approved before that date won’t be affected and will be allowed to keep running. Advertisers will also be able to modify these existing ads and spend more money on them to expand their reach.
This weeklong ban appears to be a compromise as Facebook faces mounting pressure to block political ads altogether, as peers such as Twitter and Reddit have done, and it remains to be seen how big an impact it will have. In defending his decision not to fact-check political ads, Zuckerberg has repeatedly said he doesn’t want to arbitrate the truth and would rather leave it up to voters to decide which content to trust.
In his latest post, Zuckerberg says Facebook made this decision because there won’t be “enough time to contest new claims” in the final days of an election. “I generally believe the best antidote to bad speech is more speech,” he added.
In addition, Facebook today introduced a new forwarding limit on its messaging platform, Messenger, to stem the growing tide of viral misinformation. The social network is now rolling out an update that will prevent users from forwarding a message to more than five people at a time.
What’s more, to tackle voter misinformation, Facebook will now take down posts that tell people they will catch the coronavirus if they vote. It will also label content that misleads users by “claiming that lawful methods of voting will lead to fraud.” Candidate or campaign pages that prematurely declare victory will also be flagged, Facebook added in a blog post.
Further, Facebook is expanding its voter-suppression policies to cover and remove both explicit and implicit misrepresentations about how or when to vote, such as posts claiming that “you can send in your mail ballot up to three days after election day.”
Over the last few months, as Election Day draws closer, Facebook has faced an avalanche of misleading election content from both official candidates and malicious partisan groups, and the situation will likely grow worse over the next two months. The changes announced today could stifle these threats to an extent, but only if Facebook acts before it’s too late.