Facebook has reportedly removed one of the most popular Facebook groups associated with QAnon, the viral far-right conspiracy theory, for violating its community guidelines.
The group, Official Q/Qanon, had more than 200,000 members and was reportedly removed on August 4 for repeated community standards violations and for promoting fringe conspiracy theories that could lead to harm, according to Reuters.
The move follows Twitter’s decision in July to ban 7,000 QAnon-related accounts, with the company saying that messages shared by the group could potentially lead to harm.
A New York Times report last month said Facebook, Twitter, and other social media sites have been working together to take similar steps to moderate the growing popularity of QAnon content on their platforms, an effort to quell the spread of misinformation and the potential threat of physical, real-world violence, such as 2016’s “Pizzagate” shooting.
This is not the first time Facebook has taken action against the group. In May, the company said it took down 700 accounts and nearly 800 pages for coordinated inauthentic behavior and manipulating public debate, many of which were based in Russia and Iran.
This appears to be the first time Facebook has taken direct action against a group that has used the platform in a militant style to spread misinformation and hate. However, unlike Twitter, Facebook has yet to publicly announce the group’s ban. Even with Official Q/Qanon removed, the lack of an updated Facebook policy means most members will simply migrate to smaller groups on the site.
Thanks to social media, QAnon conspiracies are no longer fringe ideas. The group spawned from a rumor in 2017 about supposed efforts to undermine Trump. The “Q” in QAnon refers to a single person, or group, within the administration with access to confidential government information that is said to reveal a plot against the president. Since the start of the coronavirus pandemic, the group has latched onto the public health crisis and turned it into a political debate, discourse that spreads primarily through sites like Facebook and Twitter and often reaches viral or trending status, exposing the widely debunked rhetoric to millions of eyes.
Digital Trends has reached out to Facebook for comment and will update this story when we hear back.
Facebook’s decision to remove the Official Q/Qanon group from its platform is unlikely to extinguish the group’s existence or its prominence in popular culture. According to a Facebook statement to Reuters, the company is monitoring other QAnon groups as “it strengthens enforcement,” but without an outright ban on the movement, members of removed groups will merely form new ones.
Critics and experts have called QAnon members “really good at adapting” to online ecosystems, and several QAnon supporters are running for public office on platforms that echo the conspiracy theories shared within the group.
In recent months, Facebook has been on the defensive over its infamous “hands-off” approach to moderating content on its platform. When Black Lives Matter protests spread across the country, Facebook chose not to take action against President Donald Trump’s “when the looting starts, the shooting starts” post, even as other platforms did. The decision proved damaging: More than 200 advertisers announced a boycott of Facebook. One advertiser said it joined the campaign after its ad was placed next to a video touting QAnon conspiracies. Facebook addressed the concerns of the public and its advertisers with a plan to fight hate speech, but many said it was not enough.