Facebook yanks large QAnon conspiracy group off platform

Facebook has reportedly removed one of the most popular groups on its platform associated with the viral, far-right QAnon conspiracy theory for violating its community guidelines.

The group, Official Q/Qanon, had more than 200,000 members and was reportedly removed on August 4 for community guideline violations and for promoting fringe conspiracy theories that could lead to harm, according to Reuters.

The move follows Twitter’s decision in July to ban 7,000 QAnon-related accounts, citing the potential for the messages shared by the group to lead to harm.

A New York Times report last month said Facebook, Twitter, and other social media sites have been working together to take similar steps to curb the growing popularity of QAnon content on each platform, an effort to quell the spread of misinformation and the potential threat of physical, real-world violence, such as 2016’s “Pizzagate” shooting.

This is not the first time Facebook has taken action against the group. In May, the company said it took down 700 accounts and nearly 800 pages for coordinated inauthentic behavior and manipulating public debate, many of which were based in Russia and Iran.

This appears to be the first time Facebook has taken direct action against a group that has used the platform in a militant fashion to spread misinformation and hate. However, unlike Twitter, Facebook has yet to publicly announce the group’s ban. Even with Official Q/Qanon removed, the lack of an updated Facebook policy means most of its members will simply turn to smaller groups on the site.

Thanks to social media, QAnon conspiracies are no longer fringe ideas. The movement spawned from a rumor in 2017 about supposed efforts to undermine Trump. The “Q” in QAnon refers to a single person, or group, within the administration said to have access to confidential government information revealing a plot against the president. Since the start of the coronavirus pandemic, the movement has latched onto the public health crisis and turned it into a political debate; that discourse is spread primarily through sites like Facebook and Twitter and often achieves viral or trending status, exposing the widely debunked rhetoric to millions of eyes.

Digital Trends has reached out to Facebook for comment and will update this story when we hear back.

Facebook’s decision to remove the Official Q/Qanon group from its platform is unlikely to extinguish the group’s existence or its prominence in popular culture. According to a Facebook statement to Reuters, the company is monitoring other QAnon groups as “it strengthens enforcement,” but without an outright ban, members of removed groups will merely form new ones.

Critics and experts have described QAnon members as “really good at adapting” to online ecosystems, and several QAnon supporters are running for public office on platforms that echo the conspiracy theories shared within the group.

In recent months, Facebook has been on the defensive over its infamous “hands-off” approach to moderating content on its platform. When Black Lives Matter protests spread across the country, Facebook chose not to take action against President Donald Trump’s “when the looting starts, the shooting starts” post, even as other platforms did. The decision proved damaging: Over 200 advertisers announced an advertising boycott of Facebook. One advertiser said it joined the campaign after its ad was placed next to a video touting QAnon conspiracies. Facebook addressed the concerns of the public and its advertisers with a plan to fight hate speech, but many said it was not enough.
