QAnon, the far-right conspiracy group responsible for #PizzaGate and, more recently, a conspiracy falsely accusing retailer Wayfair of engaging in child trafficking, has found a home on the fastest-growing social media network in the world: TikTok.
Since December, the hashtag #QAnon has grown from 3 million views to more than 83 million views. TikTok took action on Thursday to make the conspiracy group’s content harder to find by disabling the hashtag on the platform, but it didn’t ban any accounts outright.
QAnon originated on the 4chan message boards in October 2017, but its theories have since come out of anonymity and into the mainstream. The group’s largely older supporters believe in a “deep state” plot against President Donald Trump, and have more recently latched onto the public health crisis surrounding the coronavirus pandemic, calling COVID-19 a “bioweapon.” Followers ardently believe dispatches from a mysterious figure known as “Q” — purporting to be a person or group within the Trump Administration — which have appeared on 4chan and later 8chan.
Because of social media, QAnon is no longer a fringe movement, and TikTok teens have helped usher its rhetoric out of the shadows of the dark web.
In the past few weeks, TikTok teens resurrected the PizzaGate conspiracy theory, which began circulating during the 2016 presidential election on the baseless claim that Democratic hopeful Hillary Clinton was running a sex-trafficking ring out of a Washington, D.C., pizzeria. The conspiracy theory led to real-world violence when a man fired a gun inside the restaurant’s closet hoping to find victims of child abuse.
But on TikTok, users aggressively analyzed the actions of celebrities like Justin Bieber, insisting that gestures during livestream videos were clues that he was entangled in the conspiracy — causing the hashtag #PizzaGate to swell to over 80 million views before also being banned.
In July, a new conspiracy accusing furniture company Wayfair of trafficking children through pricey industrial cabinet listings took off on TikTok, and for a time trended high on its “For You” page. The surge in popularity for #WayFairGate was fueled by mainstream influencers, propelling the hashtag to more than 2.5 million views in just a few days. It wasn’t until TikTok received public criticism that it removed the hashtag and added an in-app reminder of its community guidelines.
Digital Trends reached out to TikTok for comment for this story but did not receive a reply. We will update this story when we hear back.
There is no way to tell how many QAnon-related accounts there are on TikTok, nor is there a way to gauge how widespread QAnon theories are on the app — especially since the company has now banned its popular hashtags from being searchable. But that doesn’t mean QAnon content and personalities are especially difficult to find.
Digital Trends found more than two dozen accounts touting QAnon conspiracy theories and hashtags with tens of thousands of views and followers. Users who promote QAnon-related content and hashtags are also recommended by the app’s “Suggested Accounts” tab. Since TikTok disabled the #QAnon and #wwg1wga tags, users have adopted the #q hashtag to share content.
Joseph Uscinski, a professor at the University of Miami and co-author of American Conspiracy Theories, acknowledged that QAnon content has indeed “blown up on TikTok,” but doesn’t believe its ability to go viral every so often is winning the group any new followers.
“I’ve been polling the public about QAnon for the last two years and found the beliefs in it are not increasing at all,” Uscinski told Digital Trends. “Most polls are finding that people still don’t know what it is, and it makes me wonder if this activity is coordinated bots or people just joking around as a fun meme.”
“Just because a hashtag is trending doesn’t indicate to me that people believe it,” he added.
Uscinski said what makes QAnon different from other internet conspiracy theories is that QAnon members have a group mentality and hierarchical structure — part of the reason Twitter recently banned 7,000 QAnon-related accounts was their ability to organize coordinated attacks and spread misinformation widely. Yet the strategies adopted by QAnon, and its quick embrace of TikTok, do not suggest that any large-scale conversions are taking place, according to Uscinski.
“They see themselves as part of a group, they take pledges online and call themselves digital soldiers,” he said. “But the support for ideas like QAnon are more deep than they are wide.”