TikTok announced it would disclose its moderation and data practices and reveal its algorithm code to encourage competition — and took a jab at Facebook’s history of gobbling up competing apps.
TikTok CEO Kevin Mayer singled out the social network in a post published on Wednesday, saying that TikTok welcomes competition, but only if it’s fair.
“We think fair competition makes all of us better. To those who wish to launch competitive products, we say bring it on. Facebook is even launching another copycat product, Reels (tied to Instagram), after their other copycat Lasso failed quickly,” Mayer wrote. “But let’s focus our energies on fair and open competition in service of our consumers, rather than maligning attacks by our competitor – namely Facebook – disguised as patriotism and designed to put an end to our very presence in the U.S.”
The jab at Facebook comes hours before Facebook CEO Mark Zuckerberg is set to appear at a Big Tech hearing alongside the chief executives of Apple, Amazon, and Google, where lawmakers will examine whether the companies have broken antitrust laws through their acquisitions.
While TikTok hasn’t faced any antitrust concerns of its own, the app has dealt with threats of a ban in the U.S. amid scrutiny of its Chinese ownership. TikTok’s move to disclose its algorithms could be an attempt to head off such a ban.
“We will not wait for regulation to come, but instead TikTok has taken the first step by launching a Transparency and Accountability Center for moderation and data practices,” Mayer wrote. “Experts can observe our moderation policies in real-time, as well as examine the actual code that drives our algorithms. This puts us a step ahead of the industry, and we encourage others to follow suit.”
A TikTok spokesperson told Digital Trends that the Transparency and Accountability Centers will be located in Los Angeles and in Washington, D.C., but there’s no exact timeline for when they will open due to the coronavirus pandemic.
Facebook regularly publishes its own transparency reports, covering government requests for user data, content restrictions, and how the platform takes action against violating content. However, those reports reveal little about the actual code used to make those decisions.