TikTok, the app where users create their own karaoke-style music videos, is now facing a record $5.7 million fine after failing to get parental consent for users under age 13. The Federal Trade Commission (FTC) on Wednesday, February 27, said the fine is the largest civil penalty ever imposed for children’s privacy violations. In response, TikTok launched a separate kids app that limits users’ public data.
The Children’s Online Privacy Protection Act (COPPA) prevents websites and web-connected apps from gathering information from kids under age 13 without parental consent. But TikTok never got that consent, according to the FTC, and collected names, emails, phone numbers, biographies, and profile pictures of children. TikTok, formerly known as Musical.ly, made profiles public by default, though users could switch to a private account.
Besides illegally gathering information from underage users, the app allowed all users, including kids, to send direct messages and interact with comments on videos. The formal FTC complaint includes reports of adults contacting children through the app. A feature that was discontinued in October 2016 also showed users what other users were nearby.
The record-breaking $5.7 million fine serves as a warning to other platforms: asking users for their age — and then getting parental consent for anyone under age 13 — is a legal requirement not to be taken lightly. “The operators of Musical.ly — now known as TikTok — knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” FTC chairman Joe Simons said. “This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law.”
In addition to the fine, TikTok will also be removing all videos uploaded by users under age 13.
Users under age 13 will now be redirected to a separate app. The new app, TikTok says, doesn’t allow users to share personal information and has more limitations on content and interaction.
“While we’ve always seen TikTok as a place for everyone, we understand the concerns that arise around younger users,” the company wrote in a blog post. “In working with the FTC and in conjunction with today’s agreement, we’ve now implemented changes to accommodate younger US users in a limited, separate app experience that introduces additional safety and privacy protections designed specifically for this audience.”
TikTok has an estimated 200 million downloads, with around 65 million users in the U.S. The company says the app “allows users from all walks of life to be their authentic selves” on the platform designed for sharing lip-syncing videos.