Teens are so obsessed with TikTok that it’s finally adding parental controls

In an effort to curb how much time younger people spend on TikTok, the popular app just rolled out new parental controls in the UK that allow adults to set how long their kids can use the video app. 

TikTok announced the new Family Safety Mode on Tuesday, February 19, in a blog post. The feature allows parents to control screen-time management, limit who can message their kids, and restrict certain content that may not be appropriate. 

Family Safety Mode links a parent’s TikTok account with their child’s account, so parents will need accounts of their own to use the feature. Digital Trends reached out to TikTok to find out when Family Safety Mode would be available to TikTok users in the U.S., and we’ll update this story when we hear back. 

Since TikTok’s audience is mainly teens, the parental controls will directly affect the app’s largest user base. The new feature is meant to help the app’s younger users practice better “digital well-being.” 

“As part of our commitment to safety, the well-being of our users is incredibly important to us,” wrote Cormac Keenan, TikTok’s head of trust and safety, in the blog post. “We want people to have fun on TikTok, but it’s also important for our community to look after their well-being, which means having a healthy relationship with online apps and services.”

In a little over two years, TikTok has rapidly accumulated more than a billion users and more than 700 million downloads across the globe. TikTok allows users to take short videos of themselves, to which they can attach sounds or music. The app also offers other customization options such as filters, stickers, and special effects, making it popular among younger audiences. 

While people of all ages have used the app since it went viral, a majority of its users are younger than 18. TikTok has been in hot water before because of its underage demographic, so it makes sense that the company is implementing even more parental controls. 

Last year, the Federal Trade Commission (FTC) hit TikTok with a $5.7 million fine after the app failed to get parental consent for users younger than 13. Besides illegally gathering information from underage users, the app allowed all users, including kids, to send direct messages and interact with comments on videos. 
