
Teens are so obsessed with TikTok that it’s finally adding parental controls

In an effort to curb how much time younger people spend on TikTok, the popular app just rolled out new parental controls in the UK that allow adults to set how long their kids can use the video app. 

TikTok announced the new Family Safety Mode on Tuesday, February 19, in a blog post. The feature lets parents manage their kids' screen time, limit who can message them, and restrict content that may not be appropriate. 


Family Safety Mode links a parent's TikTok account with their child's account, so parents will need accounts of their own to use the feature. Digital Trends reached out to TikTok to find out when Family Safety Mode would be available to TikTok users in the U.S., and we'll update this story when we hear back. 

Since TikTok’s audience is mainly teens, the parental controls would directly affect the app’s largest user base. The new feature is meant to help the app’s younger users practice better “digital well-being.” 

“As part of our commitment to safety, the well-being of our users is incredibly important to us,” wrote Cormac Keenan, TikTok’s head of trust and safety, in the blog post. “We want people to have fun on TikTok, but it’s also important for our community to look after their well-being, which means having a healthy relationship with online apps and services.”


In a little over two years, TikTok has rapidly accumulated more than a billion users and more than 700 million downloads across the globe. TikTok allows users to take short videos of themselves, to which they can attach sounds or music. The app also offers other customization options such as filters, stickers, and special effects, making it popular among younger audiences. 

While people of all ages have used the app since it went viral, a majority of its users are younger than 18. TikTok has been in hot water before over its underage demographic, so it makes sense that the company is adding more parental controls. 

Last year, the Federal Trade Commission (FTC) hit TikTok with a $5.7 million fine after the app failed to get parental consent for users younger than 13. Besides illegally gathering information from underage users, the app allowed all users, including kids, to send direct messages and interact with comments on videos. 
