Federal investigation into child sexual abuse targets TikTok

The U.S. Department of Homeland Security has reportedly launched an investigation into TikTok over how the platform handles content depicting child sexual abuse and the moderation controls it has put in place. The agency is examining how a feature called “Only Me” was allegedly abused to share such content, something the Financial Times claims to have verified in partnership with child safety groups and law enforcement officials.

The Only Me feature lets users save their TikTok videos without posting them publicly. Once a video is designated Only Me, it can be seen only by the account’s owner. According to the report, credentials for accounts sharing child sexual abuse material (CSAM) were passed among bad actors. As a result, the abusive videos never reached the public domain and evaded detection by TikTok’s moderation systems.


TikTok is no stranger to the problem

This is not the first serious probe of its kind into TikTok. The number of Department of Homeland Security investigations into the spread of child exploitation content on TikTok reportedly increased sevenfold between 2019 and 2021. And despite bold promises of strict policy enforcement and punitive action against abusive content depicting children, bad actors apparently continue to thrive on the platform.

“TikTok talks constantly about the success of their artificial intelligence, but a clearly naked child is slipping through it,” child safety activist Seara Adair was quoted as saying. Notably, the federal agency banned TikTok from all department-owned devices, including phones and computers, in March this year over data security concerns.

This also isn’t the first instance of TikTok hogging attention for the wrong reasons. Last month, a couple of former TikTok content moderators filed a lawsuit against the company, accusing it of not providing adequate support while they handled extreme content depicting “child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

A BBC investigation from 2019 revealed predators targeting children as young as nine years of age with sleazy comments and proposals. Elizabeth Denham, the U.K.’s information commissioner, launched a probe into TikTok the same year over the platform’s handling of personal data belonging to underage users. And given TikTok’s immense popularity among young users, deleting the app is not as straightforward a choice as quitting Facebook.

The risks are increasingly high, with media regulator Ofcom claiming that 16% of toddlers aged three to four consume TikTok content. According to the U.K.’s National Society for the Prevention of Cruelty to Children (NSPCC), online grooming crimes reached a record high in 2021, with children at particularly high risk. Even though Instagram and Snapchat are the preferred platforms for predators, reports of horrific child grooming on TikTok have surfaced online on multiple occasions in the past few years.

TikTok has lately enforced measures to keep its young user base safe. Last year, TikTok announced that strangers would no longer be able to contact accounts belonging to children below 16 years of age, and that those accounts would default to private. The short-video sharing platform also tightened restrictions on downloading videos posted by users under the age of 18. In addition, TikTok added resources to its platform last year to help sexual assault survivors, bringing in experts from the Rape, Abuse & Incest National Network (RAINN) and providing quick access to the National Sexual Assault Hotline.
