Facebook missed a troubling design flaw in its Messenger Kids app that allowed children to communicate with users who hadn’t been approved by their parents.
The social networking giant launched the app in 2017, touting it as a way for children under 13 to “safely video chat and message with family and friends.” Parents set up Messenger Kids by authorizing it through their own Facebook account and then selecting the users with whom they’re happy for their child to connect.
But this protection somehow wasn’t in place for group chats, The Verge discovered, meaning children were able to communicate with users who hadn’t been parent-approved.
Facebook said on Monday, July 22 that it has been sending out “thousands” of alerts to parents over the past week, explaining that it’s aware of the security flaw and has closed down affected group chats.
“We recently notified some parents of Messenger Kids account users about a technical error that we detected affecting a small number of group chats,” a Facebook spokesperson told Digital Trends. “We turned off the affected chats and provided parents with additional resources on Messenger Kids and online safety.”
Messenger Kids only lets children select from approved users for one-to-one chats. They can also join a group chat started by an approved user. But until Facebook spotted the bug last week, that group chat could include other individuals who had been approved by the parent of the child who started the group, but not by the parents of the other children joining it.
In such cases, children could have been talking to someone in a chat group whom the parent knew nothing about. While everyone in the group should have been approved by someone, the flaw is likely to concern many parents as the app’s security was not as tight as Facebook had claimed.
Facebook has yet to offer more detailed information about the incident, including how long the bug was present in the app.
The unsettling revelation comes in the same week that the U.S. Federal Trade Commission is expected to announce a colossal $5 billion settlement with Facebook over its handling of data belonging to 87 million users caught up in the Cambridge Analytica privacy scandal. Whether there will be any fallout in response to this latest slip-up remains to be seen.