While consenting adults should be free to do what they want, so long as they're not hurting anyone, sexting becomes far more problematic when underage kids are involved. The picture grows more concerning still given that an estimated 61 percent of "sexters" who sent nude images admit they felt pressured to do so at least once.
Fortunately, artificial intelligence is here to help.
The A.I. solution, called Gallery Guardian and designed "by parents for parents," is a new mobile app for iOS and Android that uses image recognition to protect children from taking or receiving inappropriate images and videos through social media apps. It also aims to open up a dialogue by alerting parents, who can pair their phones with those belonging to their kids. The parents don't get to view the offending content, but the alert system gives them a heads-up so they can intervene. The app additionally offers resources to help teach parents how to approach a conversation that could be incredibly difficult for all parties involved.
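The detect-and-alert flow can be sketched roughly as follows. This is a hypothetical illustration, not Gallery Guardian's actual code: the classifier, the confidence threshold, and the `notify_parent` payload are all assumptions. The one detail it does take from the app's description is that the paired parent device receives only a notification, never the flagged image itself.

```python
# Hypothetical sketch of a paired-device alert loop (not the real app's code).

THRESHOLD = 0.85  # assumed confidence cutoff for "inappropriate"


def classify_image(image_bytes: bytes) -> float:
    """Stand-in for the real image-recognition model.

    Returns a probability that the image is inappropriate; a real
    implementation would run a trained classifier here.
    """
    return 0.0


def notify_parent(paired_device_id: str, child_name: str) -> dict:
    """Build an alert payload. Note: no image data is ever included."""
    return {
        "device": paired_device_id,
        "message": f"Potentially inappropriate image detected on {child_name}'s phone.",
    }


def scan(image_bytes: bytes, paired_device_id: str, child_name: str):
    """Score an image and, if it crosses the threshold, alert the parent."""
    score = classify_image(image_bytes)
    if score >= THRESHOLD:
        return notify_parent(paired_device_id, child_name)
    return None  # below threshold: no alert, nothing stored
```

The key design point is that the alert carries metadata only, which matches the app's promise that parents are warned without being shown the content.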
“We built a library of pornography to train our algorithm,” Daniel Skowronski, founder and CEO of developer YIPO Technologies, told Digital Trends. “To improve the detection, we also leveraged data sets that were mostly ‘selfie’ style images, since it’s the most common photo style taken with smartphones.”
One potential issue with the concept is that the image processing is not done locally on the device. Skowronski says the goal is to move it there eventually, but the processing power needed to run the algorithm while keeping detection accuracy high would drain the average smartphone's battery too quickly. However, he notes that the company uses the "tightest encryption available on the market today to keep your data and your child's data safe," and that all data is deleted immediately after scanning has taken place.
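The scan-then-delete policy the company describes can be sketched in a few lines. Again, this is a hypothetical illustration of the general pattern, not YIPO's implementation; the `decrypt` and `classify` callables here are placeholders for whatever the real service uses. The point of the structure is that the decrypted image exists only for the duration of the scan.

```python
# Hypothetical sketch of a server-side scan-then-delete policy
# (not the real service's code).

def scan_and_delete(encrypted_blob: bytes, decrypt, classify) -> float:
    """Decrypt an uploaded image, score it, and discard it immediately.

    Nothing is written to disk; the decrypted data is dropped as soon
    as the classifier has produced a score, even if scoring fails.
    """
    image = decrypt(encrypted_blob)
    try:
        return classify(image)
    finally:
        del image  # decrypted data is released right after scanning
```

Keeping the deletion in a `finally` block means the decrypted image is released whether the classifier succeeds or raises, which is the property a "deleted immediately after scanning" promise depends on.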