Artificial intelligence can pick up on patterns, including behavior that could indicate someone is considering suicide. On Monday, November 27, Facebook shared additional measures the platform is integrating to help users who may be expressing thoughts of suicide. The update includes A.I. that detects posts with potential signs of suicidal intent, along with more human review staff dedicated to the task.
Over the last month, Facebook’s updated algorithms have prompted more than 100 calls to first responders for wellness checks, and that figure doesn’t include posts flagged by users. Facebook is now beginning to roll out the A.I. suicide detection system to countries outside the U.S. The social media platform says the program will eventually be worldwide, with the exception of the European Union because of privacy laws.
The new algorithms use pattern recognition to flag posts that could express thoughts of suicide, Facebook says. Along with flagging written posts, the technology also looks for signals in live and prerecorded video. The program uses data inside the post itself as well as in the comments; Facebook says that comments like “are you OK?” or “can I help?” are often strong indicators.
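Facebook has not published its model, but the idea of treating concerned comments as signals can be illustrated with a toy scorer. Everything here, from the phrase list to the normalization, is an assumption for demonstration only:

```python
# Illustrative sketch only -- not Facebook's actual system. Concerned
# comments such as "are you OK?" or "can I help?" act as signals; the
# phrase list and scoring below are made up for demonstration.

CONCERN_PHRASES = ["are you ok", "can i help", "please talk to someone"]

def concern_score(post_text: str, comments: list[str]) -> float:
    """Return a 0-1 score estimating how strongly a post's comments
    suggest that friends are worried about the poster."""
    hits = sum(
        1
        for comment in comments
        for phrase in CONCERN_PHRASES
        if phrase in comment.lower()
    )
    # Normalize by comment count so long threads aren't over-weighted.
    return min(1.0, hits / max(1, len(comments)))
```

A real system would combine many weak signals (post text, video, reaction patterns) in a trained classifier rather than matching a fixed phrase list.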
Facebook said it will continue developing the algorithm to reduce false positives before posts reach the human review team. At the same time, artificial intelligence is also being used to prioritize which posts human reviewers see first, which Facebook says is improving its ability to alert first responders quickly.
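The prioritization step can be sketched with a simple heap-based queue; this is a minimal illustration of the general technique, not Facebook's implementation, and the risk scores and post IDs are hypothetical:

```python
# Minimal sketch of triaging flagged posts so human reviewers see the
# highest-risk items first. Scores and IDs are hypothetical.
import heapq

def prioritize(flagged_posts: list[tuple[float, str]]) -> list[str]:
    """Given (risk_score, post_id) pairs, return post IDs ordered
    from highest risk to lowest."""
    # heapq is a min-heap, so negate scores to pop the largest first.
    heap = [(-score, post_id) for score, post_id in flagged_posts]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]
```

In practice such a queue would be updated continuously as new posts are flagged, rather than sorted in one batch.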
Updates also help the review staff assess posts quickly, Facebook says, including tools that show reviewers which portion of a video received the most reactions. Automation also helps the team quickly pull up information for contacting the appropriate first responders.
The update continues to expand the measures Facebook already had in place, including tools for friends to report a post or reach out to the user, worldwide teams working on those reports, and tools developed in collaboration with organizations focused on mental health. Facebook has added a number of suicide prevention tools over the years, including an update earlier in 2017 that brought live chat with groups like the National Suicide Prevention Lifeline.