On World Suicide Prevention Day, Facebook announces it will ban self-harm images

Facebook has announced policy changes geared toward improving how the social media giant handles suicide and self-injury content.

Updates include a new Suicide Prevention page that features resources for those who need them, as well as for those who have a friend who is going through a tough time. The latest additions to Facebook’s Safety Center include resources from the National Suicide Prevention Lifeline, the Crisis Text Line, the Veterans/Military Crisis Line, and The Trevor Project, which helps LGBTQ youth.

Facebook also said it would no longer allow self-harm images and would make it harder for people to search for suicide- and self-harm-related content on its platform and on Instagram. Changes also include the addition of a sensitivity screen on photos that depict healed self-harm cuts. 

The updates coincide with World Suicide Prevention Day, which is dedicated to raising awareness about suicide prevention.

“Today, on World Suicide Prevention Day, we’re sharing an update on what we’ve learned and some of the steps we’ve taken in the past year, as well as additional actions we’re going to take, to keep people safe on our apps, especially those who are most vulnerable,” wrote Antigone Davis, the global head of safety at Facebook, in a blog post. 

Facebook said it encourages those experiencing these types of thoughts and feelings to connect with people they care about, and the platform acknowledged its role as a connector for such difficult conversations.

“Experts have told us that one of the most effective ways to prevent suicide is for people to hear from friends and family who care about them,” the announcement reads. “To help young people safely discuss topics like suicide, we’re enhancing our online resources by including Orygen’s #chatsafe guidelines in Facebook’s Safety Center and in resources on Instagram when someone searches for suicide or self-injury content.”

The platform is also taking a stand against eating disorders, prohibiting content on both Facebook and Instagram that glorifies them and providing resources on the topic as well.

These changes are being implemented after Facebook hosted five meetings in 2019 with experts from around the globe to discuss issues like how to deal with suicide notes posted on the platform. 

“The tools Facebook is rolling out aim both at people who are expressing suicidal thoughts and also guide concerned friends or family members to resources and alternatives and appropriate interventions,” wrote Anna Chandy, the chairperson of the Live Love Laugh Foundation, on Facebook’s Suicide Prevention Page. “People use Facebook widely, so there’s an opportunity to actually connect someone who is struggling to a person they have a relationship with. This is extremely important.”

Allison Matyus
Former Digital Trends Contributor