Instagram said on Sunday, October 27 that it is banning fictional depictions of self-harm or suicide from its platform, including drawings, memes, and content from films or comics that use graphic imagery. It will also remove posts containing material associated with self-harm or suicide even if they don’t show such imagery directly.
The move follows the company’s decision in February 2019 to prohibit the uploading of graphic photos of self-harm to its platform.
Instagram has come under increasing pressure to deal with such imagery following a high-profile case in the U.K. involving 14-year-old Molly Russell, who killed herself in 2017 after viewing graphic material on the site. Her father, Ian Russell, believes the platform is at least partly responsible for her death.
In a message posted on Sunday, Instagram boss Adam Mosseri described suicide and self-harm as “difficult and complex topics,” but added that “there are many opinions about how best to approach them.”
Mosseri continued: “The tragic reality is that some young people are influenced in a negative way by what they see online, and as a result they might hurt themselves. This is a real risk. But at the same time, there are many young people who are coming online to get support with the struggles they’re having — like those sharing healed scars or talking about their recovery from an eating disorder. Often these online support networks are the only way to find other people who have shared their experiences.”
He said that after seeking advice from academics and mental health organizations, Instagram was seeking to strike “the difficult balance between allowing people to share their mental health experiences while also protecting others from being exposed to potentially harmful content.”
Mosseri acknowledged that there is still much work to be done in the area, but noted that in the three months following its initial policy change in February 2019, Instagram took action on more than 834,000 pieces of content, either removing them, reducing their visibility, or adding “sensitivity screens” that blur images, and was able to locate more than 77% of that content before it was reported by users.
Speaking to the BBC, Ian Russell, who has been campaigning for Instagram to be more robust in the way it handles sensitive content, described the platform’s latest move to ban fictional depictions of self-harm and suicide as a “sincere” effort, but said he wanted to be sure the company would deliver on its promises as it continues to tackle the issue.
Facebook, which owns Instagram and has taken similar action to deal with self-harm and suicide-related imagery, recently used World Mental Health Day to launch new Messenger filters and stickers aimed at encouraging conversations and support for those in need of help.