Instagram announced on Thursday that it would no longer allow graphic images of self-harm, such as cutting, on its platform. The change appears to be in response to public attention to how the social network might have influenced a 14-year-old’s suicide.
In a statement explaining the change, Adam Mosseri, the head of Instagram, made a distinction between graphic images about self-harm and nongraphic images, such as photos of healed scars. Those types of images will still be allowed, but Instagram will make them more difficult to find by excluding them from search results, hashtags and recommended content.
Facebook, which acquired Instagram in 2012 and is applying the changes to its own site, suggested in a separate statement that the changes were in direct response to the story of Molly Russell, a British teenager who killed herself in 2017.
Molly’s father, Ian Russell, has said publicly in recent weeks that he believes that content on Instagram related to self-harm, depression and suicide contributed to his daughter’s death.
Mr. Russell has said in interviews with the British news media that after Molly’s death, he discovered that she had followed accounts that posted this sort of “fatalistic” messaging.
“She had quite a lot of such content,” Mr. Russell told the BBC. “Some of that content seemed to be quite positive. Perhaps groups of people who were trying to help each other out, find ways to remain positive.”
“But some of that content is shocking in that it encourages self-harm, it links self-harm to suicide,” he said.
Mr. Mosseri said in the statement that the company consulted suicide experts from around the world in making the decision. In doing so, he said the company concluded that while graphic content about self-harm could unintentionally promote it, removing nongraphic content could “stigmatize or isolate people who are in distress.”
“I might have an image of a scar, where I say, ‘I’m 30 days clean,’ and that’s an important way for me to share my story,” he said in an interview with the BBC. “That kind of content can still live on the site.”
The changes will “take some time” to put in place, he added.
Daniel J. Reidenberg, the executive director of the suicide prevention group Save.org, said that he had helped advise Facebook on the decision over the past week or so and that he applauded the company for taking the problem seriously.
Mr. Reidenberg said that because the company was now drawing a nuanced distinction between graphic and nongraphic content, it would need extensive moderation to decide which images crossed the line. Because the topic is so sensitive, artificial intelligence alone probably will not suffice, he said.
“You might have someone who has 150 scars that are healed up — it still gets to be pretty graphic,” he said in an interview. “This is all going to take humans.”
In Instagram’s statement, Mr. Mosseri said the site would continue to consult experts on other strategies for minimizing the potentially harmful effects of such content, including the use of a “sensitivity screen” that would blur nongraphic images related to self-harm.
He said Instagram was also exploring ways to direct users who are searching for and posting about self-harm to organizations that can provide help.
This is not the first time Facebook has had to grapple with how to handle threats of suicide on its site. In early 2017, several people live-streamed their suicides on Facebook, prompting the social network to ramp up its suicide prevention program. More recently, Facebook has used algorithms and user reports to flag possible suicide threats to local police agencies.
April C. Foreman, a psychologist and a member of the American Association of Suicidology’s board, said in an interview that there was not a large body of research indicating that barring graphic images of self-harm would be effective in alleviating suicide risk.
Suicide is the second-leading cause of death among people ages 15 to 29 worldwide, according to the World Health Organization. And it was a problem among young people even before the rise of social media, Ms. Foreman said.
While Ms. Foreman said she appreciated Facebook’s work on the issue, she added that Thursday’s decision seemed to be an attempt to provide a simple answer in the middle of a “moral panic” over social media’s role in youth suicide.
“We’re doing things that feel good and look good instead of doing things that are effective,” she said. “It’s more about making a statement about suicide than doing something that we know will help the rates.”
[If you are having thoughts of suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 (TALK) or go to SpeakingOfSuicide.com/resources for a list of additional resources.]