Instagram said it is banning graphic images of self-harm after objections were raised in Britain following the suicide of a teen whose father said the photo-sharing platform had contributed to her decision to take her own life.
Instagram chief Adam Mosseri said the platform is making a series of changes to its content rules "to keep the most vulnerable people who use Instagram safe."
"We need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right," he said in a statement on Thursday.
The company said it won't allow graphic images of self-harm. It also said it won't show non-graphic, self-harm-related content in search or through hashtags, nor will it recommend such content to its users.
However, Mosseri said the company won't ban non-graphic, self-harm content entirely because "we don't want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help."
The call for changes was backed by the British government after the family of 14-year-old Molly Russell found material related to depression and suicide on her Instagram account after her death in 2017.