Instagram fails to remove graphic self-harm images following complaints

The social media site received backlash when a father claimed that Instagram was partly responsible for his daughter's suicide.

Social media giant Instagram has come under criticism after it failed to remove graphic images depicting self-harm from its site.

Sky News set up an account on Instagram, and found that the social media platform continued to host graphic content with hundreds of disturbing posts, despite promising in a statement that there would be "no graphic self-harm or graphic suicide related content on Instagram".

Some of the videos were found to glamourise suicide and offer instruction on how to do it, while certain hashtags carried no warning about the nature of the content.

Instagram attempted to ban this type of content following the death of Molly Russell, a 14-year-old girl who died by suicide in 2017.

Her father, Ian Russell, said that certain images on Instagram "helped kill her", and the family's subsequent campaign pushed the social media company to make the aforementioned statement.


Responding to the recent findings, a spokesperson for Instagram told Sky News that nothing was more important to the company than keeping the people who use Instagram safe.

The company has also removed the graphic content identified in the investigation.

Get immediate support from Pieta House - Freecall 1800 247 247 or you can simply text HELP to 51444.