Instagram mental health app, via Flickr

Instagram is launching ‘sensitivity screens’ to hide self-harm content

The app is trying to hide certain images from vulnerable young people following the suicide of 14-year-old Molly Russell

Instagram is planning to hide images of self-harm and suicide-related content with ‘sensitivity screens’, according to the social network’s head.

The screens will blur any potentially harmful pictures, which will only be shown if a user chooses to view them. This new move, announced by Instagram head Adam Mosseri, is part of a set of changes intended to protect users' mental health. It follows the death of Molly Russell, a 14-year-old British girl, whose parents believe she took her own life after engaging with self-harm content on Instagram and Pinterest.

Mosseri, who took over as head of the app in 2018, also plans to meet with UK health secretary Matt Hancock. Writing for the Telegraph, Mosseri says that Instagram intends to bring in engineers and trained content reviewers and moderators to combat harmful imagery. He hopes the app will also “better support people who post images indicating they might be struggling with self-harm or suicide”.

Instagram already blocks hashtags, dedicated accounts and focused searches related to self-harm and suicide. Mosseri added that Instagram is working with charities and organisations like Samaritans and Papyrus to better support any users who are struggling. In 2016, Instagram brought in the option for users to disable comments on their own pictures, and also introduced tools such as supportive pop-ups for people searching for offensive content, with help from the National Suicide Prevention Lifeline and the National Eating Disorder Association.

In a recent BBC documentary, Molly Russell’s father said that he found evidence that she had been looking at pages that posted self-harm, and that Instagram had fostered a community around depression that was “fatalistic” rather than supportive. 

UK health secretary Matt Hancock last week warned social media companies including Facebook, Twitter and Google that measures must be put in place to protect young people from offensive and damaging content online. He warned he would use the law to enforce it. 

A study previously showed that a quarter of 14-year-old girls in the UK are self-harming, while 60 per cent of young people referred to UK mental health services are left untreated. The strain on UK-wide mental health services continues.

You can read back on our investigation into Instagram and its relation to mental health here.