Illustration Florence Guan

The government might ban Instagram over self-harm content

There have been calls to ‘purge’ all suicide posts

Following on from the tragic death of Molly Russell, a 14-year-old who took her own life, the UK government is urging social media companies to crack down on harmful content.

The teenager had shown no obvious signs of mental health issues, so her parents were shocked by her death and began examining her social media activity. In a short BBC documentary, her father said he had found overwhelming evidence that she had been looking at pages depicting self-harm, and that Instagram had fostered a community around depression that was “fatalistic” rather than supportive.

Now, the family solicitor, Merry Varney, has taken aim at tech companies, saying that Russell’s case was an example of “how algorithms push negative material”, which could be “contributing to suicides and self-harm”.

Several studies have produced damning findings on social media’s damaging impact on our mental health, and apps like Instagram could now face a ban in the UK if they don’t make improvements.

Speaking on the BBC’s The Andrew Marr Show, health secretary Matt Hancock said: “If we think they need to do things they are refusing to do, then we can and we must legislate.”

He also called out various other sites in the Sunday Times, saying that “Pinterest has a huge amount to answer for”.

“Lots of parents feel powerless in the face of social media. But we are not powerless. Both government and social media providers have a duty to act,” he explained. Hancock has since sent a letter to Twitter, Snapchat, Pinterest, Apple, Google and Facebook to call for “urgent” changes to be made.

In response to the government’s calls to “purge” the platform of suicide posts, Instagram said they “don’t remove certain content” that could aid connection and recovery for those struggling with their mental health.

In 2016, Instagram worked with the National Suicide Prevention Lifeline and the National Eating Disorders Association to curate pop-ups that appear when users search for hashtags like #selfharm.

If a user posted or sought out harmful content, they would be sent a message that read: “Someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we’d like to help”. They would then be offered support options, including numbers for local helplines.

As much as social media can impact and exacerbate mental health issues, the health secretary’s response to teen mental health shouldn’t lay blame on tech firms alone, since he oversees the crucial services that should intervene in these cases.

A quarter of 14-year-old girls in the UK are self-harming, and with cuts to NHS mental health services, there are fewer ways to help. This has led to a crisis in how Child and Adolescent Mental Health Services are being run, with a shocking 60 per cent of young people who are referred going untreated. A Guardian investigation last year revealed a shortage of beds that saw some young people forced to travel hundreds of miles from home to receive treatment.

Read more about the complexities of Instagram and your mental wellbeing here.