The social media platform’s head Adam Mosseri has admitted the app is ‘too slow’ in addressing harmful content
By now, it’s common knowledge that Instagram can bring out the worst in people. Whether that’s influencers posing at the site of the Chernobyl nuclear disaster for likes, gap yah rahs taking photos next to drugged tigers or tobacco advertisers targeting young people with paid content, things can get pretty dark. Even Madonna’s feeling its effects.
Fortunately, the social media platform is finally beginning to see the reality of its Frankenstein creation (namely, that people suck), with Instagram big boss Adam Mosseri telling Radio 1 Newsbeat that he hopes to crack down on bullying across the platform.
The social media head has admitted that he doesn’t want people to “get depressed” but that Instagram “can’t solve bullying on its own” or “stop people from saying nasty things”. “Bullying has existed for a long time, it has changed and evolved with the internet. Like many other issues, bullying is broader than just Instagram and I think that sometimes gets missed,” he told Radio 1.
While Mosseri concedes that criticism surrounding the app is uncomfortable to hear, he admits that Instagram is “too slow” in addressing harmful content and acknowledges that any feedback is “healthy” and necessary to bettering the experience of people using the app.
“Sometimes it’s not comfortable for us to be criticised and to have our mistakes aired in public but I think fundamentally it is a healthy dynamic,” he says. “Research – whether it’s coming from academics, regulators, politicians – we think it’s fundamentally a good thing.”
“We were under-focused on the downsides of connecting people. Technology is not good or bad, it just is,” he explained.
Earlier this year, Mosseri met the UK health secretary to discuss security and safety on Instagram, following the suicide of teenager Molly Russell, whose death was linked to her exposure to self-harm material on the app.
The social media head announced the platform would be adding sensitivity screens to blur images of self-harm and suicide to protect its users. Mosseri also said there would be a stronger focus on shutting down harmful content. Given the strict rules surrounding the platform’s censorship of nudity – which is arguably less harmful than images of self-harm – it’s about time.
“We prioritise different types of content problems more than others because not every piece of problematic content has the same risk,” he said. “So a piece of content talking about self-harm might be much more important to reach quickly than a piece of content that might just be nudity.”
While acknowledging the dangers of the app, Mosseri maintains that there are still many positives. “There’s lots of good that comes out of connecting people. When we started we were very focused on that good and I still believe in that,” he said. “Social media specifically is a great amplifier of the good and bad and so we need to try and do more and identify the bad.”
“I think there’s a limit for how much time you should do anything, it doesn’t matter if it’s TV, Instagram or exercise,” he concluded.