The platform reportedly acted to stop the story going viral before notifying police
TikTok reportedly took three hours to notify police after a teenager's suicide was livestreamed on the platform. In February last year a 19-year-old vlogger living in Curitiba, Brazil, took his own life on a TikTok livestream after warning his fans a day earlier that he was planning a special performance.
Around 280 people were watching the stream when the teenager killed himself at 3:23pm on February 21, 2019. The broadcast continued to show his body for over an hour and a half before the video was taken down; during that time, users left nearly 500 comments and filed 15 complaints.
TikTok has faced criticism for taking steps to stop the post from going viral and being picked up by other news outlets before it notified the police in Brazil. The Intercept reports that TikTok officials only became aware of the incident at 5pm that same day, at which point they set in motion a PR plan to stop what had happened from making headlines.
Speaking to The Intercept, an employee of TikTok’s parent company ByteDance revealed that the company’s first move in the immediate aftermath was not to alert the authorities but to protect its image: internal orders were issued to closely monitor other platforms and make sure the story didn’t spread.
Another example of how SM platforms that are frequented by youth need to have an immediate action plan in place when it comes to users who are posting suicidal ideations or self-harm messages or videos. If proven true TikTok should be held accountable https://t.co/3IYYIaBEtB.
— Darren Laur - The White Hatter 🇨🇦 - M.O.M, M.G.C (@DarrenLaur) February 7, 2020
Since its worldwide launch in 2018, TikTok has been downloaded 1.5 billion times and is fast becoming one of the leading social media apps. Its focus on “viral hilarity” has made it one of the most accessible and influential platforms, especially among teenagers, but its methods for identifying harmful content are clearly lacking.
TikTok is not the only platform failing users through poor content moderation, though. Last year, the case of Molly Russell, a 14-year-old who killed herself in 2017 after viewing self-harm content on Instagram, prompted an outcry and calls for policy change. Molly’s father told the BBC at the time: “The big platforms really don't seem to be doing much about it.”
Instagram responded by saying it would redouble its efforts to remove harmful content from the platform, though its head, Adam Mosseri, said the changes would take time to fully implement.
Dazed reached out to TikTok about its own moderation policies but has yet to receive a response.
If you’re struggling with mental health issues, you can contact the suicide prevention specialists Samaritans here if you are based in the UK, or here if you are based in the US.