TikTok reportedly took three hours to tell police of a livestream suicide

The platform reportedly acted to stop the story going viral before notifying police

February 7, 2020
Patrick Benjamin

TikTok reportedly took three hours to notify police after a teenager's suicide was livestreamed on the platform.

In February last year, a 19-year-old vlogger living in Curitiba, Brazil, took his own life on a TikTok livestream after warning his fans a day earlier that he was planning a special performance. Around 280 people watched the teenager kill himself on the stream at 3:23pm on February 21, 2019, and the video continued to show his body for over an hour and a half before it was taken down. During that time, users posted nearly 500 comments and 15 complaints.

TikTok has faced criticism because it took steps to stop the post going viral and being picked up by other news outlets before it notified the police in Brazil. The Intercept reports that TikTok officials only became aware of the incident at 5pm that same day, at which point they set in motion a PR plan to stop what had happened from making headlines.

An employee of TikTok’s parent company, ByteDance, told The Intercept that the company’s first action in the immediate aftermath was not to alert the authorities but to protect its image: internal orders were issued to closely monitor other platforms and make sure the story didn’t go viral.

“Another example of how SM platforms that are frequented by youth need to have an immediate action plan in place when it comes to users who are posting suicidal ideations or self-harm messages or videos. If proven true TikTok should be held accountable https://t.co/3IYYIaBEtB” — Darren Laur - The White Hatter 🇨🇦 - M.O.M, M.G.C (@DarrenLaur), February 7, 2020

Since its release in 2018, TikTok has been downloaded 1.5 billion times and is fast becoming one of the leading social media apps. Its focus on “viral hilarity” has made it one of the most usable and influential platforms, especially among teenagers, but its methods of identifying harmful content are clearly lacking.

TikTok is not the only platform failing users with poor content moderation, though. Last year, 14-year-old Molly Russell killed herself after viewing self-harm content on Instagram, prompting an outcry and calls for policy change. Molly’s father told the BBC at the time: “The big platforms really don't seem to be doing much about it.” Instagram responded by saying it would double its efforts to remove harmful content from the platform, but its CEO Adam Mosseri said that this would take time to fully implement.

Dazed reached out to TikTok about its own moderation policies but has yet to receive a response.

If you’re struggling with mental health issues and are based in the UK, you can contact the suicide prevention specialists Samaritans here, or if you are based in the US, here.