Pro-anorexia communities and explicit drug use are proliferating on the app – teen TikTokers explain why TikTok needs to do better to protect them
Just last month, TikTok announced the opening of its very first Transparency Centre, an LA location which will allow ‘outside experts’ the opportunity to take a look at the company’s moderation process. With reports that the platform banned pro-LGBTQ+ content and censored posts by creators deemed ‘ugly’, poor, or disabled, it shouldn’t come as much of a surprise to find the company eager to re-establish a relationship with its diverse community and the general public.
But how does the app’s ever-growing community of young users continue to navigate the often triggering content found online? After all, of TikTok’s impressive 800 million users, a substantial 41 per cent sit in the 16-24 age bracket. More commonly known as ‘Generation Z’, this community of young people is more likely to experience depression, self-harm, and poor body image compared to the generation that came before them. And, though ‘trigger warning’ is a phrase that hit the mainstream with feminist bloggers in the mid-2010s, it remains a tool that many social media users utilise to warn their followers of content that could provoke anxiety, panic, or self-harm.

Lindsey is a bubbly 16-year-old TikToker from North Carolina who regularly struggles with triggering content on the app. She lives with autism and anxiety, and has suffered from an eating disorder from a young age: “I used to use Tumblr a lot, but because of the toxicity of its ‘pro ana’ content I actually left the website,” she tells Dazed over FaceTime. “I’m sad to see an influx of this now transfer onto TikTok.” She describes what she sees as an ‘eating disorder community’ forming on the app, and doesn’t believe TikTok has done enough to curb it.
“It’s mainly people in recovery,” she continues, “they’ll be posting ‘what I eat in a day’ videos and their food content will very clearly not be enough. When you have an eating disorder, you know that it’s competitive. You know that by posting that, people are going to see it and want to be at the level you’re at. So it’s one thing to come out and say ‘hey, I need support’, and another thing to tell people that you’ve eaten three almonds and two grapes today.”
“I used to use Tumblr a lot, but because of the toxicity of its ‘pro ana’ content I actually left the website... I’m sad to see an influx of this now transfer onto TikTok”
Lindsey recounts a recent encounter with some disturbing content, related to the TikTok trend that features the Beach Bunny song “Prom Queen”. “The song basically goes, ‘shut up, count your calories, I’ve never looked good in mom jeans’, and people started pointing out their body insecurities to the song. But the song is legit about calorie counting and eating disorders, so whenever I heard that song I would automatically start to freak out.” Seeing this trend all over her feed contributed to a recent eating disorder relapse that put her back in therapy.
In the UK, 17-year-old Amy* and 16-year-old Hannah have both felt pressured to stay on the app because of friends, despite their discomfort with viewing triggering content that hasn’t been signposted.
Amy struggles with PTSD and a severe anxiety disorder, often finding herself specifically triggered by content that alludes to her past experience with drugs. She considers the ‘glamourisation’ of drug use to be the driving force behind such content. “People will be filming themselves on drugs at a rave with the caption ‘sorry mum’ or tag their videos ‘#skinslife’ and tell us how they rely on drugs to keep them happy,” Amy says. She talks about comedy skits like ‘when the pill hits’ videos, and recounts the time a video of a TikToker describing a bad acid trip threw her into a major anxiety attack.

Similarly, Hannah has found scrolling through TikTok worsens her anxiety and “causes a downward spiral of obsessive thoughts and fears which has led to a panic attack”. While she appreciates that many users are grateful for content that raises awareness and recognition of mental health issues, she says a significant number of mental health-related videos she has watched do more harm than good for those going through it and, in some cases, they are “romanticising and glamorising mental health issues”.
But does the problem lie with TikTok or with the concept of trigger warnings themselves? In a statement provided to Dazed, a TikTok spokesperson refers to its community guidelines to explain the moderation process.
“Protecting the wellbeing of our community is extremely important to us, and we take this responsibility seriously,” the spokesperson says. “Our Community Guidelines make clear what is not acceptable on TikTok and we use both technologies and human moderation teams to identify, review and remove any dangerous content that violates these guidelines.
“We are constantly evolving our protective measures as part of our continuous commitment to maintain TikTok as a platform for safe and positive creative expression.”
“They’re censoring naked bodies but letting people promote the mental disorder with the highest mortality rate”
Searching for terms such as “eating disorder” and “pro ana” on TikTok offers up either no results or a pop-up page entitled “Need Help?”, which signposts users to Samaritans and advice on getting support from professionals.
Other social media platforms have made moves to protect their more vulnerable young users – following the suicide of a teen whose parents said she had engaged with self-harm content on Instagram, the platform implemented “sensitivity screens” as a barrier to suicide-related posts, and blocked hashtags like #proana and #selfharm.
Dr Daria Kuss, an expert in cyber psychology, gives Dazed her take on the issue of online trigger warnings. “They can be viewed as compassionate because they enable vulnerable individuals to mentally prepare themselves for content on social media, but it could be claimed that they’re also overprotective and can denote a limitation on freedom of speech.” She highlights that, even when trigger warnings may be present, “young individuals are curious, leaving them likely to act impulsively and on a whim. There’s also the thrill of engaging in things they should not be engaging in, adding to the motivations of viewing the content despite trigger warnings being provided.”

It may be fair to say that therein lies the problem. Despite the presence of trigger warnings, many young people are inclined to go ahead and view the content anyway. Dr Kuss suggests that the only way forward here is an open dialogue between parents, friends, and teachers to ensure that Gen Z know that “not everything that is available on TikTok is healthy or safe to replicate at home.” TikTok, for its part, currently offers an optional restricted mode, and its moderators are constantly looking to improve their approach to sensitive topics on the app.
Lindsey, Amy, and Hannah ultimately feel let down. Lindsey tells Dazed her wider friendship group has been affected by the aforementioned censoring of LGBTQ+ and ‘ugly’ creators: “It’s like, they’re censoring naked bodies but letting people promote the mental disorder with the highest mortality rate.”
So where does the app strike a balance between freedom of speech and fostering online spaces for mental health recovery, while also removing triggering content? As of this week, TikTok has announced plans to amp up its parental controls with a new feature that lets parents remotely set restrictions on their kids’ accounts. And hopefully, with the input of its creators and external feedback on its moderation process, the app has the potential to become a safer space for young users.
*Some names have been changed