Teenagers will soon face new restrictions on TikTok as the platform moves to ban beauty filters that dramatically alter their facial features. The changes, set to roll out in the coming weeks, will block under-18s from using filters that smooth skin, alter skin tone, or enlarge lips and eyes. While playful effects like cat ears or bunny noses won’t be affected, popular tools like the viral “Bold Glamour” filter will be caught by the ban.

The restrictions come amid growing concerns about how social media negatively affects self-image among younger users. Beauty filters, which create polished, flawless features, have been shown to exacerbate body image issues, especially in young girls. This concern is part of a wider cultural preoccupation with image, which has seen an influx of viral anti-ageing products and even led to teens hiring make-up artists for their first day of school. Such trends have already prompted actions like Sweden’s ban on children purchasing anti-ageing skincare products.

After TikTok announced the changes during a safety forum at its European headquarters in Dublin, the company’s child safety policy lead explained to The Guardian, “We’re aiming for a safety-first approach.” The platform also revealed plans to enhance its age verification systems, including trials of AI-based technology to identify users under 13, who are prohibited from using the app altogether. If successful, these measures could result in thousands of accounts being removed in the UK by the end of 2024.

TikTok’s decision comes as social media platforms face growing pressure from regulators over their influence on young people’s mental health. The UK’s upcoming Online Safety Act will enforce stricter oversight, with penalties for platforms that fail to protect younger users. Although TikTok already removes millions of underage accounts each quarter, critics have pointed to inconsistencies in its enforcement.

While beauty filters have become central to wider conversations about how social media influences self-esteem, the issue is much bigger. The NSPCC described the age protection move as “encouraging,” but “just the tip of the iceberg,” emphasising that platforms must take greater responsibility for the algorithms that promote harmful content to young users, The Guardian reported.

In a wider trend, other platforms are rushing to implement safeguarding measures. Roblox has introduced restrictions on explicit content for younger players, and Instagram has rolled out parental controls for teenage users to comply with tightening regulations.