“Why would I not tell you guys what perfume I wear? I think my favourite one I’m using now is probably Nyla by Arabiyat,” says Molly Mae, sitting casually on the floor in a TikTok video. In another, she records herself opening up a perfume cabinet better stocked than a fragrance department, explaining: “The one I’m currently using every day is Nyla by Arabiyat, but I left it at Tommy’s which is why it isn’t in here”. 

The endorsement by Molly Mae helped the perfume, produced by a Dubai-based fragrance house called My Perfumes, go viral. At £17.99 and available from TikTok Shop, it is considerably more affordable and accessible than most fragrances, particularly those worn by celebrities. But if that sounds too good to be true, it’s because it is: the videos were fake. In July, Molly Mae posted a video saying that her fans had been scammed into buying what they thought was her favourite perfume. Unbeknownst to them, they’d fallen for a deepfake – an AI version of Molly Mae made as a ploy to drive sales. And she wasn’t the only one made the face of the fragrance: AI versions of Rihanna were also created and circulated.

We have entered an age of AI models and digital influencers, whose uncanny faces are used to sell us everything from Guess jeans to Samsung phones. While some of these “models” are created to look unique, and others have permission to use a real person’s likeness – Vilma Sjöberg and Mathilda Gvarliani were among those who signed up to allow H&M to create their “digital twin” – many borrow from, or outright steal, people’s faces. Last year, Nassia Matsa wrote an article for Dazed about having her likeness recreated by AI to promote projects and brands she’d had no involvement in – and been given no compensation or credit for. And now it’s people’s whole identities that are being taken to falsely advertise products.

Habiba, a 21-year-old from Birmingham, recently purchased the Arabiyat perfume for £30, on the basis that it was Molly Mae’s favourite. After seeing multiple videos in her TikTok feed, curiosity got the better of her and she finally tried the star’s supposed signature scent. “I thought Molly said it [was her favourite] and I feel so silly for believing it,” she says. “I didn’t realise for weeks after that the video was AI.” While she says the scent is quite nice and long-lasting, it doesn’t stop her from feeling scammed. “I was truly influenced into buying it by believing that celebs wore it. I feel so silly for not doing any proper research.”

Habiba says she’ll think twice before using TikTok Shop in the future or believing celebrity endorsements on social media. “Everyone trusts celebrities as they’re our role models. Molly Mae is well known in the influencer world, which makes her reliable, but I’ll definitely be checking more comments and reviews before buying something next time,” she says. 

Amanda, a 55-year-old from Northamptonshire, also fell victim to the fake videos. “I was gutted after I found out it was AI,” she says. “Not only is it a huge waste of money, but I feel embarrassed to have fallen for it.” She says she will now consult her daughter for a second opinion before buying anything that’s endorsed by a celebrity in the future.

The origins of these celebrity deepfakes are murky. Rather than being posted by a brand or company’s official social media account, the majority are shared by other influencers, either as reposts or stitched into greenscreen videos, as a tool to promote the fragrance (TikTok Shop works by allowing users, no matter how big their following, to earn commission on the products they recommend). It’s unclear whether these people are aware that the videos they are sharing are fake, although many of the posts have comments pointing it out.

Nkaur was one of the many creators describing Nyla as Molly and Rihanna’s favourite perfume. While she didn’t respond to our request for comment on the videos, she has replied to comments under them with things like “still my best seller regardless” or “I don’t make the AI videos”. (Every TikTok seller who used Molly Mae to sell Nyla by Arabiyat refused to comment when asked if they were aware the videos were AI.)

One TikTok user wrote on Nkaur’s video, “You might be selling out ‘all day’ but is it really worth it if you get taken to court by her for using AI-made videos she’s not approved of with her face and voice on to sell products?” But to what extent these videos break the law is still being debated. In December 2024, the UK government began working on a new “right to personality”, which would help prevent any unauthorised AI-based use of a celebrity’s likeness or voice. The law is still in progress, but it aims to protect celebrities from having their identity stolen for commercial gain.

According to TikTok community guidelines, even if content is labelled as AI-generated, media showing a public figure endorsing or selling a commercial product is not allowed. However, videos that viewers don’t realise are AI-made often slip through the net and mislead people.

“Using AI versions of celebrities to sell products is risky. It often crosses the line into violating the right of publicity. That means using someone’s name, face, voice or image to make money without getting their permission,” says attorney and law expert William K. Holland. “Even if the AI version isn’t perfect, if people can tell who it is, that could be enough for a lawsuit. There’s also a risk of misleading the public. If a buyer thinks a celebrity supports the product because of the ad, that could lead to legal trouble under false advertising laws,” he says.

The Nyla perfume scam might not seem too harmful in the grand scheme of things, but it’s a cautionary tale about the potential mishaps we could be heading into. What happens when, instead of a fragrance, it’s a political party or belief being promoted? We are entering a world where technology will have the ability to make anyone say anything, and as viewers, we are failing to distinguish reality from AI. Until laws like the “right to personality” catch up with technology, we might have to pay the price of misinformation for a lot longer than we can afford to.