Deepfake technology is degrading women
Illustration: Marianne Wilson

Deepfakes are getting worse, and most are aimed at women

A new report has revealed that 96 per cent of AI-created deepfake videos involve female celebrities being edited into porn

Although deepfake technology poses a threat to everyone, it’s becoming particularly dangerous for women. From generating realistic nudes to creating revenge porn, these AI-generated videos are increasingly being used for evil. Now, a new report has confirmed what we already suspected: women are the victims of the majority of deepfakes.

According to a report by Deeptrace Labs, 96 per cent of all deepfake videos feature women, most of whom are celebrities being edited into porn without their consent. When it comes to the top four websites dedicated to hosting this type of sexually explicit video, 100 per cent of the subjects are women, with around 850 individuals targeted. These websites received 134 million views in total, which shows the shocking reach of the altered videos.

Quoted in the study, professor and author Danielle Citron said: “Deepfake technology is being weaponised against women by inserting their faces into porn. It is terrifying, embarrassing, demeaning, and silencing. Deepfake sex videos say to individuals that their bodies are not their own, and can make it difficult to stay online, get or keep a job, and feel safe.”

As well as on dedicated deepfake porn sites, 802 of the fake videos were hosted on mainstream adult websites, meaning visitors may not realise the footage they’re watching is doctored. British actresses are targeted the most, followed by South Korean musicians and American actresses.

Despite women making up the majority of deepfake porn, the subjects of AI-created YouTube videos are 61 per cent male, with the content commentary-based rather than pornographic. The lab also revealed that the number of deepfakes online has almost doubled over the past year, jumping from 7,964 in 2018 to 14,678 today.

While deepfake technology is widely used against politicians, public figures, and actors, it clearly poses an enormous threat to female celebrities, who likely have no idea that altered pornographic footage featuring their faces is being shared online. As the software already exists, there isn’t much that can be done to stop these videos emerging. As artificial intelligence lecturer Haerin Shin told Dazed in June, “there’s no way of controlling it, so it’s (now about) how we implement equally effective measures to prevent (the technology’s) misuse”.