Science & Tech / News

Deepfakes are getting worse, and most are aimed at women

A new report has revealed that 96 per cent of AI-created deepfake videos involve female celebrities being edited into porn

October 8, 2019
Text Brit Dawson
Illustration Marianne Wilson

Although deepfake technology poses a threat to everyone, it’s becoming particularly dangerous for women. From generating realistic nudes to creating revenge porn, the artificial intelligence-created videos are increasingly being used for evil. Now, a new report has confirmed what we already suspected: women are the victims of the majority of deepfakes.

According to a report by Deeptrace Labs, 96 per cent of all deepfake videos feature women, most of whom are celebrities being edited into porn without their consent. On the top four websites dedicated to hosting this type of sexually explicit video, 100 per cent of the subjects are women, with around 850 individuals targeted. These websites have received 134 million views in total, which shows the shocking reach of the altered videos.

Quoted in the study, professor and author Danielle Citron said: “Deepfake technology is being weaponised against women by inserting their faces into porn. It is terrifying, embarrassing, demeaning, and silencing. Deepfake sex videos say to individuals that their bodies are not their own, and can make it difficult to stay online, get or keep a job, and feel safe.”

[Image: Katy Perry deepfake, via Reddit]

As well as appearing on dedicated deepfake porn sites, 802 of the fake videos were hosted on mainstream adult websites, meaning visitors may not realise the footage they’re watching is doctored. British actresses are targeted the most, followed by South Korean musicians, and then American actresses. Although women make up the majority of deepfake porn subjects, 61 per cent of the subjects of AI-created YouTube videos are male, with that content being commentary-based rather than pornographic.
The lab also revealed that the number of deepfakes online has almost doubled over the past year, jumping from 7,964 in 2018 to 14,678 today. While deepfake technology is widely used against politicians, public figures, and actors, it clearly poses an enormous threat to female celebrities, who likely have no idea that altered pornographic footage featuring their faces is being shared online.

As the software already exists, there isn’t much that can be done to stop these videos emerging. As artificial intelligence lecturer Haerin Shin told Dazed in June, “there’s no way of controlling it, so it’s (now about) how we implement equally effective measures to prevent (the technology’s) misuse”.