
How AI-assisted porn could use your selfies against you

Redditors are using AI software to create fake hardcore porn featuring the faces of celebrity women – in an era when we upload everything, it could put anyone at risk

It feels like not a day – or even an hour – goes by without a warning about the dangers and inherent threats of technology. Will there be a robot uprising? Could hackers start a nuclear war? Will sex robots replace intercourse between humans? Worrying, yes, but these concerns often seem so far-fetched and apocalyptic that it’s more comforting not to give them airtime. The very real and insidious implications of today’s available technology and artificial intelligence, however, are worth talking about.

Towards the end of 2017, Motherboard reported that a Redditor who goes by the name of ‘deepfakes’ had used AI-assisted tools to create a video of Gal Gadot having sex with her stepbrother. Except it’s not actually Gal Gadot: the footage depicts an incestuous scene with the actress’s face swapped onto the body of a porn performer by a machine-learning algorithm. Although the effect is far from perfect, it is convincing enough. And it poses a potential threat to just about anyone.

If ‘fake news’ was the phrase of 2017, then this latest manipulation of technology seems a fitting way to have ended it. Redditors, hackers, tech junkies – in essence, anyone with an interest in machine learning – have been forging photographs and videos to share online since the internet began, but the go-to tools have been programmes like Photoshop, which make the work time-consuming and laborious. AI algorithms are Photoshop gone nuclear.

The AI software used by ‘deepfakes’ is not only fairly simple, in that it can be self-taught, but it also automates the manipulation and speeds up the process. At a time when the truth is fast becoming a fraught concept and a war is being waged against experts, such software could produce persuasive fake imagery and videos capable of swaying elections, dismantling progressive social movements, and providing material for hate groups.

“In the context of revenge porn, a malicious form of online abuse, AI tools are an obvious risk factor. Most of us don’t think twice about uploading photographs of ourselves to social platforms”

In the context of revenge porn, a malicious form of online abuse, AI tools are an obvious risk factor. Most of us don’t think twice about uploading photographs of ourselves to social platforms, making sure we share that selfie when our outfit was looking fire, without considering any potential repercussions. ‘Deepfakes’ combines open-source AI software, including Google’s TensorFlow, with readily available stock images and photographs from Google search to create fake hardcore porn videos. He has previously pasted the faces of celebrities like Scarlett Johansson, Taylor Swift, and Maisie Williams onto X-rated clips.

And what’s to say it couldn’t happen to anyone with a collection of images online? Revenge porn is used to blackmail, harass and humiliate victims, and its authenticity – or, in this case, lack thereof – doesn’t necessarily lessen the damage. Children and teenagers, in particular, are vulnerable to this kind of abuse. Think back to playground rumours and MSN chat: it didn’t matter whether a rumour was true, only that it was sensational enough to share around. Research by the Cyberbullying Research Center shows that over 50 per cent of revenge porn victims have considered suicide, so any computer programme that creates an illusion of sexual content has the capacity for harm.

The Gal Gadot case also raises questions about the future of the porn industry and the agency of those who work within it. Keeping filmmakers and porn performers in business is already extremely difficult with the prevalence of free online porn, and AI-manipulated footage could offer economic solutions to an industry on its knees. Large numbers of actors working in porn may not be essential if computer software can manipulate, and even fabricate, sexual content. If Carrie Fisher can be digitally animated in the new Star Wars films, the possibilities are seemingly endless for production companies trying to cut financial corners while still making high volumes of pornography.

Of course, some purveyors of morality will argue this is a win. Women (and men) will no longer have to exploit themselves and sell their bodies in exchange for money, the narrative will inevitably go. Sex work is habitually painted as victimhood without agency, as though only a series of unfortunate events and circumstances could ever lead someone to pornography. But this is simply not the reality for many porn performers. The rights and dignity of workers in the porn industry should be protected, and swapping a celebrity’s face onto another woman’s body, as in the case of Gal Gadot, reduces women to less than their bodies. They are treated as mere vehicles for humour and technological fascination.

The problem is, what can be done to stop computer programming being used in this way? Right now, very little. We tend to engage in dialogues of ‘what ifs’, but it is already happening, and has been happening since the freedoms of the internet became so widely available. Online spaces and software advancements always carry both the potential for good and the potential for bad, and it’s a balancing act we have had to reckon with as they have grown. A few tech-savvy Redditors wanting to impress other users with explicit content is difficult to police, and heavy-handed policing of online spaces carries dangers of its own. Moral intent is not a given, and as we roll deeper into the millennium we can expect our faith in humanity to be tested time and time again.
