The crackdown on AI-assisted fake porn is starting

Reddit and Gfycat have begun taking down hardcore videos that feature the faces of celebrities mapped onto porn performers' bodies by neural networks

It was widely reported at the end of last year that people were using AI programs to map another person's face – mostly celebrities, sometimes exes – onto porn stars without the women's consent. Many of the results look disturbingly lifelike.

Fake porn has been a problem for years, but it's only with more recent innovations that it's become sophisticated enough to cause widespread concern. A subreddit titled 'Deepfakes' has been one of the most active hubs for AI-assisted fake porn. Some of the first videos were created using a machine-learning algorithm and open-source code that was simple to follow. One user went further, creating and releasing an app that made it easier for others to produce their own clips. Since then, women including Scarlett Johansson and Katy Perry have had their images misused in hardcore porn footage.

As Gizmodo reports, some of the biggest sites hosting the porn videos and GIFs have begun removing the content. "I just noticed that (…) my upload yesterday was deleted, could be a copyright issue," one Redditor wrote.

Another added: “I don’t think it’s copyright since random Japanese idols with little to minimal presence in the West have been removed.”

Other Reddit users highlighted that even "botched" videos had been deleted from the host sites, and one called the removals a "targeted purge".

In a statement given to Dazed, a representative for Gfycat said: “Our terms of service allow us to remove content we find objectionable. We find this content objectionable and are actively removing it from our platform.”

The GIF-hosting site's terms of service don't directly prohibit fake porn or revenge porn, but they ban content that is "unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, libelous, invasive of another's privacy, hateful, or racially, ethnically or otherwise objectionable."

Reddit's terms of service, meanwhile, detail its blanket ban on "the posting of photographs, videos, or digital images of any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission." While neither policy names deepfakes explicitly, both sites could also lawfully remove the content under their copyright infringement policies.

One of the major problems in combating the issue is that no existing law squarely covers fake porn videos. It's also difficult to flag the content as a violation of privacy, as the bodies the faces are mapped onto are not the person's own.

The Next Web details how Redditors on Deepfakes are intent on defending the content. "The work that we create here in this community is not with malicious intent," one user wrote. "Quite the opposite. We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design. This technology is very new. And new scares people. A lot."

The community is already looking for other platforms to host the fake porn, with some users directing others to Russian sites like ok.ru.

As first reported by Motherboard, artificial intelligence researcher Alex Champandard said: "This is no longer rocket science." The user-friendly FakeApp – though producing one brief clip still takes eight to 12 hours of processing – has made it simpler to mass-produce the neural network-generated fake porn. In a short period of time, the original subreddit has attracted over 15,000 subscribers. How terrifying.