‘Deepfakes’ are being weaponised to silence women – as the technology accelerates, can our lawmakers move fast enough to protect us?
What does the advancement of AI mean for the future of the arts, music, and fashion? Will robots come for our creative industries? Could a machine ever dream like a human can? This week on Dazed, with our new campaign AGE OF AI, we’re aiming to find out.
To this day, Noelle Martin doesn’t know who did it – or why. For the last six years, pornographic pictures of the 24-year-old Australian have been flooding the internet. In the last year, these photos have been accompanied by videos. There’s one of Noelle having sexual intercourse, and another of her performing oral sex. In both of these clips, Noelle is looking straight at the camera, her facial features clearly visible and identifiable. Yet neither of these videos is real.
“Never in my wildest imagination did I think that this (was) something people did or I could ever be a target of something like this,” says Noelle, who is a victim of deepfake technology. A portmanteau of “deep learning” and “fake”, deepfakes are faceswap videos generated by AI, usually via the program FakeApp.
When the technology was first popularised in late 2017, many speculated it could cause political upheaval. So far, however, the deepfakes online are predominantly of celebrities edited into pornographic videos, and new fears have arisen that easily accessible deepfake apps could be used to generate revenge porn.
“They have done it of me, and I am not a celebrity, I am literally just an ordinary woman,” Noelle says. “You can have a photo of yourself on LinkedIn, and if someone wants to misrepresent your whole life… your employability, your reputation… someone can go and ruin your whole life with absolute ease.”
Rana Ayyub is a 34-year-old Indian journalist, and another victim of deepfakes. “I didn’t know how to respond the moment I saw the video. It was sent to me by a professional contact and I could not go beyond three frames,” she says of a deepfake of her that was spread this April. “I was shivering, I threw up.” That evening, Rana was taken to hospital for heart palpitations.
“My reaction for the first two days was to just cry. Screenshots of the video trickled on my phone every minute, on my WhatsApp, Twitter timeline, Facebook inbox. By the next day it was on my father’s phone, my brother’s.”
Rana’s abuse started because she criticised the prime minister of India, Narendra Modi, for failing to speak out about violence towards India’s lower-caste groups. While Noelle does not know who created the Photoshopped pornographic pictures of her six years ago, she knows the deepfakes were created in response to her speaking out publicly about this initial harassment (she gave a TED talk in November 2017).
Harry Eccles-Williams is a lawyer involved in Queen Mary’s revenge porn legal advice service SPITE (Sharing and Publishing Images To Embarrass). He says they haven’t yet dealt with any cases of deepfakes, but have seen a few instances of lower-tech Photoshops. While headlines have stoked fears that deepfakes will be created by scorned lovers seeking revenge on their exes, at present this new type of revenge porn appears to be used to humiliate politically active women. Welsh Tory politician Janet Finch-Saunders has spoken out about being a victim; the tags on faked videos of Noelle include the word “feminist”.
Gary Broadfield, a partner and cybercrime specialist at Barnfather Solicitors, says that, as far as he is aware, there haven’t been any prosecutions for deepfakes in either the UK or the US. He explains that while at first glance the UK’s revenge porn law – section 33 of the Criminal Justice and Courts Act 2015 – might appear to cover faked porn, sections 34 and 35 specify that an image created by combining images isn’t considered “private and sexual” under the act.
“Frankly, I am surprised by this,” Broadfield says. “Fundamentally, it looks like the criminal law doesn’t offer any reliable protection for victims of this kind of activity. Whilst I am not generally in favour of creating new law to criminalise new behaviour – largely because in most instances, the existing law usually already covers it perfectly well – it does appear that there is a gap here that needs to be closed.”
Rana filed a complaint with the police three days after she first saw her deepfake video, but says the police refused to make a First Information Report (the document written by authorities in India when a crime is first reported). Rana’s lawyer had to threaten the police before they created an FIR, and despite Twitter and Facebook writing to the police to confirm they would cooperate with an investigation, no action has yet been taken. It has been four and a half months.
“I’m enraged that people have the audacity to do this to someone knowing that it’s a complete violation of their agency, dignity and humanity” – Noelle Martin
There is no law in India specifically against revenge porn, and lawyers warn that the law that could be applied – the Information Technology Act 2000 – could potentially see victims themselves punished.
Meanwhile, Noelle says the Australian authorities told her to contact the webmasters of sites hosting her faked photos and videos and ask for them to be deleted. “I spent years sending emails,” she says. “There was one occasion where the perpetrator said he’d only delete the site if I sent him nude photos of myself within 24 hours.” Because websites hosting Noelle’s videos were often overseas, there was nothing the Australian legal system could do. Noelle ended up becoming involved in law reform that criminalised revenge porn in Australia – the new laws, passed this February, included provisions for deepfakes.
Even if the right laws existed across the world, however, Broadfield and Eccles-Williams note they would be very difficult to enforce. “The people that are creating deepfakes are smart enough, computer-wise, to be able to hide who they are, unless you get half of GCHQ working for a day on it,” SPITE’s Eccles-Williams says.
In the meantime, Broadfield advises that the EU’s “right to be forgotten” could protect victims in the UK. Under the GDPR, individuals can apply for certain information about themselves to be erased, and Google complies. This means a victim of deepfakes in the EU could have the videos removed from search results for their name – though crucially, the videos themselves wouldn’t be deleted.
“Whilst this removes access to the video if it is successful, it doesn't really have any effect on the individual who created it,” adds Broadfield. “There isn't really a deterrent/punishment effect.”
Despite her success changing Australian law, Noelle is similarly frustrated that her abusers go unpunished. “They still think that they have the power to do this,” she says of the deepfakes. “I’m enraged that people have the audacity to do this to someone knowing that it’s a complete violation of their agency, dignity and humanity.
“They are doing it, I believe, because they can get away with it.”