Bill Posters discusses the mission behind his deepfake videos, where each prime ministerial candidate bizarrely endorses the other
Yesterday, a video emerged which shows what we’ve all been hoping for the past two weeks: Boris Johnson endorsing Jeremy Corbyn for prime minister. Before you gleefully throw your arms in the air, there’s a catch – the clip is actually a deepfake warning about the dangers of disinformation.
Created by tech think tank Future Advocacy, the doctored videos – there’s also one of Corbyn endorsing Johnson, though his voice isn’t quite as convincing – are eerily accurate and show how easily deepfakes can be perceived as truth.
“Hi folks,” opens Johnson’s deepfake, “I am here with a very special message. I wish to rise above this divide and endorse my worthy opponent, the Right Honourable Jeremy Corbyn to be prime minister of our United Kingdom. Only he, not I, can make Britain great again.” It’s the stuff of both dreams and nightmares.
The deepfakes were made over several months using a form of AI-driven video synthesis known as video dialogue replacement. New scripts were voiced by impressionists, and existing footage of Johnson and Corbyn was digitally manipulated to match the recordings. At the end of Johnson’s video, the deepfake says: “I am a fake. A deepfake to be precise. As you can see, even I, the prime minister, can be affected by them.”
Boris Johnson has a message for you. #GE2019 pic.twitter.com/ST4dXPbRYE
— Future Advocacy (@FutureAdvocacy) November 12, 2019
Bill Posters, the artist behind the videos, says he created the clips in order to raise awareness about the lack of action taken by politicians when it comes to combatting disinformation online. “It’s quite clear that electoral politicians and representatives haven’t done enough to safeguard the integrity of your online privacy,” he tells Dazed. “What we’re getting at with this campaign is to really put the onus on those elected representatives to put the lid back onto the black box techniques that are running rampant online.”
Although Future Advocacy’s videos are satirical and humorous – and reveal themselves to be fake – they raise serious questions about what we can believe online. “Over the last three years, there’s been multiple parliamentary inquiries into fake news and the ways in which personal data of British citizens has been used as part of online influence campaigns,” Posters explains. “But as far as we can tell, none of the (subsequent) recommendations have been implemented into law.”
Following the 2018 Cambridge Analytica scandal – in which political campaigns gathered information about Facebook users and then targeted them with propaganda – and illegal activity by the Vote Leave campaign during the 2016 EU referendum, there’s been a suffocating air of mistrust surrounding both news and social media outlets, as well as people in power. “What can we trust when we can’t believe our eyes and our ears?” Posters asks.
We've released deepfakes of Boris Johnson and Jeremy Corbyn today to raise awareness of the threats posed by unregulated technologies.

Find out more about the 4 key challenges we're highlighting at https://t.co/BilCk4rVug pic.twitter.com/mUlarnQmRW

— Future Advocacy (@FutureAdvocacy) November 12, 2019
“Politicians and political ideologies have targeted and purposefully eroded the trust in journalism,” he continues. “Our ways of understanding the world around us and our relationship to others have been systemically decimated by various factors. Deepfakes really give form to a lot of the deeper mistrust and fear that people have around the way power operates, and the way personal data can be used in unexpected ways.”
Political misinformation has been at the forefront in recent weeks, with Mark Zuckerberg confirming that Facebook won’t ban political ads ahead of the UK’s general election, despite previously revealing that the platform doesn’t fact-check the claims made in them. The irony of ex-Lib Dem leader Nick Clegg – now the company’s VP of global affairs and communications – being the mouthpiece for this decision is not lost on anyone, especially as he previously described the deceit during Brexit as the “biggest con trick in politics”. While Facebook’s policy clearly furthers the spread of false information and potentially dangerous propaganda, Posters – who was also behind the recent Zuckerberg deepfake – believes this means politicians are unlikely to create deepfake footage of their rivals. “(Facebook is) already facilitating them to be allowed to create targeted forms of misinformation campaigns,” he states, “so deepfakes in this context are kind of redundant.”
Future Advocacy’s videos come a week after the Tories were accused of editing footage of a senior Labour figure, and just days after the BBC was accused of covering up a mistake made by the prime minister at this year’s Remembrance Sunday ceremony. The broadcaster aired archive footage of Johnson laying a wreath at the 2016 service, as opposed to this year’s clip in which he placed his flower arrangement upside down. “There’s a lot of tension around the way in which impartiality is being presented and acted on at BBC News platforms,” Posters tells Dazed. “It’s very untransparent in many ways.”
“Politicians and political ideologies have targeted and purposefully eroded the trust in journalism. Our ways of understanding the world around us and our relationship to others have been systemically decimated” – Bill Posters
So how can the public begin to distinguish what’s real from what’s fake? “There’s some great AI start-ups that are trying to produce automated systems which can detect manipulated video content,” Posters reveals, “but it’s also important to critically analyse the type of video content you’re seeing.”
Although his recent work has focused on politicians, Posters asserts that we need to be looking at the broader dangers of deepfake technology. “It’s really important that we zoom out and look objectively at how deepfakes are being used culturally,” he says. “An overwhelming use of deepfake technology is in the porn industry. What we really need to be focussing on is the way women’s bodies and identity and privacy are being exploited.” Recent research revealed that 96 per cent of all deepfake videos feature women, most of whom are being edited into porn without their consent.
Posters hopes that his new election videos will push politicians to urgently address the dangers of misinformation, leading to a crackdown on non-viral clips – like celebrity porn videos – as well as those which go mainstream. The artist concludes: “These are real issues that electoral representatives need to get a handle on.”