
Deepfake videos of U.K. leaders set chilling precedent for 2020 U.S. election

When a video of United Kingdom Prime Minister Boris Johnson emerged Tuesday morning, something about it seemed very odd.

The video starts out conventionally enough: It looks like Johnson is making his usual stump speech in the midst of the U.K.’s snap general election — he speaks about a divided country and high-running emotions over Brexit — until suddenly 20 seconds in, when Johnson endorses his rival, Jeremy Corbyn, the leader of the opposition Labour Party, for prime minister.


A similar video of Corbyn also surfaced, appearing to show the Labour leader endorsing Johnson for prime minister. Both videos quickly reveal themselves as deepfakes: manipulated videos produced by the U.K.-based think tank Future Advocacy in collaboration with artist Bill Posters, who was also responsible for the recent Mark Zuckerberg and Kim Kardashian deepfake videos that went viral.

Boris Johnson has a message for you. #GE2019

— Future Advocacy (@FutureAdvocacy) November 12, 2019

Areeq Chowdhury, head of Future Advocacy's think tank, told Digital Trends that the group is trying to “use shock and humor” to raise awareness of the deepfake problem during this election. “All democracies, all parts of society will be affected by this,” Chowdhury said. “Business, media, individuals; it’s a bit like climate change. It needs to be tackled collectively, and that just hasn’t happened.”

Speaking to Digital Trends, Posters said he’s long been interested in “interrogating different types of corporate and state propaganda in the public space.”

“I’m most concerned about the way propaganda was not only influencing the decisions we made online, but also influencing how people voted,” he said. “As an artist, it was really interesting to explore deepfake tech, and create moving images and video pieces to bring light to these obscure issues.”

As the U.K. grapples with the snap election and the long-running Brexit controversy, the U.S. is staring down the barrel of its 2020 presidential election, with the specter of the social media mess that was the 2016 presidential election still overshadowing much of everyday politics. Deepfake videos were in their infancy during the last cycle, but are poised to create an even bigger problem next year.

“Trump has already been caught retweeting neo-Nazis,” cybersecurity expert Dr. Richard Forno told Digital Trends. “I think this will be a problem generated by him and his campaign.” Forno, the assistant director of the Center for Cybersecurity at the University of Maryland, Baltimore County, has 20 years of experience in the cybersecurity sector. He said he’s particularly concerned about what Future Advocacy has called the “liar’s dividend,” in which real footage of controversial content is dismissed as fake.

We've released deepfakes of Boris Johnson and Jeremy Corbyn today to raise awareness of the threats posed by unregulated technologies.

Find out more about the 4 key challenges we're highlighting at

— Future Advocacy (@FutureAdvocacy) November 12, 2019

“If you’re sowing enough confusion, and people don’t know what to believe, this will be a huge problem for western democracy,” said Forno, who added that computers and electronic data already define a huge slice of our reality. “When you go to the DMV to renew your license, and suddenly they tell you that you have three unpaid parking tickets, you might say, ‘no I don’t,’ but they’ll say, ‘yes you do, the computer says so!’ The computer is always right.”

Looking at the deepfakes that Future Advocacy produced, Forno said the videos were clever, and “the average person would probably believe them.”

In a press statement, Future Advocacy explicitly said the videos were a stunt, and called on “all political parties to work together to raise awareness of the dangers surrounding online disinformation.” It also pointed to what it calls “four key challenges” that need to be solved regarding deepfakes: detecting deepfakes, dealing with the liar’s dividend, regulating effectively, and limiting damage.

“The responsibility for protecting democracy shouldn’t be outsourced to private companies in Silicon Valley,” Chowdhury said. “This is a problematic position that we’re in.” He pointed out that technology has changed so fast that the laws simply haven’t kept up. “Protecting democracy should be the priority of politicians, and they’re not stepping up at the moment.”
