California is cracking down on deepfakes for politics and porn

California is cracking down on deepfakes in both porn and politics. Governor Gavin Newsom recently signed two bills: one (AB 730), which makes it illegal to create or distribute deceptively altered video, audio, or still photography of political candidates, and another (AB 602), which lets California residents take legal action against people who use their likeness to create pornographic material without their consent.

A combination of “deep learning” and “fake,” deepfake technology lets anyone create altered materials in which photographic elements are convincingly superimposed onto other pictures. There is also a growing number of attempts to use related technologies to create fake audio that sounds as if it were spoken by a real person, such as a politician or celebrity. Although these tools can be used for innocent purposes (for instance, placing Sylvester Stallone’s face onto Arnold Schwarzenegger’s body in Terminator 2), people have rightly expressed concern about the ways this technology can be used maliciously. The abuses covered by the two California bills could do everything from damaging reputations to, in the case of politically oriented deepfakes, potentially swaying elections.

“In the context of elections, the ability to attribute speech or conduct to a candidate that is false — that never happened — makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters,” California assemblyman Marc Berman said in a statement. Berman authored the political deepfake bill AB 730.

The political bill covers only deepfakes distributed within 60 days of an election. It also exempts deepfakes that are obvious satire, since these are protected as free speech. Last month, Texas became the first state to pass a law making it a misdemeanor to create and share distorted videos of politicians within one month of an election.

Pornography is another big problem when it comes to deepfakes. A study by cybersecurity firm Deeptrace claims that 96% of 14,678 deepfake videos it identified online were pornographic. All of these used the likenesses of women, such as actresses and musicians.

Unfortunately, simply signing these bills into law isn’t going to guarantee the end of deepfakes created by bad actors. That will take a combination of enforcement and more sophisticated tools for spotting deepfakes. But this is a first step that will hopefully help head off an incredibly damaging use of sophisticated A.I. technology.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…