Deepfakes have finally gone mainstream with the Chinese app Zao, which spiked on the iOS App Store over the weekend. The app uses facial recognition to replace celebrities’ faces with yours in video clips, but it’s raising concerns about how your face is being used and who has access to deepfake technology.
The app works like this: you upload a selfie, and Zao places your face on top of a celebrity's — Leonardo DiCaprio's, say — making it seem like you starred in Titanic. The app essentially lets anyone create a deepfake, a fake video that looks incredibly real.
Here’s an example of what the app does:
In case you haven't heard, #ZAO is a Chinese app which completely blew up since Friday. Best application of 'Deepfake'-style AI facial replacement I've ever seen.
Here's an example of me as DiCaprio (generated in under 8 secs from that one photo in the thumbnail) pic.twitter.com/1RpnJJ3wgT
— Allan Xia (@AllanXia) September 1, 2019
The app debuted in China’s iOS App Store on Friday, Aug. 30. App Annie, a firm that tracks app downloads, said Zao was the most-downloaded free app in China’s App Store as of Sept. 1, according to Reuters. It’s currently only available in China, but we’ve reached out to Zao to see if a U.S. release is planned.
Zao’s popularity has made some wonder whether we should be wary of how accessible deepfake technology has become. There have already been issues with deepfakes purporting to show famous people saying things they never actually said, including one of Mark Zuckerberg proclaiming his power over “millions of people’s stolen data.”
However, experts say that it’s important not to rush to conclusions when it comes to deepfakes.
“I think Zao is a symbolic representation that this technology is becoming more and more widespread and easy to use,” Henry Ajder, Head of Communications & Research Analysis at Deeptrace, a company building a platform to detect deepfakes, told Digital Trends. “I think it’s just very important to remain vigilant and not to essentially create sensational responses which could do more harm than good in the long run.”
As downloads climb, recent reports caution users to be wary of Zao’s privacy terms.
Reuters reports that Zao’s user agreement states that those who use the app “agree to surrender the intellectual property rights to their face, and permit ZAO to use their images for marketing purposes.” By uploading your photo, you grant the app — which was published by Momo, a Chinese social networking company — the right to use it in any way it chooses.
Zao has reportedly already responded to user concerns about the terms of agreement.
“We thoroughly understand the anxiety people have towards privacy concerns,” Zao said on the Chinese social network Weibo. “We have received the questions you have sent us. We will correct the areas we have not considered and require some time.”
Ajder said that apps like Zao and FaceApp have limited functionality. In Zao, for example, you can only choose from a fixed set of video clips and cannot upload your own. While the technology behind these apps is very impressive, he said, it is also constrained, and serves largely as a novelty to drive the app’s popularity.
If the technology were adapted so that someone could upload their own video, Ajder said, it could become a real concern.
“The danger would be if someone created a dedicated website or a kind of independent source that went similarly viral,” he said, adding that something like that would be the most significant development with deepfakes so far.
“We are moving ever closer towards a situation that at low resources and high accessibility can cause damage,” Ajder said.