No one seems able to figure out whether what Clearview AI is doing is legal, a quandary that has exposed the messy patchwork of laws that allows the exploitation of people’s personal data to proliferate.
As first reported by CNET, Google and YouTube recently sent cease-and-desist letters to Clearview — the controversial law enforcement facial recognition company — over its scraping of their sites for people’s information. Twitter sent one in January, while Facebook is said to be reviewing Clearview’s practices.
Meanwhile, Clearview AI founder Hoan Ton-That has claimed that it is his First Amendment right to collect public photos and use them as he has. He told CBS News in an interview that, essentially, handing the data of millions of people to the police is totally legal. Scholars say that is both true and not true. What is clear is that this technology isn’t going away, and these legal questions will need to be sorted out somehow — and soon.
“There have been arguments that there is a First Amendment right to access publicly available information on the internet and scrape it,” said Matthew Kugler, associate professor at Northwestern University. But what happens to that data after you collect it is not necessarily protected.
“We don’t have a binding answer to that,” Kugler told Digital Trends. “It’s within the realm of plausible argument, but Clearview is on shaky ground if they want to say ‘I have this data and I can do whatever I want with it.’”
“It’s not surprising that the CEO of Clearview AI is weaponizing the First Amendment to justify his company’s practice of scraping online photos of people’s faces without their consent,” said Evan Selinger, a professor of philosophy and technology at the Rochester Institute of Technology. “What’s required to properly govern Clearview AI and so much else is a thoroughgoing reexamination of what private and public mean that shifts the key debates in law, ethics, design, and even everyday expectations. In short, it’s long overdue to acknowledge that people can have legitimate privacy interests in publicly shared information.”
A patchwork of laws
As of this writing, there are few, if any, federal statutes governing online privacy. What exists instead is a patchwork of state laws, including those of Virginia and Illinois — under whose biometric privacy law Facebook recently agreed to pay $550 million to settle a lawsuit over its photo tagging and facial identification features — and California, which just recently enacted the strongest law so far in the union.
“I have a First Amendment right to expose the private data of millions of people” sounds crazy, but it’s legally tenable.
HiQ used a similar argument in its CFAA case vs. LinkedIn. https://t.co/u3IZMGyeyW
— Tiffany C. Li (@tiffanycli) February 4, 2020
A similar situation was litigated recently after LinkedIn sent a cease-and-desist letter to the startup hiQ, which scraped LinkedIn profiles and sold data about workers to their employers — theoretically, Reuters reported at the time, to help businesses keep track of their workforces.
The case ran up against the same privacy versus tech issues now facing Clearview AI: Is it fair to take publicly available data, store it, repackage it, process it, and sell it, or is that a violation of privacy?
Basically, we haven’t figured that out yet. In the LinkedIn case, an appeals court sided with hiQ, finding that scraping publicly available data is likely legal, and hiQ is still in business. But, as Albert Gidari put it, the Clearview case raises other tough issues, so a clear resolution regarding this conduct is unlikely in the short term.
Gidari, the consulting director of privacy at the Center for Internet and Society at Stanford University, told Digital Trends via email that although Clearview asserts its right to scrape and use photos, “individuals also have a statutory right to their image.” California, for example, prohibits the unauthorized use of a person’s voice, image, or name for someone else’s benefit — which is clearly what Clearview does. “There is no doubt that these photos fall under the Illinois statute and probably under CCPA [the California law] as biometric use without consent,” he said.
However, Clearview also asserts that its use of images serves the public interest — a defense that would hold up only if a judge found that argument convincing.
The consequences of eating too much cake
Chris Kennedy, the chief information and security officer for cybersecurity firm AttackIQ, told Digital Trends that these are all signs of a looming reckoning between the information buffet we’ve been enjoying and the privacy ramifications we will soon have to face. “We live in an age of growing distrust for technology,” he told Digital Trends. “The last 20 years, we’ve had our cake and ate it, too. We had free sharing of information, we enabled e-commerce, and now it’s just become the expectation that you’ll put yourself out there on the internet. We’re paying the price now.”
Basically, Kennedy says, all the goodwill built up over the early years of the internet — when people were having their informational cake and eating it, too — is starting to erode, partly because there are no clear rules to follow, and therefore no clear expectations about what will happen to your data online. That needs to change, he said.
Kennedy is certain we’re moving in a very pro-Clearview AI direction; that is, the facial recognition genie is out of the technological bottle, and there’s no way to put it back in.
“We can’t slow the pace of technology without significant cultural shifts … and enforceable laws,” he told Digital Trends. “It can’t be this toe-in-the-water stuff like CCPA or GDPR [Europe’s General Data Protection Regulation]. It has to be, ‘this is how it is, these are the expectations in the management of your data and information, you must adhere to them or risk the consequences.’ It’s like when a hurricane comes. You leave, or you pay.”