Imagine if the ground could know if you’re walking on it.
Imagine if that patch of sidewalk could tell how much you weigh, how fast you’re going, and where you’re headed. Imagine that the road could determine exactly how many people are walking or running or biking or skipping over it at any second.
What if the city could see all that information? Would you want to walk on that ground?
What if you didn’t have a choice?
Cities across the country are testing smart city tools to track their citizens and better optimize how they operate — and urban planners are having to balance these high-tech methods with concerns over privacy and mass surveillance.
“If a city puts these sensor embeds everywhere, how long do you think it would take before I could ID your particular footstep pattern?” asked James Ward, a data and privacy lawyer. “Not very long. Humans are creatures of habit and pattern.”
Cities are evolving toward an ever-more digital future, with surveillance cameras everywhere, facial recognition as a fact of life — and yes, even “smart sidewalks.” But who owns that data, where is it stored, and is it ethical to be collecting information about citizens that could be used to identify them?
Building a ‘better’ sidewalk
Those futuristic sidewalks are already here.
As of early March, smart ground is being piloted in some undisclosed locations in upstate New York and Connecticut, according to Jessica O. Matthews, CEO and founder of Uncharted Power.
Matthews told Digital Trends her company hopes to have smart ground installed across the country soon.
Her patches of smart ground measure 3 feet by 3 feet and are outfitted with fiber-reinforced polymer sensors that Matthews says are easily upgradable.
The patches could be used for “smart and sustainable infrastructure development” in U.S. cities, Matthews said. The idea is to allow cities to better estimate, anticipate, and regulate traffic flows and patterns, both on the street and on the sidewalks, and thereby better understand what resources to deploy where.
Of course, one big question is, can this ground tell who, exactly, is walking across it?
No, Matthews said definitively.
“With our data, the best you could get is that there’s a trend at a certain time,” said Matthews. “Maybe there are five people walking in this direction. But you wouldn’t know who they were.”
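Matthews’ description suggests the tiles report only aggregate trends per time window, never individual identities. A minimal sketch of what such anonymized output might look like — the event schema, tile IDs, and field names here are illustrative assumptions, not Uncharted Power’s actual format:

```python
from collections import Counter
from datetime import datetime

# Hypothetical raw footfall events: (timestamp, tile_id, heading).
# Note that no identity is attached to any event -- only time,
# place, and direction of travel.
events = [
    (datetime(2020, 3, 9, 8, 0), "tile-17", "north"),
    (datetime(2020, 3, 9, 8, 0), "tile-17", "north"),
    (datetime(2020, 3, 9, 8, 0), "tile-17", "south"),
    (datetime(2020, 3, 9, 8, 1), "tile-17", "north"),
]

def aggregate(events):
    """Collapse individual events into per-minute, per-tile trend counts."""
    return Counter(
        (ts.replace(second=0), tile, heading) for ts, tile, heading in events
    )

for (minute, tile, heading), n in sorted(aggregate(events).items()):
    print(f"{minute:%H:%M} {tile} {heading}: {n} walker(s)")
```

The output is the kind of trend Matthews describes — “five people walking in this direction” — with no names in sight.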
An anonymized data set, however, has rarely stopped anyone determined to extract more from it.
Privacy advocates say law enforcement, an intrepid sleuth, or a hacker who obtained Uncharted Power’s data likely wouldn’t be stopped. That data could be combined with security camera footage or with facial-recognition technology like Clearview AI to figure out, with reasonable accuracy, who you are and where you’re going.
“It’s not realistic to assume that you can’t pair footfall data with other information to ID people,” Ward told Digital Trends.
Matthews maintained that with her dataset alone, it would be impossible to say “this is this person.”
“If you marry this with existing cameras, though, that’s a separate thing,” she admitted. “People are already being watched, and all these datasets together could be used to create a minority report. But that’s a surveillance conversation, not a data-collection conversation. You can collect copious amounts of data without invading privacy.”
Is regulation needed?
It’s true that surveillance and data collection are two different — if related — things.
One can easily feed the other, however. “Re-identification of a supposedly anonymized data set is very simple,” Ward said. “All you need is a few data points, and you’ve got an ID. The question really is what safeguards can we have in place.”
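Ward’s point, that a few data points suffice to re-identify someone, can be illustrated with a toy linkage attack. Assuming, hypothetically, an “anonymized” footfall log and a separate camera log that happen to share timestamps and locations, a simple join re-attaches names (all records and names below are invented for illustration):

```python
# Toy linkage attack: the footfall data alone names no one, but joining it
# with a second dataset on just (time, location) re-identifies every record.

# "Anonymized" footfall records: (minute, tile) with no identity attached.
footfalls = [("08:00", "tile-17"), ("08:03", "tile-02")]

# A separate, hypothetical camera log that does carry identities.
camera_log = {
    ("08:00", "tile-17"): "Alice",
    ("08:03", "tile-02"): "Bob",
}

# Two shared data points per record are enough to complete the join.
reidentified = {
    record: camera_log[record] for record in footfalls if record in camera_log
}
print(reidentified)  # every "anonymous" footfall now has a name attached
```

This is the mechanism behind Ward’s warning: neither dataset is invasive on its own; the combination is.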
Experts told Digital Trends that it would likely fall on the government to create these safeguards, and pointed to precedents like the General Data Protection Regulation (GDPR) in Europe or the California Consumer Privacy Act (CCPA) as a guide.
Private industry is too unreliable to come up with these guardrails themselves, experts said.
“I’ve been in the security industry for 23 years, and one thing I’ve learned is that you can never trust a user to do the right thing,” said James Carder, chief security officer and vice president of LogRhythm, a security intelligence company.
“The American model of self-regulation is what has given us ‘information capitalism,’ which renders basic human activity into a commodity,” Ward added.
Concerns over widespread government tracking of civilians have come to the fore recently as public health experts and authorities attempt to track who among us may have had contact with a carrier of COVID-19.
“In the event of a global pandemic, having a pattern ID system in place is useful,” said Ward. “But the privacy implications depend on the existence of regulatory framework that demands developers do what’s best for individuals.”
Ward noted the EU’s GDPR has “huge carve-outs” for local governments to manage data for health and welfare. He also said there’s currently nothing like this in the U.S.
Greg Kahn, president and CEO of the Internet of Things Consortium, agreed that when it comes to a situation like COVID, there’s a huge trade-off between privacy, convenience, and security.
“In societies like China, where the government adopts technology and everyone has to abide by it without much of a say, data collection can be used to mitigate crime and contain disease,” Kahn told Digital Trends. “If there’s an Uber driver in Manhattan who’s been affected, should Uber or that individual have to give up information about all the passengers who have ridden with him? That would reveal a lot of information about him.”
Who owns the data?
Ted Lehr describes himself as a “data architect” for the city of Austin, Texas, where he is attempting to implement some of the smart city measures that entrepreneurs like Matthews are developing. Lehr said he’s trying to draw a line between effective and beneficial data collection and surveillance.
“We take it seriously in Austin, and we want to do things ethically,” he told Digital Trends. “Local government is set on trying to understand what residents want. The people I talk to here, they say they don’t have anything that’s like surveillance, and they don’t want to make all their data available to make money off of. But we do have a lot of open data.”
For example, Lehr said the city is currently working on proposals to prevent GPS tracking of people in public buildings or spaces. But data is still being collected now, and it’s not clear who owns or houses that information.
“If a car drives by a sensor, who owns the data?” Lehr asked. “Is it the carmaker? The person driving the car? The city the car was driving in? The company that made the sensors? Would they then try to sell that information or sell you ads based off public infrastructure?”
Protecting private information is something Austin is working on, but Lehr put the onus back on the private sector to deal with privacy issues.
Kahn agreed: “Today, in 2020, look to companies to do it … Coronavirus is the perfect example of this. In the U.S., it’s [been] left to local entities to make big decisions — should schools or museums be closed — and where folks are stepping up is on a business level.”
Even before the pandemic hit, one of the biggest privacy stories was the emergence of Clearview AI, a site marketing itself to law enforcement as surefire facial-recognition technology. Clearview was scraping social media sites for photos and information about people — a move that is against the Terms of Service of most of these sites — in order to build its database.
In response, two U.S. senators introduced a bill, the Ethical Use of Facial Recognition Act, that would limit law enforcement’s use of the technology. It hasn’t gained any traction in Washington.
But that doesn’t mean it won’t. An ExpressVPN poll in February found that 92% of Americans would delete an app that they used regularly if they found out it had sold their information to a third party.
It also found that “more than two-thirds (68%) of Americans are concerned with the growing use of facial-recognition technology and 78% with its potential abuses.”
Harold Li, ExpressVPN’s vice president, told us that as smart cities develop, privacy will be part of the conversation.
“Whether you can successfully achieve that goal is another question,” he said. “In theory, any data collection could indeed be anonymized in a way where it doesn’t impact individual privacy. But it depends on how that data is handled.”
Which raises the question of how to handle the data so that data collection doesn’t cross over into outright surveillance.
Matthews, who is a woman of color, said one step could be for companies and cities to bring women and people of color, who may be more attuned to where the digital privacy line lies, to the table alongside other decision-makers.
“It’s unavoidable that surveillance tech is deployed disproportionately against people of color,” Ward agreed.
“It would be nice to have a collaboration where people from different industries can see how this can be made as equitably as possible,” Matthews said. “It’s exciting to have a system that will have the benefits of data collection, but we want to scale in a way that’s intentional and thoughtful.”