How can you tell how affluent an area is? Penny does it from space

It’s not just the Great Wall of China or the Great Pyramids of Giza that you can see from space — thanks to artificial intelligence and Penny, we can now discern things that seem all but imperceptible, too.

While you may not be able to tell how much money the denizens of a neighborhood make simply by walking through it, you may be able to do so by flying over said neighborhood — that is, if you’re flying in a satellite. It’s all thanks to a new venture from satellite mapping company DigitalGlobe, which partnered with San Francisco design studio Stamen Design to create a machine learning-powered mapping tool that combines income data and satellite imagery to estimate the average income of neighborhoods.

The program is called Penny, and it takes into consideration the shapes, colors, and lines that make up a satellite image, explains Fast Company. Then, by comparing this analysis to corresponding Census data, the program examines images for observable patterns between urban features and income levels. From there, Penny “learns” what affluence looks like from above, and its algorithm can predict what income level it’s looking at.
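In broad strokes, the idea is to extract features from each map tile and match them against income bands learned from census-labeled examples. Penny itself uses a neural network trained on real imagery; the sketch below is only a toy stand-in (a nearest-centroid classifier over two invented features, with a softmax-style confidence), meant to illustrate how a tile could be mapped to an income band plus a confidence score like the percentages Penny reports. All feature names and numbers here are hypothetical.

```python
import math

# Toy training data: each tile is described by two invented features
# (fraction of green space, a building-height index), labeled with a
# census-derived income band. None of these numbers come from Penny.
TRAINING = {
    "low":    [(0.10, 0.20), (0.15, 0.30), (0.05, 0.25)],
    "middle": [(0.30, 0.40), (0.35, 0.50), (0.25, 0.45)],
    "high":   [(0.55, 0.80), (0.60, 0.90), (0.50, 0.85)],
}

def centroid(points):
    """Average the feature vectors for one income band."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {band: centroid(pts) for band, pts in TRAINING.items()}

def predict(features):
    """Return (band, confidence) for a tile's feature vector.

    Confidence is a softmax over negative distances to each band's
    centroid, so the nearest band gets the highest score.
    """
    dists = {band: math.dist(features, c) for band, c in CENTROIDS.items()}
    weights = {band: math.exp(-10 * d) for band, d in dists.items()}
    total = sum(weights.values())
    band = max(weights, key=weights.get)
    return band, weights[band] / total

# A tile with lots of green space and tall buildings lands in "high".
band, confidence = predict((0.58, 0.85))
print(band, round(confidence, 2))
```

A real system would replace the hand-picked features with ones learned directly from pixels by a convolutional network, but the output shape is the same: a predicted income band and a confidence score.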

So what are the telltale signs that Penny has identified thus far? According to the project’s website, “Different types of objects and shapes seem to be highly correlated with different income levels. For example, lower income areas tend to have baseball diamonds, parking lots, and large similarly shaped buildings (such as housing projects).”

Conversely, in higher income areas, satellite imagery shows greener spaces, tall buildings, and single-family homes featuring backyards. In between the two are the middle-income areas, in which a smaller number of single-family homes can be observed, alongside apartment buildings.

Perhaps one of the coolest features of Penny is that it allows you to manipulate various areas to add elements, thereby determining how adding, say, the Empire State Building to an otherwise low-income area would affect the predictions as a whole. “Every feature that you add affects Penny’s predictions differently. The same feature will have a different effect depending on where it’s placed,” the team explained.

Already, Penny has proven itself to be quite accurate — if you examine the New York area, Penny is 86 percent sure that the Financial District is a high-income area and 99 percent sure that Harlem is a low-income neighborhood. So start hovering; you may be surprised by what you learn.

Lulu Chang
Former Digital Trends Contributor