It’s not just the Great Wall of China or the Great Pyramids of Giza that you can see from space — thanks to artificial intelligence and Penny, we can now discern things that seem … imperceptible, too.
While you may not be able to tell how much money the denizens of a neighborhood earn simply by walking through it, you may be able to do so by flying over said neighborhood — that is, if you’re flying in a satellite. It’s all thanks to a new venture from satellite mapping company DigitalGlobe, which partnered with San Francisco design studio Stamen Design to create a machine learning-powered mapping tool that combines income data and satellite imagery to estimate the average income of neighborhoods.
The program is called Penny, and it takes into consideration the shapes, colors, and lines that make up a satellite image, explains Fast Company. Then, by comparing this analysis against Census income data for the same locations, the program learns observable patterns linking urban features to income levels. From there, Penny “learns” what affluence looks like from above, and its algorithm can predict what income level it’s looking at.
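To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of pipeline: summarize each image tile as a feature vector, average the vectors for each income class from labeled examples, then classify a new tile by its nearest class centroid. This is not DigitalGlobe’s actual model (Penny uses a neural network on raw imagery); the feature names and numbers below are invented for illustration.

```python
# Toy sketch of the idea behind Penny: learn a mapping from
# image-derived features to income classes. NOT the real model;
# all features and data here are invented.

# Each "tile" is summarized by two made-up features:
# (fraction of green space, fraction of parking lots / large flat roofs)
TRAINING_TILES = [
    ((0.70, 0.05), "high"),    # leafy blocks, backyards
    ((0.60, 0.10), "high"),
    ((0.35, 0.30), "middle"),  # mix of homes and apartments
    ((0.30, 0.35), "middle"),
    ((0.10, 0.60), "low"),     # parking lots, large uniform buildings
    ((0.05, 0.70), "low"),
]

def centroids(tiles):
    """Average the feature vectors of each income class."""
    sums, counts = {}, {}
    for (green, paved), label in tiles:
        g, p = sums.get(label, (0.0, 0.0))
        sums[label] = (g + green, p + paved)
        counts[label] = counts.get(label, 0) + 1
    return {label: (g / counts[label], p / counts[label])
            for label, (g, p) in sums.items()}

def predict(model, features):
    """Assign the class whose centroid is nearest (squared distance)."""
    green, paved = features
    return min(model, key=lambda label:
               (model[label][0] - green) ** 2 +
               (model[label][1] - paved) ** 2)

model = centroids(TRAINING_TILES)
print(predict(model, (0.65, 0.08)))  # leafy tile -> "high"
print(predict(model, (0.08, 0.65)))  # paved tile -> "low"
```

The real system works on raw pixels rather than hand-picked features, but the training loop is the same in spirit: labeled examples in, a decision rule out.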
So what are the telltale signs that Penny has identified thus far? According to the project’s website, “Different types of objects and shapes seem to be highly correlated with different income levels. For example, lower income areas tend to have baseball diamonds, parking lots, and large similarly shaped buildings (such as housing projects).”
Conversely, in higher-income areas, satellite imagery shows greener spaces, tall buildings, and single-family homes with backyards. In between the two are the middle-income areas, in which a smaller number of single-family homes can be observed alongside apartment buildings.
Perhaps one of the coolest features of Penny is that it lets you manipulate an area by adding elements, so you can see how, say, dropping the Empire State Building into an otherwise low-income neighborhood would affect the prediction. “Every feature that you add affects Penny’s predictions differently. The same feature will have a different effect depending on where it’s placed,” the team explained.
Already, Penny has proven itself to be quite accurate — if you examine the New York area, Penny is 86 percent sure that the Financial District is a high-income area and 99 percent sure that Harlem is a low-income neighborhood. So start hovering; you may be surprised by what you learn.