Robots are learning navigation skills by being trained to hunt prey

The newest king of the jungle? It’s not a big cat. It’s a robot. Scientists at the Institute of Neuroinformatics at the University of Zurich in Switzerland are teaching robots to behave like predators and hunt their prey, using a specialized camera and software that allow for deadly precision.

Don’t worry — the end goal isn’t actually predatory in nature. Rather, scientists say, they’re just trying to get robots to navigate their surroundings more accurately and efficiently, perhaps by finding a mark and following it around.

“Following [in large groups of self-driving cars or drones] is the obvious application, but one could imagine future luggage or shopping carts that follow you,” Tobi Delbruck, a professor at the Institute of Neuroinformatics, told Motherboard via email. “This way, the problem is less like a predator and its prey and more like herding, or a parent and child.”

But regardless of what you call it, the concept behind the new technology is decidedly, well, animalistic. The robot’s hardware is modeled on animal biology and depends largely on a silicon retina: an event-based sensor that, much like the human eye, responds to changes in the scene and can process visual data quickly. A normal camera wouldn’t suffice here, because its fixed frame rate is too slow to capture a continuous movement path, especially when the robot’s “prey” is moving quickly.

The data obtained from the retina is then processed by a deep learning neural network that gets “smarter” the more it’s trained. In the wild and in the lab alike, practice makes perfect.
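The article doesn’t spell out the pipeline, but the general idea of event-camera tracking can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the Zurich team’s method: it accumulates DVS-style events `(x, y, polarity)` into a 2D activity frame, then uses a simple centroid estimate to stand in for the learned network that would steer the robot toward its target.

```python
import numpy as np

def events_to_frame(events, width=64, height=64):
    """Accumulate event-camera events (x, y, polarity) into a 2D frame.
    Unlike a frame-based camera, each event marks a per-pixel brightness
    change, so fast motion leaves a dense trace instead of motion blur."""
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, polarity in events:
        frame[y, x] += 1.0  # count activity regardless of polarity
    return frame

def estimate_bearing(frame):
    """Estimate where the moving target sits relative to image center.
    Stands in for the learned controller: returns a steering signal in
    [-1, 1], negative meaning target left of center, positive right."""
    total = frame.sum()
    if total == 0:
        return 0.0  # no motion detected: hold course
    xs = np.arange(frame.shape[1])
    centroid_x = (frame.sum(axis=0) * xs).sum() / total
    half = (frame.shape[1] - 1) / 2
    return float((centroid_x - half) / half)

# A burst of events from a target moving on the right side of the view.
events = [(50, 20, 1), (51, 20, -1), (52, 21, 1), (53, 21, 1)]
frame = events_to_frame(events)
steer = estimate_bearing(frame)
print(round(steer, 2))  # positive value: steer right, toward the target
```

In the real system, the centroid heuristic would be replaced by a convolutional network trained on recorded chases, which is what lets the robot improve the more it is used.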

So watch out, world. You may soon be followed around by robots that are watching you, quite literally, like a hawk. But whether or not that’s actually a bad thing has yet to be determined.

Lulu Chang
Former Digital Trends Contributor