
Bots_alive uses your smartphone to drive artificially intelligent spider robots

Why it matters to you

The artificial intelligence research behind bots_alive promises to usher in lifelike robots at affordable price points.

Artificial intelligence is all the rage in robotics these days, and for good reason: Properly implemented, it lets robots adapt their behavior on the fly. That’s the promise behind Cozmo, the AI-powered robot from Anki, and the conceit of Professor Einstein, the intelligent toy from Hanson Robotics.

But those toys, like most others, react in predictable ways to changing contexts and situations. One startup purports to have developed an algorithm capable of generating entirely new behaviors dynamically.

It’s called bots_alive, and it’s the brainchild of Brad Knox. Knox, who completed a dissertation in artificial intelligence at the University of Texas at Austin, worked with the Personal Robotics Group at MIT’s Media Lab on “Learning from the Wizard,” a project in which a robot learns to emulate its puppeteer’s control. It’s research that informed the development of bots_alive, a low-cost AI robotics platform.

More: Professor Einstein is a cute little robot that will teach you about physics

The impetus, Knox said, was to design a robot that behaved in a personable, human-like way. “We all want robots we can interact with, but there aren’t any products on the market that come close,” he said. “It came out of conversations about complex AI in research. We wanted to make something that’s valuable now — deliver on the promise of machine learning, given the limitations of current technology.”

Bots_alive

The novelty of bots_alive lies in the way it interacts with its surroundings. AI programmers typically give robots personalities with decision trees, Knox explained, hard-coding the rules they follow in any given situation. But the kind of artificial intelligence embodied by bots_alive is entirely free-form. “We don’t always know what the robot will do,” he said.
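To see the contrast Knox is drawing, consider a toy sketch of the scripted approach. The states and actions below are invented for illustration and are not bots_alive’s actual code; the point is that a hand-authored decision tree maps every situation to exactly one action, so the robot’s behavior is fully predictable:

```python
def scripted_step(sees_blue: bool, sees_red: bool) -> str:
    """A hand-authored decision tree: each situation yields exactly
    one action, so the robot always reacts the same way."""
    if sees_red:
        return "turn_away"      # treat red blocks as barriers
    if sees_blue:
        return "drive_forward"  # approach blue blocks
    return "wander"             # nothing in view
```

A learned, probabilistic model replaces these fixed branches with a distribution over actions, which is where the unpredictability Knox describes comes from.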

More: Slacker hacker: Programmer uses AI to disguise his screen when his boss nears

It requires a bit of human guidance, initially. An “improviser” operates the robot over a long period of time, generating data in what Knox calls “puppet sessions.” From that data, the bots_alive machine learning algorithm generates a model, assigning probabilities to outcomes. The end result, Knox said, is “lifelike authenticity” — a robot personality that reacts subtly but differently to changing environmental conditions.

It’s alive!

Knox demonstrated the technology’s potential during a Skype conversation. He placed the robot near a handful of blue blocks and red blocks, and defined two simple rules: The robot was to move toward blue blocks and perceive red blocks as barriers.

First, he placed a blue block in the center of the robot’s vision. It moved imperfectly, hesitantly, toward it. (Knox described the motion as “authentic” and “organic.”) Then, Knox placed a blue block behind a wall of red blocks. The robot easily charted a path around the wall.


The next scenario was a little more challenging: an unbroken ring of red blocks encircling the robot, with a blue block just beyond reach. Impressively, the robot escaped, inching backward and forward until it nudged open a gap in the barrier and slipped through.

It’s an example of spontaneous behavior, Knox said — of the robot doing something the team didn’t train it to do. “Through real-world interaction, we were able to affect the development of its behavior.”

More: MekaMon robots battle in augmented reality so you don’t have to clean up the carnage

It’s not the only example. In play tests, users have placed blue blocks at the top of stacked red blocks, Knox said, and the robot has knocked them over. “Nowhere in the operations data is it told to push the blocks,” he said.

And this is just the beginning. Over-the-air software updates will enable new features like nonverbal signs of social interaction between robots, Knox said. If the Kickstarter campaign reaches its first stretch goal, users will be able to pit two robots against each other in a robot battle to the death. And enterprising programmers will be able to teach the robots new skills.

Knox believes these robots have disruptive potential. That’s thanks partly to their ease of use, he said — bots_alive leverages a smartphone for processing and a system of QR codes to track the positions of the cubes and the robot — and crucially to their price point. “It’s a fun and varied user experience,” he said, “and it’s affordable compared to other robots with cutting-edge AI.”

And when it comes to the software’s applicability, the sky’s the limit, Knox said. “It’s very easily translatable to any remote-controlled robot that’s controlled via Bluetooth,” he said. “We don’t have explicit plans, but one of the main things that we’re looking forward to in the Kickstarter campaign is what people would value. If there’s a very strong, resounding call, then we’ll consider it.”

More: New $27 million fund aims to save humanity from destructive AI

Bots_alive launches on January 24. It’s expected to ship later this year. For $60, you get the full kit, including the Hexbug Spider, decals, five vision blocks, an IR blaster, and the mobile app. If you pay $85, you get the same kit plus an extra Hexbug Spider. You can learn more on the company’s website or back it on Kickstarter right now.