
Forget cloning dogs, A.I. is the real way to let your pooch live forever


Once superintelligence arrives, we don’t know if it’s going to be on our side or against us. But in the meantime, you could do a lot worse than training artificial intelligence to respond like humanity’s best friend, the dog. That’s what researchers from the University of Washington and the Allen Institute for Artificial Intelligence set out to do recently with a new deep-learning A.I. that is designed to predict how dogs would respond in any given situation.


“The goal of the project is to train statistical models that behave like the brain of a dog,” Kiana Ehsani, one of the researchers on the project, told Digital Trends. “We try to predict, based on what the dog sees, how she will move her joints, follow the owner, fetch treats and toys, and in general react to the outside world.”

To create their unlikely A.I., the researchers fitted a range of sensors to an Alaskan Malamute named Kelp M. Redmon. These included a GoPro camera and microphone on her head, inertial sensors on her body, legs, and tail, and an Arduino unit on her back to collect the data. They then let the dog go about her daily activities, such as playing in the park.

Once 24,500 frames of video had been collected, the researchers used the footage to train their A.I. They had three main goals: predicting future movements, planning tasks, and learning doggy behavior. The hope is that they will be able to present the dog A.I. with scenarios, like spotting a squirrel, and have it accurately model a response. Of the 24,500 frames collected, 21,000 were used to train the A.I., and the remaining 3,500 to test its performance.
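As a rough illustration only (this is not the researchers' actual pipeline), a frame-level train/test split like the one described might look like the following sketch in Python; the feature dimensions and array contents here are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical stand-ins for the dataset described in the article:
# 24,500 video frames, each paired with the dog's joint movements.
num_frames = 24_500
rng = np.random.default_rng(0)
frames = rng.random((num_frames, 64))       # e.g. 64-dim visual features per frame (assumed)
joint_moves = rng.random((num_frames, 12))  # e.g. 12 joint-movement targets (assumed)

# Split as described: 21,000 frames for training, the remainder for testing.
train_frames, test_frames = frames[:21_000], frames[21_000:]
train_moves, test_moves = joint_moves[:21_000], joint_moves[21_000:]

print(len(train_frames), len(test_frames))  # 21000 3500
```

The model itself (predicting joint movements from what the camera sees) would then be trained on the first split and evaluated on the held-out frames.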

Right now, the A.I. isn’t hooked up to a physical body, but that could soon change. The team is interested in using their A.I. to create a realistic robot dog. This might have applications in training robots to carry out tasks like route planning with greater efficiency. There is also an altogether more intriguing use.

“Another application would be making a robot dog that acts exactly the same as your real dog,” Ehsani said. “The emotional reactions and their interests will be the same. It’s like making your dog live forever.”

Hey, that certainly beats Barbra Streisand’s Black Mirror-come-to-life scenario of just cloning her dog over and over again.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…