Facebook builds virtual homes to train A.I. agents in realistic environments

Researchers at Facebook, which is no stranger to artificial intelligence, have created Habitat, a platform that allows A.I. agents to rapidly learn about the physical world by training inside realistic virtual homes.

Teaching A.I.-powered robots to accomplish tasks in the real world takes a significant amount of time. While it is possible to train actual robots by having them move about in physical spaces, it can take hundreds of hours, and even years, for the A.I. agents to learn how to move from one place to another, identify objects, and answer questions about their surroundings.

In comparison, the A.I. may be placed within virtual homes that represent real ones, with the speed of training dependent only on how fast a computer can run the calculations of the 3D world. This means the A.I. agents can accumulate thousands of hours' worth of training in a fraction of the time, instead of robots woefully bumping into walls while being trained in the real world. In addition, while other simulation engines run at about 50 to 100 frames per second, Habitat is optimized to run at 10,000 frames per second, enabling rapid training of A.I. agents.
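The frame-rate figures above translate directly into a training speedup, which a quick back-of-the-envelope calculation makes concrete. The Habitat and typical-simulator rates come from the article; the 30 frames-per-second figure for a physical robot's camera is an illustrative assumption, not a number from the source.

```python
# Rough comparison of how much real-time-equivalent experience a
# simulator gathers per minute of wall-clock time.
# Frame rates for Habitat (~10,000 fps) and typical simulators
# (~50-100 fps) are from the article; 30 fps for a physical robot
# is an assumed camera rate for illustration.

REAL_ROBOT_FPS = 30        # assumed frame rate of a real robot's camera
TYPICAL_SIM_FPS = 100      # upper end of other simulation engines
HABITAT_FPS = 10_000       # Habitat's optimized rate

def realtime_hours_per_wall_minute(sim_fps, real_fps=REAL_ROBOT_FPS):
    """Hours of real-time-equivalent experience collected per
    minute of wall-clock simulation."""
    frames_per_minute = sim_fps * 60
    real_seconds_equivalent = frames_per_minute / real_fps
    return real_seconds_equivalent / 3600

print(f"Typical simulator: {realtime_hours_per_wall_minute(TYPICAL_SIM_FPS):.3f} h/min")
print(f"Habitat:           {realtime_hours_per_wall_minute(HABITAT_FPS):.3f} h/min")
```

Under these assumptions, Habitat packs roughly five and a half hours of robot-equivalent experience into every minute of simulation, a hundredfold improvement over a 100-fps engine.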

Habitat itself is not the simulated homes, but rather the platform that hosts the virtual environments. It is already compatible with several 3D datasets, such as Matterport3D, Gibson, and SUNCG, but to make the environments more realistic, Facebook also created its own dataset for the platform, named Replica.

Replica is a collection of photorealistic rooms created through a combination of photography and depth mapping of real homes. The 3D models of the different rooms may be pieced together in any configuration, depending on the training desired for the A.I. agents.

The worlds within Habitat are purely visual, though, which means the A.I. agents cannot interact with objects in the virtual homes. They may learn how to move from the bedroom to the kitchen, but they will not be able to pick up a spoon, for example. The inability to interact with the environment is a glaring limitation of the platform, but the Facebook researchers are looking to add that capability soon.

Aaron Mamiit