
MIT researchers develop simulation to manipulate real-world objects in videos

Pokémon Go just passed the 100 million player mark — and it’s still growing. The game’s success owes much to millennial nostalgia, but also to the app’s clever use of augmented reality (AR), which overlays the Pokémon universe on the real world via the smartphone’s camera.

But, even though you can toss a virtual Poké Ball at a virtual Pikachu on the sidewalk, traditional augmented reality doesn’t let virtual objects interact with objects in the real world. Pikachu can’t electrify a puddle, bump into a signpost, or jump up on the curb.

A team of Massachusetts Institute of Technology researchers may help blur the line between augmented reality and the real world. Led by Abe Davis, a Ph.D. student at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and Justin Chen, a post-doctoral associate in its Department of Civil and Environmental Engineering, the team has developed Interactive Dynamic Video (IDV), a technique they think could change industries from engineering to entertainment.


Two years ago, Davis and his team recorded silent videos of objects that vibrated when they were hit by sound. Using a processing algorithm, the researchers were able to recover music and speech from these tiny vibrations, effectively eavesdropping through a bag of potato chips. The discovery prompted Davis to explore vibrations further, and to ask whether an algorithm could re-create an object’s movement by analyzing its minuscule vibrations.
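The gist of that earlier “visual microphone” result can be illustrated with a toy sketch. The NumPy example below is purely hypothetical and is not the CSAIL pipeline (which analyzed real high-speed footage): it fakes a video of a bright edge being nudged sub-pixel distances by a hidden 440 Hz tone, estimates each frame’s displacement, and recovers the tone’s frequency from that motion signal.

```python
import numpy as np

# Toy sketch (not the CSAIL method): a bright "edge" in each synthetic frame
# is displaced by a fraction of a pixel in proportion to a hidden 440 Hz tone.
# We recover the tone by estimating per-frame displacement, then taking an
# FFT of the displacement signal.

fps = 2400                                   # assumed high-speed frame rate
t = np.arange(fps) / fps                     # one second of frames
tone = 0.05 * np.sin(2 * np.pi * 440 * t)    # sub-pixel vibration amplitude

x = np.arange(64)

def frame(shift):
    # Gaussian "edge" centered near pixel 32, displaced by `shift`
    return np.exp(-0.5 * ((x - 32 - shift) / 3.0) ** 2)

frames = np.stack([frame(s) for s in tone])

# Estimate each frame's displacement from the intensity centroid
centroids = (frames * x).sum(axis=1) / frames.sum(axis=1)
motion = centroids - centroids.mean()

# Dominant frequency of the recovered motion signal
spectrum = np.abs(np.fft.rfft(motion))
freqs = np.fft.rfftfreq(len(motion), d=1 / fps)
print(round(float(freqs[spectrum.argmax()])))   # → 440
```

Even though each displacement is a twentieth of a pixel, the frequency of the hidden signal falls out cleanly, which is the basic reason such small vibrations carry recoverable information.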

“There are some classic techniques used in engineering and physically based animation to simulate objects, but they depend on a full 3D model,” Davis told Digital Trends. “I got this crazy idea that you could do something similar with a video instead, and it ended up working surprisingly well.”

Davis and his team used a traditional camera to record various objects as they vibrated ever so slightly. They recorded a bush as it blew in the breeze and a guitar string as it was lightly touched. The algorithm then identified “vibration modes” — the characteristic ways a particular object sways this way or rocks that way when disturbed. From there, the researchers could run a simulation to virtually interact with the object.
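The modal idea can be sketched in miniature. The example below is an illustration under assumed numbers, not the team’s actual method: it treats an object’s observed motion as a mix of decaying sinusoids, reads off the dominant vibration modes with an FFT, and answers a virtual “poke” by letting each mode ring and decay.

```python
import numpy as np

# Hypothetical sketch (all names and numbers are illustrative): model an
# object's observed motion as a sum of damped sinusoids, find the dominant
# "vibration modes" in its spectrum, then simulate a virtual poke by
# superposing each mode's damped impulse response.

fps = 240
t = np.arange(2 * fps) / fps        # two seconds of observed motion

# Stand-in for motion extracted from video: two decaying modes, 6 Hz and 19 Hz
motion = (1.0 * np.exp(-1.5 * t) * np.sin(2 * np.pi * 6 * t)
          + 0.8 * np.exp(-1.5 * t) * np.sin(2 * np.pi * 19 * t))

# Identify modal frequencies as the two strongest spectral bins
spectrum = np.abs(np.fft.rfft(motion))
freqs = np.fft.rfftfreq(len(motion), d=1 / fps)
modes = freqs[np.argsort(spectrum)[-2:]]

print(sorted(modes.tolist()))       # → [6.0, 19.0]

# Simulate a virtual "poke": each recovered mode rings and decays
def respond(force, t, damping=1.5):
    return sum(force * np.exp(-damping * t) * np.sin(2 * np.pi * f * t)
               for f in modes)

response = respond(force=0.5, t=t)  # the object's simulated reaction over time
```

Once the modes are known, any virtual force can be answered by the same superposition, which is what lets a simulated object appear to push back without a full 3D model.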

So how could this apply to Pokémon Go?

“There is a broad range of applications for this work,” Davis said, “from monitoring the safety of bridges and buildings, to product testing, to special effects for movies, and ‘dynamic augmented reality,’” in which real-world objects can interact with and respond to virtual forces.

Davis admits, however, that it will still be a while before this technology ends up on mobile devices. Instead, it will likely be applied to movie special effects, as well as structural health monitoring to predict the safety of old bridges.

So, although we won’t see Pikachu virtually jump-start our car anytime soon, we may see IDV help make our world a little safer.