MIT researchers develop simulation to manipulate real-world objects in videos

Pokémon Go just passed the 100 million player mark, and it’s still growing. The game’s success owes much to millennial nostalgia, but also to the app’s clever use of augmented reality (AR), which overlays the Pokémon universe on the real world through the smartphone’s camera.

But even though you can toss a virtual Poké Ball at a virtual Pikachu on the sidewalk, traditional augmented reality doesn’t let virtual objects interact with objects in the real world. Pikachu can’t electrify a puddle, bump into a signpost, or jump up on the curb.

A team of Massachusetts Institute of Technology researchers led by Abe Davis, a Ph.D. student at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and Justin Chen, a postdoctoral associate in its Department of Civil and Environmental Engineering, might help blur that line between augmented reality and the real world. The team has developed Interactive Dynamic Video (IDV), a technique they believe could change industries from engineering to entertainment.

Two years ago, Davis and his team recorded silent videos of objects that vibrated when they were hit by sound. Using a processing algorithm, the researchers were able to recover music and speech from these tiny vibrations, effectively eavesdropping through a bag of potato chips. That discovery prompted Davis to explore vibrations further and inspired him to see whether an algorithm could re-create an object’s movement by analyzing its almost imperceptible vibrations.
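For a rough sense of the idea, the sketch below treats the average brightness of a small image patch as a stand-in for an object’s displacement and reads it out frame by frame. This is only an illustration under simplified assumptions, not the published “visual microphone” algorithm, which relies on high-speed footage and multi-scale phase analysis; every name in it is invented for the example.

```python
# Toy sketch, not the published visual-microphone algorithm: use per-frame
# brightness changes in one image patch as a crude proxy for the object's
# motion. `frames` and `recover_signal` are illustrative names only.
import numpy as np

def recover_signal(frames, rows=slice(100, 132), cols=slice(100, 132)):
    """Turn brightness fluctuations of a patch into a 1-D, audio-like trace."""
    trace = frames[:, rows, cols].mean(axis=(1, 2))  # one value per frame
    return trace - trace.mean()                      # remove the static offset

# Real recovery would use high-frame-rate footage of, say, a bag of chips;
# synthetic frames here just show the shapes involved.
frames = np.random.default_rng(1).standard_normal((2400, 240, 320))
signal = recover_signal(frames)
print(signal.shape)  # (2400,): one sample per video frame
```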

“There are some classic techniques used in engineering and physically based animation to simulate objects, but they depend on a full 3D model,” Davis told Digital Trends. “I got this crazy idea that you could do something similar with a video instead, and it ended up working surprisingly well.”

Davis and his team used a traditional camera to record various objects as they vibrated ever so slightly: a bush blowing in the breeze, a guitar string being lightly touched. The algorithm then identified “vibration modes,” the characteristic ways a particular object sways or rocks when it is disturbed. From there, the researchers can run a simulation that lets them virtually interact with the object.
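As a very rough illustration of that pipeline, and emphatically not the CSAIL implementation, the sketch below picks out an object’s dominant vibration frequencies from a stack of video frames with a per-pixel temporal FFT, then drives those modes with a damped-oscillator response to a virtual impulse. The frame array, frame rate, and all function names are assumptions made for the example.

```python
# Minimal sketch (not the CSAIL implementation): estimate vibration modes from
# a video and simulate a response to a virtual "poke". Assumes `frames` is a
# NumPy array of grayscale frames, shape (num_frames, height, width), captured
# at `fps` frames per second; all names here are illustrative.
import numpy as np

def estimate_modes(frames, fps, num_modes=3):
    """Pick the dominant vibration frequencies and their spatial patterns."""
    signal = frames - frames.mean(axis=0)            # remove the static scene
    spectrum = np.fft.rfft(signal, axis=0)           # per-pixel temporal FFT
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
    power = np.abs(spectrum).sum(axis=(1, 2))        # total energy per frequency
    power[0] = 0.0                                   # ignore the DC component
    top = np.argsort(power)[-num_modes:]             # strongest frequencies
    # A mode "shape" here is just the complex spatial pattern at each frequency.
    return [(freqs[i], spectrum[i]) for i in top]

def simulate_response(modes, impulse=1.0, damping=0.05, fps=30, seconds=2.0):
    """Damped-oscillator response of each mode to a virtual impulse."""
    t = np.arange(0, seconds, 1.0 / fps)
    responses = []
    for freq, shape in modes:
        omega = 2 * np.pi * freq
        amplitude = impulse * np.exp(-damping * omega * t) * np.sin(omega * t)
        # Broadcast the modal time history over the spatial mode shape.
        responses.append(amplitude[:, None, None] * shape.real[None, :, :])
    return sum(responses)  # summed displacement field over time

# Example with synthetic data standing in for a real recording:
rng = np.random.default_rng(0)
frames = rng.standard_normal((120, 64, 64)).astype(np.float32)
modes = estimate_modes(frames, fps=30)
motion = simulate_response(modes, impulse=0.5)
print(motion.shape)  # (frames, height, width) displacement field over time
```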

So how could this apply to Pokémon Go?


“There is a broad range of applications for this work,” Davis said, “from monitoring the safety of bridges and buildings, to product testing, to special effects for movies, and ‘dynamic augmented reality,’” in which real-world objects can interact with and respond to virtual forces.

Davis admits, however, that it will still be a while before this technology ends up on mobile devices. Instead, it will likely be applied to movie special effects, as well as structural health monitoring to predict the safety of old bridges.

So, although we won’t soon see Pikachu virtually jump-start our car, we may see IDV help make our world a little safer.
