
How imperceptible vibrations could take augmented reality to a new level

Pokémon Go and Interactive Dynamic Video
Pokémon Go is the biggest breakout hit of the year, and though it may be starting to slip from its colossal popularity peak, it’s still a game that millions enjoy on a daily basis. Part of what made it so eye-catching is its augmented reality feature. But as cool as that is, a pseudo-hovering Pokémon superimposed over the real world doesn’t feel very much like reality.

To make the game more immersive, we’d need some way for the pocket monsters to interact with the environment they’re in, and have it react back. How would that be possible? A research team at MIT believes it’s found a way — through the use of micro-vibrations.

“Essentially, we’re looking at different frequencies of vibration, which represent a different way that an object can move. By identifying those shapes and frequencies, we can predict how an object will react in new situations,” Abe Davis, the lead researcher on the project, told Digital Trends. Together with fellow researchers Justin Chen and Fredo Durand, Davis has built upon the team’s previous work on the concept of visual microphones to draw even more data from standard video.


“One way to think about it is: if I point my camera at a bush and I watch the wind rustle that bush for a whole minute, I’m watching a bunch of tiny movements of the bush, which are responses to various forces,” Davis explained.

Those movements are treated as vibrations at various frequencies. Software then takes the video and analyzes those vibrations, works out the kinds of forces that created those movements, and estimates how larger forces, or different combinations of the same forces, would make the object react.

By recording the bush’s reaction to the wind, the software can eventually figure out how it might react to a brick — or Pikachu.
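Davis’ published pipeline does this with far more sophisticated motion analysis than anything shown here, but a rough sketch helps make the recipe concrete. The Python below (using OpenCV and NumPy) is only an illustration under simplifying assumptions: the function names, the intensity-based frequency analysis, and the toy single-point response model are ours for the example, not the team’s code.

```python
# A rough, hypothetical sketch of the idea, not Davis' actual code: find the
# dominant vibration frequencies in a tripod-shot video, then use them to
# synthesize a plausible response to a new force.
import numpy as np
import cv2  # OpenCV, assumed available

def modal_frequencies(video_path, fps, num_peaks=5):
    """Estimate dominant vibration frequencies from a tripod-mounted video."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32))
    cap.release()

    stack = np.stack(frames)              # shape: (time, height, width)
    stack -= stack.mean(axis=0)           # keep only the tiny fluctuations
    # Average per-pixel spectra so frequencies shared by the whole object stand out.
    spectrum = np.abs(np.fft.rfft(stack, axis=0)).mean(axis=(1, 2))
    freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / fps)
    return sorted(freqs[np.argsort(spectrum)[-num_peaks:]])

def simulate_response(mode_freqs, force=1.0, damping=0.05, duration=2.0, fps=30):
    """Toy 1-D displacement of the recovered modes reacting to a new impulse."""
    t = np.arange(0.0, duration, 1.0 / fps)
    return sum(force * np.exp(-damping * 2 * np.pi * f * t) * np.sin(2 * np.pi * f * t)
               for f in mode_freqs)
```

The important step is the last one: once the dominant frequencies are known, a new force, one the camera never actually saw, can be “played back” through those modes to predict how the object would move.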

Bringing pocket monsters to life

Extracting more than just visual data from video became the focus of Davis’ interest throughout his time at MIT, and it was ultimately the core of his dissertation. Explaining how video can be used beyond the norm isn’t easy, though. When Pokémon Go was released, he saw a great way to break it down.

Davis is a Pokémon Go player, having reached level 19 at the time of our interview. We were even introduced to his most powerful Pokémon: Fluffles, a CP 1,592 Arcanine, who’s been tearing up the gyms in his local area. Fluffles was caught at the SIGGRAPH conference where Davis and his fellow researchers first showed off the vibration-modeling technology.

To use Pokémon as a showcase, Davis set his phone up on a tripod pointing in a fixed direction. He caught a Pokémon, then captured footage from that exact same position.

“I caught it and recorded about a minute of video, using the tripod for stability. I took that video back and processed it using the code I had written,” Davis said. The result was the video you see above.

A real-world bush that reacts (somewhat realistically) to a digital creation is much closer to the sort of augmented reality future we’ve all been promised. Indeed, it even goes further than some of what we’ve seen from Microsoft’s HoloLens and Magic Leap. Because of that, we shouldn’t expect this sort of technology to appear in the next Pokémon Go patch, or even its sequel.

“This might be something that’s more suitable for Pokémon Go 4 or 5, than Pokémon Go 2,” Davis cautioned.

That said, Davis and his fellow researchers had been working on this well before Pokémon Go was released, and there are many other potential applications for this technology beyond catching pocket monsters.

Shaking seconds off rendering CGI

What if, instead of rendering an entire explosion of an object or building, filmmakers could simply record video of the object and use this sort of algorithm to create a bare-bones animation? That has the potential to save huge chunks of time.

Of course, the artificially created movement Davis has shown doesn’t look quite as good as the latest CGI blockbuster, but that’s not a weakness of the technique. Davis simply isn’t an artist, and he has no idea how to polish his algorithm’s results.


“The most expensive CGI is the most expensive CGI, because you pay the most expensive artists to do the most expensive art,” Davis said, jokingly. “If you gave this tool to the world’s best artists, I suspect you could make it look really good.”

“It’s about giving artists the best starting point. That’s how a lot of technology and special effects are used. If you want to make something look really good, you don’t want a canned solution. You want your artists to dictate every aspect of the look and feel of the final product.”

Another exciting use for the technology may be found in architecture and insurance, where it could aid structural health monitoring. Vibration modes and frequencies are already used in those fields, but engineers rely on much more complicated capture techniques to acquire the data.

“Typically that data is captured through lasers and accelerometers that have to be placed on the object. The big advantage [with my technique] is that it’s very easy to point a camera at a building, but it’s pretty hard to paint a whole building with accelerometers or laser points,” said Davis. “This offers a convenient way to capture slightly lower-quality data, which is great for figuring out where you need to focus your attention.”
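Davis doesn’t describe a concrete monitoring workflow, but the appeal is easy to see with a toy example. Everything below is hypothetical: the baseline frequencies, the tolerance, and the flag_shifted_modes helper are invented purely to show how video-derived mode frequencies could be screened against a known-good survey before calling in the engineers.

```python
# Purely illustrative, not an engineering tool: compare mode frequencies
# recovered from video against a baseline survey of the same structure and
# flag any that have drifted enough to deserve a closer look.
BASELINE_HZ = [1.2, 3.7, 6.1]   # hypothetical "healthy" mode frequencies
TOLERANCE = 0.10                # flag anything that shifts by more than 10%

def flag_shifted_modes(measured_hz):
    """Return (baseline, measured) pairs whose frequency drifted past tolerance."""
    flagged = []
    for base, seen in zip(sorted(BASELINE_HZ), sorted(measured_hz)):
        if abs(seen - base) / base > TOLERANCE:
            flagged.append((base, seen))
    return flagged

print(flag_shifted_modes([1.19, 3.2, 6.0]))   # -> [(3.7, 3.2)]
```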


If a company can test a building’s structural integrity by just recording some video of it and throwing an algorithm at it, it’d be possible for an intern with a camera to do work that would previously have demanded a team of engineers.

Frame rates, resolution and magnification

Obviously, a commercial camera is much cheaper, easier to acquire and easier to operate than the technology this technique could help supplant. But there are certain hardware requirements that have a big effect on how well the algorithm works.

As with most video work, a tripod is essential. While it wouldn’t be too difficult to separate out vibrations that affect the entire frame from those that affect the subjects within it, that’s a step that can be all but eliminated by giving the camera a sturdy place to rest.
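As a hedged illustration of what that separation step might look like without a tripod: estimate a single whole-frame shift per frame (here with OpenCV’s phase correlation) and treat it as camera shake, leaving the residual motion as the subject’s own vibration. The global_shifts helper is invented for this sketch and is not part of Davis’ method.

```python
# A hypothetical sketch of separating camera shake from subject motion when no
# tripod is available: estimate one whole-frame shift per frame and treat it as
# camera movement. With a tripod this step is largely unnecessary.
import numpy as np
import cv2

def global_shifts(gray_frames):
    """Per-frame (dx, dy) shift of the whole scene relative to the first frame."""
    ref = gray_frames[0].astype(np.float32)
    shifts = []
    for frame in gray_frames:
        (dx, dy), _response = cv2.phaseCorrelate(ref, frame.astype(np.float32))
        shifts.append((dx, dy))
    return shifts
```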

The type of camera, and its quality, can be important, too.

“The frame rate of the camera can actually determine what frequencies you can recover,” Davis said. “If you’re doing special effects, the frequencies you want to simulate are the frequencies that you can see, so frame rate isn’t so important. However, if you wanted to simulate a detailed solid object, then having higher frequencies which are captured at a higher frame rate is going to help.”

In one instance, Davis and his team wanted to track the vibrations of a ukulele. Because of how quickly the strings of such an instrument vibrate, it was very important to use a high-frame-rate camera.
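The article doesn’t put numbers on this, but the constraint Davis describes is the standard sampling limit: a camera shooting at F frames per second can only recover vibrations below roughly F/2 Hz (the Nyquist frequency). Ukulele strings vibrate at hundreds of hertz, which is why an ordinary 30-fps camera wouldn’t cut it. The tiny helper below is just that arithmetic, with a made-up function name.

```python
# Back-of-the-envelope check on recoverable frequencies (Nyquist limit).
def max_recoverable_hz(fps):
    """Highest vibration frequency a camera at this frame rate can capture."""
    return fps / 2.0

for fps in (30, 240, 2000):
    print(f"{fps:>5} fps -> vibrations up to about {max_recoverable_hz(fps):.0f} Hz")
# A ukulele's open strings sit roughly in the 260-440 Hz range, so a standard
# 30 fps camera (15 Hz) falls far short, while a high-speed camera does not.
```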

Conclusion

With all of the potential uses of the video vibration analysis work that Davis and his peers have been conducting, where does the technology go from here?

Although Davis plans to continue working on it in the future, he doesn’t have any immediate plans to leverage it for financial gain. There will be no micro-vibrations-from-video startup that Google or some other mega-corporation buys out in the near future. Part of that is because MIT owns the patent, having applied for it defensively.

However, you have to imagine that the likes of Microsoft and Magic Leap will be keeping an eye on this sort of technology, as it could be great for augmented reality.

Davis himself has now finished his dissertation, a comprehensive paper covering all of the research he conducted at MIT, and will graduate this September before moving on to Stanford University for his postdoctoral work.

For more information on any of Davis’ research, you can find all of his papers and studies on his official site. He also covered several of the topics discussed here in his TED Talk.
