New advance in 3D graphics may make the next Avengers movie look even more realistic

Are you having trouble distinguishing real-life objects from digital creations in the latest movies and TV shows? Well, the two are only going to get harder to tell apart from here on out, thanks to a new method developed by a group of researchers.

In short, they’ve figured out how to improve the way graphics software can render light as it interacts with extremely small details on the surface of materials. That means you could finally see the metallic glitter in Captain America’s shield, the tiny sparkles in the Batmobile’s paint job, and super-realistic water animations.

As a brief explainer, the reflections of light coming off a material’s small surface details are called “glints,” and until now, graphics software could only render glints in stills. But according to Professor Ravi Ramamoorthi at the University of California San Diego, his team of researchers has improved the rendering method, enabling software to create those glints 100 times faster than the current method. That speedup will allow glints to be used in actual animations, not just in stills.

Ramamoorthi and his colleagues plan to reveal their rendering method at SIGGRAPH 2016 in Anaheim, California, later this month. He indicated that today’s super-high display resolutions are what drove him and his team to create a new rendering technique. Right now, graphics software assumes that the material surface is smooth, and artists fake metal finishes and metallic car paints by using flat textures. The result can be grainy and noisy.

“There is currently no algorithm that can efficiently render the rough appearance of real specular surfaces,” Ramamoorthi said in a press release. “This is highly unusual in modern computer graphics, where almost any other scene can be rendered given enough computing power.”

Ocean

The new rendering solution reportedly doesn’t require a massive amount of computational power. The method takes an uneven, detailed surface and breaks each of its pixels down into patches covered in thousands of microfacets, which are light-reflecting points smaller than a pixel. For each microfacet, the software then computes a vector perpendicular to the surface of the material, known as that point’s “normal.” This normal is used to figure out how light actually reflects off the material.
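
As a rough illustration of that step only (this is not the team’s code, and the function and parameter names are ours), here is how a per-point normal might be derived from a small height map describing one bumpy patch of material:

```python
import numpy as np

def heightfield_normals(height, cell_size=1.0):
    # 'height' is a small 2D array of surface heights for one patch.
    # Finite differences give the local slope along each axis.
    slope_y, slope_x = np.gradient(height, cell_size)
    # Build a vector perpendicular to the local surface at every point.
    normals = np.stack([-slope_x, -slope_y, np.ones_like(height)], axis=-1)
    # Normalize each normal to unit length.
    return normals / np.linalg.norm(normals, axis=-1, keepdims=True)
```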

According to Ramamoorthi, a microfacet will reflect light back to the virtual camera only if its normal resides “exactly halfway” between the ray projected from the light source and the ray that bounces off the material’s surface. The distribution of the collective normals within each patch of microfacets is calculated and then used to figure out which of the normals are actually in that halfway position.
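
In code, that “exactly halfway” test amounts to comparing a microfacet’s normal with the so-called half vector between the light direction and the camera direction. The sketch below is a simplified stand-in rather than the researchers’ implementation, and the names and tolerance are our own assumptions:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def facet_reflects_toward_camera(facet_normal, to_light, to_camera, tolerance=1e-3):
    # The half vector sits exactly halfway between the direction toward
    # the light source and the direction toward the camera.
    half_vector = normalize(normalize(to_light) + normalize(to_camera))
    # The facet sends light to the camera only if its normal lines up
    # with that half vector; a small tolerance stands in for "exactly."
    return float(np.dot(normalize(facet_normal), half_vector)) > 1.0 - tolerance
```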

Ultimately, what makes this method faster than the current rendering algorithm is that it works with this distribution instead of calculating how light interacts with each individual microfacet. Ramamoorthi said that it’s able to approximate the normal distribution at each surface location and then compute the amount of net reflected light easily and quickly. In other words, expect to see highly realistic metallic, wooden, and liquid surfaces in more movies and TV shows in the near future.
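
To make that speedup concrete, the toy sketch below replaces a per-microfacet loop with simple summary statistics of a patch’s normals. The Gaussian-style summary is purely our assumption for illustration; the paper’s position-normal distributions are considerably more sophisticated:

```python
import numpy as np

def patch_reflectance(facet_normals, half_vector, base_roughness=0.05):
    # facet_normals: (N, 3) array of unit normals gathered from one patch.
    # Summarize the whole patch instead of testing every microfacet.
    mean_normal = facet_normals.mean(axis=0)
    mean_normal /= np.linalg.norm(mean_normal)
    spread = facet_normals.std(axis=0).mean() + base_roughness

    # Estimate how much of the patch's normal distribution lines up with
    # the half vector: tight, well-aligned patches reflect the most light.
    alignment = float(np.dot(mean_normal, half_vector))
    return np.exp(-((1.0 - alignment) ** 2) / (2.0 * spread ** 2))
```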

To check out the full paper, Position-Normal Distributions for Efficient Rendering of Specular Microstructure, grab the PDF file here. The other researchers on Ramamoorthi’s team are Ling-Qi Yan from the University of California, Berkeley; Milos Hasan from Autodesk; and Steve Marschner from Cornell University. The images provided with the report were supplied to the press by the Jacobs School of Engineering at the University of California San Diego.

Is that water render just awesome, or what?
