
OpenGL 4 Looks to Take on DirectX 11


High-end gaming has been fixated on DirectX 11 technology lately, but that doesn’t mean OpenGL is out of the picture: the Khronos Group has just announced the OpenGL 4.0 specification, the first major update to the open graphics standard since the launch of OpenCL in late 2008. OpenGL 4 builds on the work of OpenCL by enabling modern applications to tap into the computing power of graphics processors, boosting performance and freeing up a computer’s main processor for other tasks. In addition, the OpenGL 4 standard includes support for hardware-accelerated geometry tessellation (subdividing surfaces into finer geometry on the GPU), can render content and run shaders with 64-bit double-precision accuracy, and sports improved shaders for better rendering quality and antialiasing.

“The release of OpenGL 4.0 is a major step forward in bringing state-of-the-art functionality to cross-platform graphics acceleration, and strengthens OpenGL’s leadership position as the epicenter of 3D graphics on the web, on mobile devices as well as on the desktop,” said OpenGL ARB working group chair (and Nvidia’s senior manager for Core OpenGL) Barthold Lichtenbelt, in a statement.

Not all existing hardware can handle OpenGL 4’s capabilities, so Khronos has also released an OpenGL 3.3 specification to bring as much OpenGL 4 functionality as possible to existing GPU hardware. Nvidia says its forthcoming Fermi-based GPUs will fully support OpenGL 4.0.
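
For developers curious what targeting the new version looks like in practice, here is a minimal sketch (not from the announcement) that asks the driver for an OpenGL 4.0 core-profile context and, if one is available, configures the new hardware tessellation stage described above. GLFW and GLEW are assumptions used purely as convenient context- and extension-loading libraries; any equivalent would do.

```c
/*
 * Minimal sketch: request an OpenGL 4.0 core-profile context and set up
 * tessellation patch input. GLFW 3 and GLEW are illustrative assumptions.
 */
#include <stdio.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Ask for a 4.0 core-profile context. Drivers limited to OpenGL 3.3
     * (the fallback specification mentioned above) will fail this request. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow *window = glfwCreateWindow(640, 480, "OpenGL 4.0 check", NULL, NULL);
    if (!window) {
        fprintf(stderr, "No OpenGL 4.0 context available on this GPU/driver\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    glewExperimental = GL_TRUE;   /* required by GLEW for core profiles */
    if (glewInit() != GLEW_OK) {
        glfwTerminate();
        return 1;
    }

    printf("Context version: %s\n", (const char *)glGetString(GL_VERSION));

    /* New in OpenGL 4.0: geometry is fed to the tessellator as patches;
     * here each patch is declared to carry three control points. */
    glPatchParameteri(GL_PATCH_VERTICES, 3);

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```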

From a developer’s point of view, the advantage of OpenGL is that applications targeting the 3D graphics platform can be deployed across a broad range of devices: high-end desktop systems, mobile devices like the iPhone via OpenGL ES, and potentially even Web browsers via WebGL, which aims to bring OpenGL ES 2.0 to browsers through HTML5’s canvas element. WebGL is backed by Mozilla, Opera, Apple, and Google.

Geoff Duncan
Former Digital Trends Contributor