In the wipes that blend scenes rendered in DirectX 11 and DirectX 12, the DX12 versions appear not only crisper and more detailed, with far more atmospheric fog and dust effects, but also more colorful in most scenes.
Of course, that may be because it's much harder to make frame rates exciting in a video, but performance is the biggest gain DX12 delivers, alongside a few other features. That's why plenty of gameplay footage is thrown in as well, even if, as WinBeta points out, Gears of War: Ultimate Edition hasn't had the smoothest of launches.
In reality, as great as DX12 is for gamers, it may end up being more of a boon for developers. Much like AMD's original Mantle API and the recently unveiled Vulkan, which was built on Mantle's skeleton, the idea behind DX12 is to give developers a platform much closer to the metal, letting them tap raw CPU and GPU power with minimal overhead getting in the way.
In games like Ashes of the Singularity, the first game with a DX12 benchmark available, that means thousands of units can be present in a single scene, making Stardock's strategy title look quite unlike any game that has come before it, even in a hotly contested genre like the RTS.
Combined with the new generation of high-performance graphics cards from both AMD and Nvidia set to launch this year, DX12 could deliver a real leap in visual fidelity and overall gaming performance over the next year or so.