
Nvidia faces attacks from AMD, Intel, and even Google. Should it be worried?


“Ray tracing is here,” Nvidia CEO Jensen Huang optimistically proclaimed in a press interview at GTC. “It’s all about ray tracing, ray tracing, ray tracing.”

Nvidia’s wide support of ray tracing – from desktops to laptops to servers and the cloud – signals a strong push behind the new technology, despite early reports suggesting that initial RTX adoption has been slow.

Nvidia’s confidence might be unshakable, but attacks from its competitors continue to grow in number. AMD has 7nm GPUs, Google is launching a game streaming platform, and even Intel is planning its re-entry into the graphics game. But from Nvidia’s perspective, that’s only more evidence that it’s headed in the right direction.


The game streaming wars


Bringing RTX to servers opens Nvidia up to new audiences and new possibilities. Designers and creatives can rely on the RTX cloud to render images in real time and collaborate on projects through the new Nvidia Omniverse platform, while gamers can leverage the power of Nvidia graphics on the GeForce Now game streaming platform, even if their local rigs lack high-end graphics cards.

But Nvidia has some new competition in this realm.

While Nvidia was promoting GeForce Now, now in its second year of beta, Google announced Stadia, a competing game streaming platform, at the Game Developers Conference (GDC) in San Francisco, just fifty miles north of GTC. Though Huang admitted he doesn’t know the specifics behind Stadia, which runs on rival AMD’s custom graphics in Google’s data centers, Nvidia is leveraging the familiarity of the GeForce brand in its cloud approach to gaming.

“What we decided to do essentially in a world where gaming is free to play is to build the service for the billion PC customers who don’t have GeForces,” Huang said, alluding to the fact that GeForce Now is targeted at gamers who want enthusiast-level performance but lack the resources to make that happen. “For them, getting access to GeForce in the cloud must be very nice because their PC isn’t strong enough or too old, or the game doesn’t run on Linux or a Mac.”


Driven by its relationships with publishers and the economics of the gaming industry, Nvidia is careful to point out that you must own a title to play it; it’s not creating an all-access subscription service.

“We don’t believe that Netflix for gaming is the right approach,” Huang said. According to Nvidia, the decision to play a game, especially a multiplayer title like PUBG, is generally driven by what a gamer’s friends are already playing, so a Netflix model used to surface and discover new titles won’t work. “So our strategy is to leave the economics completely to the publishers, to not get in the way of their relationship with the gamers. Our strategy with GeForce Now is to build the servers and host the service on top.”

Now in its second year of beta, GeForce Now, which can be used to stream more than 500 games, is home to more than 300,000 gamers, with a wait list a million strong. Nvidia will upgrade GeForce Now to enable RTX as early as the second or third quarter of this year. “The next build of GeForce Now servers from now on will be RTX, so ray tracing on every single server,” Huang said.

As Nvidia continues to invest in game streaming, it’s looking to scale the service, drive down costs so gamers can access free-to-play titles, and build more data centers at the edge as part of its hybrid cloud approach to minimize latency. Through partnerships with telecoms, Nvidia hopes to build servers in every country in the world.

Threats from AMD and Intel

Nvidia has been betting on graphics as the way forward for computing since its founding. And with Moore’s Law showing its limitations on the CPU side, Nvidia is confident that solving the world’s most meaningful problems requires a strong GPU. With more data collected today than ever before, making sense of all that information requires a lot of computing power. “Our GPU has made it possible for computation to be done very quickly,” Huang said.

“The CPU is too rare a resource, so you have to offload when you can,” he explained, while welcoming Intel into the graphics market. “Even Intel believes that accelerators are the right path forward, and that’s a good thing. Quantum computing is still decades away, so why don’t we find an approach that is practical, that is here now based on linear computing. So for them to acknowledge that is really fantastic.”

It’s not just newcomers like Intel that are looking to erode Nvidia’s leadership in the graphics market. Rival AMD briefly dominated the conversation in graphics earlier this year when it became the first to announce a 7nm gaming GPU with its Vega-based Radeon VII, boasting a boost in performance and better thermals. In comparison, the Turing architecture that Nvidia introduced for RTX is built on a larger 12nm process. Huang, however, appeared unfazed by what AMD is doing.

AMD CEO Lisa Su

“7nm process is open for sale,” Huang mocked. “TSMC would love to sell it to us. What is the genius of a company if we just buy somebody else’s wafers? And what is the benefit of your contribution to their wafers?” Nvidia’s genius is its engineering, and for Huang, the results are about performance and energy, not size alone.

Despite using a larger process node, Huang claimed that Nvidia’s superior engineering allows Turing to outperform its rival. “The energy efficiency is so good, even compared to someone else’s 7nm. It is lower cost, it is lower energy, it has higher performance, and it has more features.”

The technology that made the most sense for Turing at the time was TSMC’s 12nm FinFET process customized for Nvidia, and the company invested considerable time and money engineering the chip’s architecture with TSMC to deliver the performance it wanted.

As competition heats up in graphics, game streaming, and data centers, Huang remains optimistic about what Nvidia can deliver “because we have good engineers and the software is excellent.”
