
Not even Nvidia’s RTX 4090 can handle Star Wars Jedi: Survivor

Nvidia’s RTX 4090 is overkill for the vast majority of games, but it seems to have met its match in Star Wars Jedi: Survivor. That’s right — even the best graphics card struggles to maintain more than 35 frames per second (fps) in EA’s upcoming title.

Of course, this kind of performance is unintended, and it stems entirely from poor optimization. Will your computer be able to handle the game when it launches tomorrow?

Cal holding his lightsaber with BD-1 on his shoulder.

Star Wars Jedi: Survivor is just a day away from its official launch date, but it seems that EA’s efforts are far from over. Several reports from early players and reviewers indicate that the game is terribly optimized right now, with issues like low frame rates and insane VRAM usage plaguing those who already got to play it.


Many games launch well before they’re optimized — looking at you, Cyberpunk 2077 — but the problems that affect Jedi: Survivor seem bad enough to almost render it unplayable. There currently doesn’t seem to be a graphics card that can offer steady 60 fps, and that should never be the case when overpriced beasts like the RTX 4090 exist.


EckhartsLadder on YouTube tried to run the game on a computer equipped with an RTX 3080 Ti. While it’s a last-gen card, the RTX 3080 Ti is a high-end offering that should still be able to handle the most demanding games being released today. Unfortunately, that isn’t true for Jedi: Survivor — the YouTuber found that frame rates maxed out at around 50 fps, and even tweaking the settings did nothing to raise those numbers. The game also had its fair share of other issues, such as audio problems and cutscenes that didn’t work.

GameStar, a German gaming outlet with its own YouTube channel, battled similar problems on an even more high-end machine. A rig equipped with an RTX 4090, 32GB of RAM, and a Ryzen 9 5900X couldn’t handle running Jedi: Survivor at 1440p. Let’s not forget that playing anything at 1440p is a waste with an RTX 4090; that card was made for 4K. Even at this lower resolution, the game averaged around 35 to 45 fps, only rarely hitting 60 fps in some areas.

In its current state, Jedi: Survivor is a massive VRAM hog. It uses up to 21GB of VRAM while only utilizing about 50% of the card’s total power. All in all, this doesn’t sound like an AAA game that’s ready to launch.
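For context on where figures like these come from: readers who want to check their own VRAM usage and GPU utilization can poll Nvidia’s `nvidia-smi` tool, which ships with the driver. The sketch below is a rough illustration, not the testers’ actual methodology — though the query flags shown are standard `nvidia-smi` options:

```python
import csv
import io
import subprocess


def parse_gpu_sample(line: str) -> dict:
    """Parse one line of `nvidia-smi --format=csv,noheader,nounits` output,
    e.g. "21504, 50" (VRAM used in MiB, GPU utilization in percent)."""
    used_mib, util_pct = next(csv.reader(io.StringIO(line)))
    return {
        "vram_used_gib": int(used_mib) / 1024,
        "gpu_util_pct": int(util_pct.strip()),
    }


def sample_gpu() -> dict:
    """Query the first GPU's memory use and utilization via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    return parse_gpu_sample(out)


if __name__ == "__main__":
    try:
        s = sample_gpu()
        print(f"VRAM: {s['vram_used_gib']:.1f} GiB, GPU: {s['gpu_util_pct']}%")
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not available on this machine")
```

Run in a loop alongside the game, a script like this would show the same pattern the reviewers reported: VRAM climbing toward 21GB while GPU utilization sits far below 100%.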

Cal’s newest journey in a galaxy far, far away has begun and we’re excited for you to experience it!

Our first patch will arrive on launch day across all platforms. In the weeks ahead, we’ll deploy patches that will:

– Fix bugs
– Improve performance
– Add more accessibility… pic.twitter.com/pUtyoGopP5

— EA Star Wars (@EAStarWars) April 26, 2023

Fortunately, it seems that everyone involved is trying to fix these problems before more people get access to Star Wars Jedi: Survivor. EA has promised a day-one patch with optimizations and bug fixes, and pre-release patches are already rolling out regularly. On the driver side, Nvidia has released a new Game Ready Driver that zeroes in on Jedi: Survivor with game-specific optimizations.

Will all of these efforts turn out to be enough when the game is available to the general public? Let’s hope so, because as things stand, the game is headed for some poor early reviews.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…