Samsung enters GPU tech licensing talks with AMD and Nvidia

[Image: Samsung Galaxy Note 7. Credit: Julian Chokkattu/Digital Trends]
Over the past several years, Samsung has emerged as one of the biggest players in the smartphone market; in July, the Galaxy S7 was reported to have outsold the iPhone 6S over a three-month period. Now, there's word that the company is preparing a change to how it builds its mobile chips, one that could help it secure an even more dominant position.

Samsung is apparently pursuing a longstanding goal of developing a GPU in-house for use with its mobile processors. The company is in talks with both AMD and Nvidia with a view to licensing their GPU technology, according to a report from SamMobile.

As far back as 2014, there were rumors that Samsung had hired engineers away from companies like AMD, Nvidia, and Intel to accelerate its plans for internal GPU development. At one point, the firm was expected to pair the Exynos chipset in the Galaxy Note 5 with a proprietary GPU, but this never came to pass.

At present, Samsung uses the Mali series of GPUs developed by ARM for its Exynos chipsets. However, based on these talks with AMD and Nvidia, it seems that the current arrangement may come to an end sooner rather than later.

Shifting GPU development in-house would allow Samsung to cut the costs it currently accrues by outsourcing the component, even after accounting for the licensing fees owed to whichever company's technology it adopts. It remains to be seen whether the end goal is simply larger profit margins or the ability to undercut its rivals on price.

Given that the company is still in talks with both AMD and Nvidia, the situation remains fluid. However, this weekend's reporting indicates that Nvidia is currently the front-runner, thanks to the strength of its Pascal architecture.

Brad Jones
Former Digital Trends Contributor
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…