
Samsung enters GPU tech licensing talks with AMD and Nvidia

Samsung Galaxy Note 7. Julian Chokkattu/Digital Trends
Over the past several years, Samsung has emerged as one of the biggest players in the smartphone market — in July, it was reported that the Galaxy S7 had managed to outsell the iPhone 6S over a three-month period. Now, there’s word that the company is preparing a change to the way its mobile chips are designed that could help it secure an even more dominant position.

Samsung is apparently pursuing a longstanding desire to develop a GPU in-house for use with its mobile processors. The company is in talks with both AMD and Nvidia with a view to licensing their GPU technologies, according to a report from SamMobile.

As far back as 2014, there were rumors that Samsung had hired engineers from companies like AMD, Nvidia, and Intel in order to accelerate its plans for internal GPU development. At one point, the firm was expected to pair its Exynos chipset with a proprietary GPU in the Galaxy Note 5 — but this never came to pass.


At present, Samsung uses the Mali series of GPUs developed by ARM for its Exynos chipsets. However, based on these talks with AMD and Nvidia, it seems that the current arrangement may come to an end sooner rather than later.

Shifting to internal development of GPUs would allow Samsung to cut the costs accrued by outsourcing the component, even allowing for the licensing fees it would need to pay to whichever company’s technology it uses. It remains to be seen whether the end goal is simply larger profit margins, or the ability to undercut its rivals on price.

Given that the company is still in talks with both AMD and Nvidia, the situation remains very fluid. However, this weekend’s reporting indicates that Nvidia is currently the front-runner, thanks to the strength of its Pascal architecture.

Editors' Recommendations

Brad Jones
Former Digital Trends Contributor
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…
Meet Blackwell, Nvidia’s next-generation GPU architecture
Nvidia introducing its Blackwell GPU architecture at GTC 2024.

We finally have our first taste of Nvidia's next generation of GPUs, named Blackwell. Sure, they're built for enterprises, and no, they won't run Cyberpunk 2077 (at least not officially). But this is the first look we've had at what Nvidia is cooking up for its RTX 50-series GPUs, which are rumored to launch sometime in the next year.

The GPU we have today is the B200 -- Blackwell 200, if you can spot it -- which comes packed with 208 billion transistors. The architecture is built on TSMC's 4NP node, which is an enhanced version of the 5nm node. That's a little surprising, given that Nvidia's Ada Lovelace GPUs are built on TSMC's 4N node -- just one refinement step away from 4NP. Nvidia notes that it's using a custom version of this process, however.

Read more
This new GPU feature is ‘a whole new paradigm’ for PC gaming
RX 7900 XTX slotted into a test bench.

Microsoft has released its Agility SDK 1.613.0, which features some critical components that will be shown to developers at the Game Developers Conference (GDC) in San Francisco next week. The most interesting component is Work Graphs, which Microsoft describes as "a whole new paradigm" for graphics cards.

Work Graphs enable GPU-driven work. Normally when you're playing a PC game, there's a fixed division of labor between your CPU and GPU: the CPU prepares work and sends it to the GPU, and the GPU then executes that work. Work Graphs are an approach that allows the GPU to schedule and execute its own tasks, which has some massive implications for performance.
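To make that distinction concrete, here's a minimal conceptual sketch in C++. It does not use the actual Direct3D 12 Work Graphs API -- every type and name in it is illustrative -- but it contrasts the traditional model, where the CPU submits every pass, with a GPU-driven model in which a finished task can enqueue its own follow-up work.

```cpp
#include <cstdio>
#include <queue>
#include <string>
#include <vector>

// A unit of GPU work. Its "kind" decides what follow-up work it produces.
// (Illustrative only; not a real Direct3D 12 type.)
struct WorkItem {
    std::string kind;
};

// Traditional model: the CPU decides and submits every pass up front.
void cpuDrivenFrame() {
    const std::vector<std::string> passes = {"cull", "shade", "post"};
    for (const auto& p : passes)
        std::printf("CPU submits: %s\n", p.c_str());  // one CPU decision per pass
}

// Work-graph style: the CPU submits only the root item; each item processed
// on the "GPU" side can enqueue work for downstream stages itself.
void gpuDrivenFrame() {
    std::queue<WorkItem> gpuQueue;
    gpuQueue.push({"cull"});  // the only CPU-side submission

    while (!gpuQueue.empty()) {
        WorkItem item = gpuQueue.front();
        gpuQueue.pop();
        std::printf("GPU executes: %s\n", item.kind.c_str());

        // A stage decides, based on its own results, what work comes next,
        // with no CPU round trip in between.
        if (item.kind == "cull")
            gpuQueue.push({"shade"});  // e.g. only visible geometry gets shaded
        else if (item.kind == "shade")
            gpuQueue.push({"post"});
    }
}

int main() {
    cpuDrivenFrame();
    gpuDrivenFrame();
    return 0;
}
```

The point of the second loop is simply that no CPU round trip is needed between passes; in the real feature, that scheduling happens on the GPU itself through shaders organized as nodes in a graph.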

Read more
AMD is finally taking FreeSync to the next level
Two monitors with AMD FreeSync over a dark background.

It took a long time, but AMD has just updated the requirements for its FreeSync adaptive sync technology, and it was a much-needed change. Previously, the base tier of FreeSync didn't set any refresh rate requirements that monitors had to meet. Now, AMD hasn't just added a requirement -- the one it added is pretty demanding -- and that's great news for the future of gaming monitors.

When AMD first introduced FreeSync in 2015, the vast majority of gamers and casual users alike were using a 60Hz monitor. While screens with higher refresh rates existed, they were a rarity. That's no longer the case today, and almost all of the top monitors, regardless of their price, offer refresh rates of over 120Hz.

Read more