Leaked photo shows upcoming Intel/AMD module on a compact motherboard

Intel/AMD module
As reported last week, Intel teamed up with AMD to create a multi-chip module (MCM) consisting of Intel’s seventh-generation processor cores, AMD’s previous-generation “Polaris” Radeon graphics cores, and built-in “stacked” memory dedicated to graphics. Renders provided by Intel show the three components in a rectangular package that plugs into a motherboard. Now a photograph of the Intel/AMD module has surfaced, giving us a real-world glimpse.

Both the photograph and Intel’s official render show the processor cores housed in one chip, and the Radeon graphics cores in another chip parked next to the HBM2 video memory. All three are mounted on a small enclosed circuit board that also contains a dedicated “highway” to quickly pass data between the three components. The module itself is mounted on a small motherboard akin to Intel’s Next Unit of Computing (NUC) all-in-one, small-form-factor PCs.

Speculation stemming from the now-removed photograph pegs the graphics component as possibly AMD’s Polaris 20 graphics chip used on the Radeon RX 580 graphics card, as both appear similar in size. Meanwhile, the height of the HBM2 memory stack suggests a maximum capacity of 4GB, backing up previous benchmark leaks showing two modules sporting 4GB of HBM2 memory each.

But leaked benchmarks also reveal that the Radeon graphics component reports 24 compute units, which translates into 1,536 stream processors. The Radeon RX 580 consists of 36 compute units (2,304 stream processors), the Radeon RX 570 has 32 compute units (2,048 stream processors), and the RX 560 consists of 16 or 14 compute units, depending on the card. In other words, what we’re seeing is the custom Radeon graphics component Intel confirmed in its official announcement.
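For readers curious where those stream-processor figures come from: in AMD’s GCN-based designs (including Polaris), each compute unit contains 64 stream processors, so the counts above fall out of simple multiplication. A quick sketch:

```python
# In AMD's GCN architecture (including Polaris), each compute unit
# contains 64 stream processors.
STREAM_PROCESSORS_PER_CU = 64

def stream_processors(compute_units: int) -> int:
    """Derive the stream-processor count from the compute-unit count."""
    return compute_units * STREAM_PROCESSORS_PER_CU

# The parts discussed above:
for name, cus in [("Leaked module", 24), ("RX 580", 36), ("RX 570", 32), ("RX 560", 16)]:
    print(f"{name}: {cus} CUs -> {stream_processors(cus):,} stream processors")
```

This is why a 24-CU part lands at 1,536 stream processors, well below the RX 570’s 2,048.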

That said, the module’s Radeon component should land at a performance level between the RX 570 and the RX 560 or, in Nvidia terms, between the GeForce GTX 1050 Ti and the GTX 1060. Adding to that, the Core i7-8809G module will supposedly have a base graphics speed of 1,190MHz and a memory bus clocked at 800MHz. Meanwhile, the slower Core i7-8705G model will supposedly have a 1,011MHz base graphics speed and a memory bus clocked at 700MHz.

Outside the pictured module itself, the motherboard seen in the photograph appears similar in size to Intel’s larger NUC kits. Two memory slots reside next to Intel’s upcoming module, along with an M.2 slot supporting SSDs measuring 22mm x 80mm. You can also see two SATA connectors and one side packed full of ports, two of which are likely “stacked” USB ports, with one possibly serving as HDMI output.

The photograph emerges after Intel revealed that AMD’s former head of its Radeon graphics division, Raja Koduri, is now spearheading a new department at Intel dedicated to high-end graphics. The announcement arrived just after Intel revealed its collaboration to produce modules, and Koduri’s move from AMD to Intel is likely part of that collaboration. Intel’s new “Core and Visual Computing Group” will place the company as a third player in the high-end graphics market, competing with AMD and Nvidia.

Intel’s new modules are slated to be made available for PC makers in the first quarter of 2018.

Kevin Parrish
Former Digital Trends Contributor