Samsung begins mass production of high-bandwidth HBM2 for next-gen video cards

We’ve been promised a lot by Nvidia concerning its next-generation Pascal GPUs, and by AMD concerning its Polaris architecture, and part of delivering on those promises will come down to the benefits of High Bandwidth Memory (HBM) and its sequel, HBM2. As we saw with AMD’s Fury cards, though, insufficient stock of a new memory standard can lead to GPU shortages, which is why it’s good news to hear that Samsung is ramping up HBM2 manufacturing.

Samsung is now mass producing 4GB HBM2 packages, which will be vitally important for the next generation of GPUs from both big-name makers. Built on Samsung’s 20nm process, each package stacks four 8Gb core dies and provides as much as 256GB/s of bandwidth, more than five times what similar GDDR5 chips offer (as per TechReport).

That’s how much of a performance leap this new memory standard offers.
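For a rough sense of how those numbers fit together, here’s a quick back-of-the-envelope sketch; the GDDR5 comparison point (a 7Gbps, 32-bit chip) is our own illustrative assumption rather than a figure from Samsung or TechReport.

```python
# Rough sanity check of the HBM2 figures quoted above.
# The GDDR5 chip used for comparison is an assumed typical part, not an official figure.

DIE_CAPACITY_GBIT = 8        # each HBM2 core die holds 8 gigabits
DIES_PER_PACKAGE = 4         # Samsung stacks four dies per 4GB package

package_capacity_gbytes = DIE_CAPACITY_GBIT * DIES_PER_PACKAGE / 8
print(f"HBM2 package capacity: {package_capacity_gbytes:.0f} GB")   # -> 4 GB

HBM2_PACKAGE_BANDWIDTH_GBS = 256          # GB/s per package, as quoted above
GDDR5_CHIP_BANDWIDTH_GBS = 7 * 32 / 8     # assumed 7Gbps/pin x 32-bit GDDR5 chip = 28 GB/s

ratio = HBM2_PACKAGE_BANDWIDTH_GBS / GDDR5_CHIP_BANDWIDTH_GBS
print(f"One HBM2 package vs. one GDDR5 chip: roughly {ratio:.0f}x the bandwidth")
```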

And that’s why it’s so important that production keeps pace with the expected demand. Both Nvidia and AMD have been hyping their next-generation GPUs as game changers, promising explosive performance and much better energy efficiency.

This tech isn’t the end of the road, though. Samsung is also talking up 8GB HBM2 packages by the end of the year, which would open the door to graphics cards with more onboard memory than many entire systems have. The standard should also mean much smaller graphics cards, with HBM2 offering as much as a 95-percent space saving compared to GDDR5.
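To illustrate where that could lead, here’s a small sketch of what a card built from those 8GB packages might look like on paper; the four-package layout and the GDDR5 board-area baseline are purely illustrative assumptions, not Samsung specifications.

```python
# Hypothetical numbers for a card built from 8GB HBM2 packages.
# The four-package layout and the GDDR5 board-area baseline are illustrative assumptions.

PACKAGE_CAPACITY_GB = 8          # Samsung's planned 8GB HBM2 package
PACKAGES_PER_CARD = 4            # assumed number of stacks on one GPU interposer

total_memory_gb = PACKAGE_CAPACITY_GB * PACKAGES_PER_CARD
print(f"Total card memory: {total_memory_gb} GB")     # 32 GB, more than many desktops' system RAM

SPACE_SAVING = 0.95              # the quoted 95-percent space saving vs. GDDR5
gddr5_board_area_mm2 = 1500      # assumed area a comparable GDDR5 array might occupy
hbm2_board_area_mm2 = gddr5_board_area_mm2 * (1 - SPACE_SAVING)
print(f"Approximate HBM2 footprint: {hbm2_board_area_mm2:.0f} mm^2 vs. {gddr5_board_area_mm2} mm^2 for GDDR5")
```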

Of course, it’s not just graphics cards that stand to benefit from the new memory chips. Supercomputers, servers, and data centers could all see big speed boosts and energy-efficiency improvements thanks to the new standard.

Given those space savings, it likely won’t be long before HBM2 shows up in smartphones and tablets too, making for some very interesting new hardware developments.

Jon Martindale