
Samsung adds A.I. processing to double high-bandwidth memory speeds

High-bandwidth memory, or HBM, is already fast. But Samsung wants to make it even faster. The South Korea-based technology giant has announced its HBM-PIM architecture, which aims to double system performance by building artificial intelligence processing directly into the memory.

PIM, which stands for processing-in-memory, brings A.I. computing capabilities inside the memory itself, and Samsung hopes that its HBM-PIM tech will be used in applications such as data centers and high-performance computing (HPC) machines in the future.

“Our groundbreaking HBM-PIM is the industry’s first programmable PIM solution tailored for diverse A.I.-driven workloads such as HPC, training, and inference,” Kwangil Park, Samsung Electronics senior vice president of memory product planning, said in a statement. “We plan to build upon this breakthrough by further collaborating with A.I. solution providers for even more advanced PIM-powered applications.”

A potential client for Samsung’s HBM-PIM is the Argonne National Laboratory, which hopes to use the technology to solve “problems of interest.” The lab noted that HBM-PIM addresses the memory bandwidth and performance challenges of HPC and A.I. computing by delivering impressive performance and power gains.

According to Samsung, the HBM-PIM works by placing a DRAM-optimized A.I. engine inside each memory bank (a storage sub-unit), enabling parallel processing and minimizing data movement.
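
To make the idea concrete, here is a minimal conceptual sketch in Python. The MemoryBank class and pim_sum function are hypothetical illustrations of the processing-in-memory pattern described above, not Samsung’s actual programming interface: each bank reduces its own data locally, so only small partial results ever leave the memory.

```python
# Conceptual sketch only: MemoryBank and pim_sum are hypothetical names used
# to illustrate the processing-in-memory idea, not Samsung's actual API.
from concurrent.futures import ThreadPoolExecutor

class MemoryBank:
    """One memory bank paired with its own small compute engine (the PIM idea)."""
    def __init__(self, data):
        self.data = list(data)   # the data stays resident in the bank

    def local_sum(self):
        # The per-bank engine reduces its own data; the raw data never leaves.
        return sum(self.data)

def pim_sum(banks):
    # Every bank computes in parallel; only tiny partial results move out.
    with ThreadPoolExecutor() as pool:
        partials = pool.map(MemoryBank.local_sum, banks)
        return sum(partials)

banks = [MemoryBank(range(i, i + 1000)) for i in range(0, 4000, 1000)]
print(pim_sum(banks))   # only four partial sums travel back to the host
```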

“When applied to Samsung’s existing HBM2 Aquabolt solution, the new architecture is able to deliver over twice the system performance while reducing energy consumption by more than 70%,” the company stated. “The HBM-PIM also does not require any hardware or software changes, allowing faster integration into existing systems.”

This is different from existing solutions, which are all based on the von Neumann architecture. In current systems, a separate processor and separate memory units carry out all data processing tasks sequentially, which requires data to travel back and forth and often results in a bottleneck when handling large volumes of data.
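
For contrast, here is a purely illustrative sketch of the conventional pattern the article describes, where the full data set must first cross the memory bus to a single processor before any work happens; the function name von_neumann_sum is made up for illustration.

```python
# Purely illustrative contrast: in the conventional (von Neumann) pattern,
# every element is first copied over the memory bus to the processor,
# which then does all the work on its own.
def von_neumann_sum(banks_data):
    gathered = []
    for data in banks_data:
        gathered.extend(data)    # the full data set crosses the memory bus
    return sum(gathered)         # a single processor then reduces it serially

banks_data = [list(range(i, i + 1000)) for i in range(0, 4000, 1000)]
print(von_neumann_sum(banks_data))
```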

By removing the bottleneck, Samsung’s HBM-PIM can be a useful tool in a data scientist’s arsenal. Samsung says the HBM-PIM is now being tested inside A.I. accelerators by leading A.I. solution partners, with validation expected to be completed in the first half of 2021.
