Samsung develops HBM with added AI processing power

by Mark Tyson on 17 February 2021, 12:11

Tags: Samsung (005935.KS)

Samsung has developed a new kind of memory product which it calls HBM-PIM. HEXUS regulars will be familiar with the first half of that abbreviation, HBM = High Bandwidth Memory, but it is worth spelling out that PIM is short for Processing-In-Memory.

"Our groundbreaking HBM-PIM is the industry's first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training and inference," said SVP of Memory Product Planning at Samsung Electronics, Kwangil Park. "We plan to build upon this breakthrough by further collaborating with AI solution providers for even more advanced PIM-powered applications".

Why bother making HBM-PIM? The idea is to bring powerful AI computing capabilities inside high-performance memory, and Samsung says the integration will benefit large-scale processing in data centres, high-performance computing (HPC) systems, and AI-enabled mobile applications. It explains that most of today's computer systems are based upon the von Neumann architecture, with separate processor and memory; in this arrangement data must constantly shuttle back and forth between the two, and that sequential traffic becomes the bottleneck. HBM-PIM instead implements "a DRAM-optimized AI engine inside each memory bank — a storage sub-unit — enabling parallel processing and minimizing data movement".
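
To make the data-movement point concrete, here is a purely conceptual Python sketch. It is not Samsung's HBM-PIM programming model or API; the bank count, element count and the simple summation workload are all invented for illustration. It contrasts a host-side reduction, where every element crosses the memory bus to the processor, with a bank-local reduction where only one partial result per bank is moved.

# Conceptual sketch only: a toy model of why processing-in-memory cuts data
# movement. Bank counts and the reduction workload are invented; this is not
# Samsung's HBM-PIM programming model or API.

NUM_BANKS = 16            # hypothetical number of memory banks
ELEMS_PER_BANK = 1024     # hypothetical elements stored in each bank

banks = [[i % 7 for i in range(ELEMS_PER_BANK)] for _ in range(NUM_BANKS)]

def host_side_sum(banks):
    """Von Neumann style: every element crosses the memory bus to the CPU."""
    transfers = 0
    total = 0
    for bank in banks:
        for value in bank:
            transfers += 1      # each element is moved to the processor
            total += value
    return total, transfers

def pim_style_sum(banks):
    """PIM style: each bank reduces its own data; only partials cross the bus."""
    transfers = 0
    total = 0
    for bank in banks:
        partial = sum(bank)     # stands in for the in-bank AI engine
        transfers += 1          # only one result per bank is moved
        total += partial
    return total, transfers

if __name__ == "__main__":
    t1, m1 = host_side_sum(banks)
    t2, m2 = pim_style_sum(banks)
    assert t1 == t2
    print(f"host-side: {m1} transfers, in-memory: {m2} transfers")

Both paths produce the same answer, but the in-memory version moves a small number of partial results instead of the whole data set, which is the essence of the data-movement saving Samsung is claiming.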

The practical benefits are quite startling. Applied to Samsung's existing HBM2 Aquabolt solution, the new architecture doubled system performance while reducing energy consumption by 70 per cent in the use cases Samsung tested. Interestingly, Samsung adds that "HBM-PIM does not require any hardware or software changes, allowing faster integration into existing systems".

One of the first places to test Samsung's HBM-PIM will be the U.S. Department of Energy's Argonne National Laboratory, home to several supercomputers including Mira and Theta, with the first US exascale system (Aurora) due to be delivered later this year.



HEXUS Forums :: 3 Comments

Interesting.
Even if it is probably not something that will benefit me as a general user.
If that part is programmable, wouldn't it be yet another security risk?
Gentle Viking;1338941
If that part is programmable, wouldn't it be yet another security risk?

Only on an Intel-based platform :P
Smart memory.
This is the obvious thing to do.
Neurons in brains don't have this hard division of compute vs storage.
And with pathways and interconnects using more and more of a processor's power budget, moving compute to memory avoids this.