Samsung has just announced its latest adventure in HBM technology: HBM-PIM, with the 'PIM' standing for processing-in-memory. But what the hell does that mean?
The new HBM-PIM technology allows for some programmability inside the memory layer itself, thanks to an embedded "DRAM-optimized" AI engine built into each memory bank. This AI engine is called a PCU, or Programmable Compute Unit, and it handles data between your CPU and memory in a parallelized way.
This matters because most computing systems today are based on the von Neumann architecture, which separates the processor and memory units, forcing data to shuttle back and forth between them for millions of individual processing tasks. As you can imagine, that traffic gets congested, so HBM-PIM cuts down on it by placing processing power directly inside each memory bank via that DRAM-optimized AI engine.
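The benefit of moving compute into the memory banks can be sketched in a few lines of Python. This is a toy model, not Samsung's actual interface or the PCU's real instruction set: it simply counts how many values have to cross the CPU-memory bus when a reduction runs on the CPU versus "inside" each bank, where only the partial results travel.

```python
# Toy illustration (hypothetical, not Samsung's API) of why
# processing-in-memory reduces CPU <-> memory traffic.

def von_neumann_sum(banks):
    """CPU-side reduction: every single value crosses the memory bus."""
    transferred = 0
    total = 0.0
    for bank in banks:
        for value in bank:
            transferred += 1      # each value is moved to the CPU
            total += value
    return total, transferred

def pim_sum(banks):
    """PIM-style reduction: each bank reduces its own data locally,
    so only one partial sum per bank crosses the bus."""
    transferred = 0
    total = 0.0
    for bank in banks:
        partial = sum(bank)       # computed "inside" the memory bank
        transferred += 1          # only the partial result is moved
        total += partial
    return total, transferred
```

With 16 banks of 1,024 values each, both paths produce the same result, but the conventional path moves 16,384 values across the bus while the PIM-style path moves only 16 partial sums, which is the kind of traffic reduction that drives the bandwidth and power savings Samsung is claiming.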
In testing of Samsung's new HBM-PIM on its current Aquabolt HBM2 memory, the company saw performance double while power consumption dropped by over 70%, which is impressive for early testing.
Kwangil Park, senior vice president of Memory Product Planning at Samsung Electronics said: "Our groundbreaking HBM-PIM is the industry's first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training and inference. We plan to build upon this breakthrough by further collaborating with AI solution providers for even more advanced PIM-powered applications".
Rick Stevens, Argonne's Associate Laboratory Director for Computing, Environment and Life Sciences said: "I'm delighted to see that Samsung is addressing the memory bandwidth/power challenges for HPC and AI computing. HBM-PIM design has demonstrated impressive performance and power gains on important classes of AI applications, so we look forward to working together to evaluate its performance on additional problems of interest to Argonne National Laboratory".