Samsung teases industry-first 36GB HBM3e 12-Hi memory stack, coming soon

Samsung unveils the industry's largest-capacity HBM, with its new 36GB HBM3e 12H memory offering more performance and more capacity for AI GPUs.


Samsung has just announced that it has completed development of its new 12-Hi 36GB HBM3e memory stacks, hot on the heels of Micron's announcement that it has started mass production of its 8-Hi 24GB HBM3e memory... what an announcement.


Samsung's new HBM3e memory, codenamed Shinebolt, features 12-Hi 36GB stacks with 12 x 24Gb memory devices placed on a logic die with a 1024-bit memory interface. Samsung's new 36GB HBM3e memory modules feature 10GT/s transfer rates, offering next-gen AI GPUs up to 1.28TB/sec of memory bandwidth per stack, the industry's highest per-device (or per-module) memory bandwidth.
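For those who want to sanity-check those numbers, here is a minimal sketch in Python that derives the 36GB capacity and 1.28TB/sec bandwidth figures from the specs quoted above (the variable names are ours, purely for illustration):

```python
# Sanity check of Samsung's 12-Hi HBM3e figures, using only
# the numbers quoted in the article above.

devices_per_stack = 12     # 12-Hi stack: twelve DRAM devices
device_density_gbit = 24   # 24Gb per memory device
bus_width_bits = 1024      # 1024-bit memory interface
transfer_rate_gts = 10     # 10GT/s per pin

# Capacity: 12 devices x 24Gb = 288Gb, divided by 8 bits/byte = 36GB
capacity_gb = devices_per_stack * device_density_gbit / 8
print(f"Capacity per stack: {capacity_gb:.0f} GB")       # 36 GB

# Bandwidth: 1024 bits x 10GT/s = 10,240Gb/s = 1,280GB/s = 1.28TB/s
bandwidth_tb_s = bus_width_bits * transfer_rate_gts / 8 / 1000
print(f"Bandwidth per stack: {bandwidth_tb_s:.2f} TB/s") # 1.28 TB/s
```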

Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics, said in the press release: "The industry's AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need. This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era."

Samsung is using multiple advanced technologies in its new Shinebolt 12-Hi 36GB HBM3e memory stacks, with the 36GB HBM3e products based on memory devices built on Samsung's 4th-generation 10nm-class (14nm) fabrication technology, known as 1α, which uses extreme ultraviolet (EUV) lithography.

Samsung uses its advanced thermal compression non-conductive film (TC NCF), which allows it to achieve the industry's smallest gap between memory devices at seven micrometers (7µm). Shrinking the gaps between the DRAM dies increases vertical density and mitigates chip die warping.

The company estimates its new 12-Hi 36GB HBM3e modules can increase the average speed of AI training by up to 34%, while increasing the number of simultaneous users of inference services by a whopping 11.5x, though Samsung didn't detail the size of the LLM behind those figures.

Samsung is already sampling its new 12-Hi 36GB HBM3e memory modules to customers, with mass production scheduled for the first half of 2024.

