Samsung has just announced it has completed development of its new 12-Hi 36GB HBM3e memory stacks, coming hot on the heels of Micron's announcement that it has started mass production of its 8-Hi 24GB HBM3e memory... what an announcement.
Samsung's new HBM3e memory, codenamed Shinebolt, features 12-Hi 36GB stacks built from 12 x 24Gb memory devices placed on a logic die with a 1024-bit memory interface. Samsung's new 36GB HBM3e memory modules feature 10GT/s transfer rates, offering next-gen AI GPUs up to 1.28TB/sec of memory bandwidth per stack, the industry's highest per-device (or per-module) memory bandwidth.
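Those headline numbers check out with some quick back-of-the-envelope math: a minimal sketch, using only the figures quoted above (the variable names are my own, not Samsung's).

```python
# Back-of-the-envelope check of the per-stack figures quoted in the article.
# All inputs come from the article; this is illustrative arithmetic only.

GBIT_PER_DEVICE = 24      # each DRAM die holds 24 gigabits
STACK_HEIGHT = 12         # 12-Hi stack (12 dies)
BUS_WIDTH_BITS = 1024     # HBM3e interface width per stack
TRANSFER_RATE_GTPS = 10   # 10 GT/s per pin

# 12 dies x 24 Gb = 288 Gb, divided by 8 to convert bits to bytes
capacity_gb = GBIT_PER_DEVICE * STACK_HEIGHT / 8

# 1024 pins x 10 GT/s = 10240 Gb/s = 1280 GB/s = 1.28 TB/s
bandwidth_tbps = BUS_WIDTH_BITS * TRANSFER_RATE_GTPS / 8 / 1000

print(f"Capacity per stack:  {capacity_gb:.0f} GB")       # 36 GB
print(f"Bandwidth per stack: {bandwidth_tbps:.2f} TB/s")  # 1.28 TB/s
```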
Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics, said in the press release: "The industry's AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need. This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era."
- Read more: Samsung hoarding 2.5D packaging equipment, preps for NVIDIA's next-gen AI GPU
- Read more: SK hynix and Samsung are both sold out of their HBM3 memory until 2025
- Read more: Samsung's new HBM3e 'Shinebolt' memory: 50% perf boost for AI GPUs
Samsung is using multiple advanced technologies for its new Shinebolt 12-Hi 36GB HBM3e memory stacks, with the 36GB HBM3e memory products based on memory devices made on Samsung's 4th-generation 10nm-class (14nm) fabrication technology, which uses extreme ultraviolet (EUV) lithography.
Samsung uses its advanced thermal compression non-conductive film (TC NCF), which allows it to achieve the industry's smallest gap between memory devices at seven micrometers (7µm). Shrinking the gaps between DRAM dies both increases vertical density and mitigates chip die warping.
The company estimates its new 12-Hi 36GB HBM3e modules can increase the average speed of AI training by up to 34%, while increasing the number of simultaneous users of inference services by a whopping 11.5x, though the company didn't detail the size of the LLM behind those figures.
Samsung is already sampling its new 12-Hi 36GB HBM3e memory modules to customers, with mass production scheduled for the first half of 2024.