Samsung will reportedly show off its new HBM3E 12-Hi stack memory at NVIDIA's upcoming GPU Technology Conference (GTC) between March 18 and 21.

NVIDIA is a major partner for Samsung's bleeding-edge HBM memory, which sits at the heart of its upcoming AI GPUs: the Hopper H200 (a beefed-up, HBM3E-based version of the H100) and the next-gen Blackwell B100, both of which feature HBM3E memory.
Samsung will exhibit its new high-capacity 36GB HBM3E 12-Hi stack memory in physical form at NVIDIA GTC 2024. Samsung is expected to begin mass production in the coming months, with NVIDIA as a major customer for its H200 and B100 AI GPUs.
HBM competitor and South Korean rival SK hynix recently provided NVIDIA with samples of its new 36GB HBM3E 12-Hi stack memory. Meanwhile, Samsung will have its HBM3E 12-Hi memory physically on display to the world at GTC 2024.
- Read more: Samsung teases industry-first 36GB HBM3e 12-Hi memory stack, coming soon
- Read more: Samsung hoarding 2.5D packaging equipment for NVIDIA's next-gen AI GPU
- Read more: SK hynix and Samsung are both sold out of their HBM3 memory until 2025
- Read more: Samsung's new HBM3e 'Shinebolt' memory: 50% perf boost for AI GPUs
NVIDIA is expected to push even harder into the world of AI at its upcoming GPU Technology Conference. We expect to hear all about its upgraded H200 AI GPU with HBM3E memory, as well as the full unveiling of its next-generation Blackwell GPU architecture and the new B100 AI GPU, which is set to take the #1 position as the fastest AI GPU on the planet.