Samsung will debut its next-gen HBM4 memory at NVIDIA GTC 2026 in March, reportedly passing all of NVIDIA's strict verification stages, and will arrive on NVIDIA's next-gen Vera Rubin AI platform.

Samsung has spent the last couple of years struggling with its HBM memory division, leaving its South Korean rival -- SK hynix -- to supply NVIDIA with virtually all of its HBM3 and HBM3E needs. Samsung has since completely overhauled its HBM and semiconductor divisions, and the fruits of that labor are now showing.
NVIDIA will reportedly source its first allotments of HBM4 memory for Vera Rubin from Samsung, as Samsung's new HBM4 is said to lead the competing offerings from SK hynix and US-based Micron. Samsung's new HBM4 memory is rated for pin speeds above 11Gbps, much higher than the JEDEC standard for HBM4, and those higher pin speeds were requested by NVIDIA directly.
- Read more: Samsung lands deal with NVIDIA on HBM4: collaboration on next-gen HBM
- Read more: SK hynix + Samsung + Micron fighting for NVIDIA supply contracts on 16-Hi HBM4
Samsung and NVIDIA are working together on HBM4, with Samsung explaining in its press release that, with incredibly high bandwidth and energy efficiency, its advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for the manufacturing infrastructure driven by these technologies.
- Read more: NVIDIA + Samsung working on new semiconductor AI factory, with 50,000+ GPUs
- Read more: Samsung readying mass production of next-gen HBM4 memory in 2026
- Read more: NVIDIA asked for 9Gbps HBM4, then 10-11Gbps: Samsung HBM4 ready at 10Gbps+
- Read more: Samsung's new 1c DRAM yields improve: new chairman admits prior mistakes, ready for HBM4
The company is using its 6th-generation 10nm-class DRAM and a 4nm logic base die, with Samsung's upcoming HBM4 reaching processing speeds of up to 11Gbps, far exceeding the JEDEC standard for HBM4 of 8Gbps. Samsung will also continue to deliver next-generation memory solutions, including HBM, GDDR, and SOCAMM memory, as well as foundry services, driving innovation and scalability across the global AI value chain.
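For context on what those pin speeds mean, per-stack bandwidth scales linearly with pin speed. Assuming the JEDEC HBM4 interface width of 2048 bits per stack (an assumption here, not stated in the article), a quick back-of-envelope sketch:

```python
# Back-of-envelope HBM4 per-stack bandwidth from pin speed.
# Assumes the JEDEC HBM4 I/O width of 2048 bits per stack; pin speeds
# are from the article (8Gbps JEDEC baseline, ~11Gbps for Samsung's HBM4).

INTERFACE_BITS = 2048  # assumed JEDEC HBM4 interface width per stack

def stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    """Per-stack bandwidth in GB/s: interface bits * Gbps / 8 bits-per-byte."""
    return INTERFACE_BITS * pin_speed_gbps / 8

print(stack_bandwidth_gbs(8.0))   # JEDEC baseline: 2048.0 GB/s (~2 TB/s)
print(stack_bandwidth_gbs(11.0))  # at 11Gbps: 2816.0 GB/s (~2.8 TB/s)
```

Under those assumptions, the jump from 8Gbps to 11Gbps is worth roughly 0.8 TB/s of extra bandwidth per stack, which is why NVIDIA pushed for the higher pin speeds.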
