Samsung has finally inked its long-awaited HBM memory deal with NVIDIA, under which it will supply next-gen HBM4 memory chips running at up to 11Gbps for NVIDIA's next-gen Rubin AI GPUs in 2026.

After what has felt like forever for Samsung to get its HBM3 and HBM3E memory certified by NVIDIA, the two companies are now closer than ever, collaborating on HBM4 as well as on a new AI factory powered by 50,000+ of NVIDIA's AI GPUs.
- Read more: NVIDIA + Samsung working on new semiconductor AI factory, with 50,000+ GPUs
- Read more: Samsung readying mass production of next-gen HBM4 memory in 2026
- Read more: NVIDIA asked for 9Gbps HBM4, then 10-11Gbps: Samsung HBM4 ready at 10Gbps+
- Read more: Samsung's new 1c DRAM yields improve: new chairman admits prior mistakes, ready for HBM4
Samsung and NVIDIA are working together on HBM4, with Samsung explaining in its press release that, thanks to incredibly high bandwidth and energy efficiency, its advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for the manufacturing infrastructure driven by these technologies.
The company is using its 6th-generation 10nm-class DRAM and a 4nm logic base die, with pin speeds on its upcoming HBM4 reaching up to 11Gbps, far exceeding the 8Gbps JEDEC standard for HBM4. Samsung will also continue to deliver next-generation memory solutions, including HBM, GDDR, and SOCAMM memory, as well as foundry services, driving innovation and scalability across the global AI value chain.
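
For context, here is a quick back-of-the-envelope sketch of what those pin speeds mean in per-stack bandwidth, assuming the 2048-bit interface defined by the JEDEC HBM4 spec. The article only quotes per-pin speeds; interface width and stack counts per GPU are not stated here, so treat these figures as rough estimates.

```python
# Rough bandwidth math for the HBM4 pin speeds mentioned in the article.
# Assumption: a 2048-bit interface per HBM4 stack (JEDEC HBM4 width) -
# the article itself does not state interface width or stack counts.

INTERFACE_BITS = 2048  # bits per HBM4 stack (assumed, per JEDEC HBM4)

def per_stack_bandwidth_gbps(pin_speed_gbps: float) -> float:
    """Return per-stack bandwidth in GB/s for a given per-pin data rate in Gb/s."""
    return pin_speed_gbps * INTERFACE_BITS / 8  # divide by 8 to convert bits to bytes

for pin_speed in (8.0, 10.0, 11.0):
    print(f"{pin_speed:>4} Gbps per pin -> ~{per_stack_bandwidth_gbps(pin_speed):,.0f} GB/s per stack")

# Expected output:
#  8.0 Gbps per pin -> ~2,048 GB/s per stack   (JEDEC baseline, ~2 TB/s)
# 10.0 Gbps per pin -> ~2,560 GB/s per stack
# 11.0 Gbps per pin -> ~2,816 GB/s per stack   (Samsung's claimed top speed, ~2.8 TB/s)
```

Under that assumption, the jump from the 8Gbps JEDEC baseline to Samsung's claimed 11Gbps works out to roughly 37% more bandwidth per stack.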




