SK hynix showed off its next-gen HBM4 memory at TSMC's recent North American Technology Symposium, with up to 16-Hi stacks and 2TB/sec memory bandwidth per stack, ready for NVIDIA's next-gen Vera Rubin AI hardware.

SK hynix showed off both 12-Hi and 16-Hi stacks of HBM4 memory, featuring capacities of up to 48GB, up to 2TB/sec of memory bandwidth per stack, and I/O speeds rated at 8Gbps. The South Korean memory leader announced mass production for 2H 2025, with the new memory landing inside AI GPUs by the end of this year and HBM4-powered AI GPUs like NVIDIA's next-gen Vera Rubin flooding the market in 2026.
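Those two headline numbers are consistent with each other: a quick back-of-the-envelope check, assuming HBM4's 2048-bit-wide interface per stack from the JEDEC HBM4 standard (the 8Gbps per-pin I/O speed is the figure SK hynix quoted):

```python
# Sanity-check the ~2TB/sec per-stack bandwidth figure.
# Assumption: 2048 data pins per HBM4 stack (JEDEC HBM4 standard);
# 8Gbps per-pin I/O speed is from SK hynix's announcement.

io_speed_gbps = 8            # per-pin I/O speed, Gb/s
interface_width_bits = 2048  # data pins per stack (assumed from JEDEC spec)

bandwidth_gb_per_sec = io_speed_gbps * interface_width_bits / 8  # Gb/s -> GB/s
print(f"{bandwidth_gb_per_sec:.0f} GB/s per stack")  # prints "2048 GB/s per stack"
```

That 2048GB/sec works out to roughly the 2TB/sec per stack that SK hynix is claiming.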

SK hynix's world-leading HBM4 memory chips are destined for NVIDIA's next-gen Vera Rubin AI GPUs, with NVIDIA's current GB300 "Blackwell Ultra" AI GPUs still riding on HBM3E and the company shifting fully to HBM4 starting with Vera Rubin. SK hynix also pointed out that it has managed the high layer counts by using its Advanced MR-MUF and TSV technologies.
- Read more: JEDEC releases HBM4 standard: ready for next-gen AI and HPC memory
- Read more: SK hynix confirms HBM4 + HBM4E memory coming this year for next-gen AI GPUs
- Read more: NVIDIA's next-gen Vera Rubin NVL576 AI server: 576 Rubin AI GPUs, 12672C/25344T CPU, new HBM4
SK hynix also showed off its family of server memory modules, with its RDIMM and MRDIMM high-performance server modules now being built on the latest 1c DRAM process, which pushes module speeds up to an impressive 12.8Gbps. SK hynix said: "Notably, SK hynix exhibited a range of modules designed to enhance AI and data center performance while reducing power consumption. These included the MRDIMM lineup with a speed of 12.8 gigabits per second (Gbps) and capacities of 64 GB, 96 GB, and 256 GB; RDIMM modules with a speed of 8 Gbps in 64 GB and 96 GB capacities; and a 256 GB 3DS RDIMM".
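For a sense of what that 12.8Gbps MRDIMM figure means at the module level, here is a rough sketch assuming a standard 64-bit DDR5 module data bus (ECC bits excluded; the per-pin rate is the one SK hynix quoted):

```python
# Rough per-module bandwidth implied by the 12.8Gbps MRDIMM figure.
# Assumption: a 64-bit DDR5 data bus per module (excludes ECC bits).

data_rate_gbps = 12.8  # per-pin transfer rate quoted by SK hynix
data_width_bits = 64   # DDR5 module data width (assumption)

bandwidth_gb_per_sec = data_rate_gbps * data_width_bits / 8  # Gb/s -> GB/s
print(f"{bandwidth_gb_per_sec:.1f} GB/s per module")  # prints "102.4 GB/s per module"
```

By the same arithmetic, the 8Gbps RDIMM modules would land around 64GB/sec per module, which is why the MRDIMM lineup is the headline act for AI and data center workloads.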