SK hynix has started sampling its next-gen 12-Hi HBM4 memory modules, the world's first HBM4, to customers -- most notably NVIDIA, which will use HBM4 in its upcoming Rubin AI GPUs.

During NVIDIA's recent GTC 2025 event, AI memory partner SK hynix unveiled its new 12-Hi HBM4, 12-Hi HBM3E (for NVIDIA's new GB300 AI GPU), and new SOCAMM memory modules.
SK hynix's new HBM4 memory uses its Advanced MR-MUF process to push capacities up to 36GB, the highest among 12-Hi HBM products. NVIDIA's new Rubin and Rubin Ultra AI GPU platforms will use HBM4 memory for its high speeds: bandwidth exceeds 2TB/sec for the first time, a more than 60% jump over HBM3E.
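A quick sanity check on those figures (the bandwidth and per-movie size are assumptions taken straight from SK hynix's announcement, using decimal units):

```python
# Sanity check of SK hynix's HBM4 throughput claims (decimal units assumed).
bandwidth_gb_per_s = 2000   # claimed: more than 2 TB/s per 12-Hi HBM4 stack
movie_size_gb = 5           # one full-HD movie, per SK hynix's own example

# How many 5GB movies fit through a 2TB/s pipe in one second?
movies_per_second = bandwidth_gb_per_s / movie_size_gb
print(movies_per_second)    # → 400.0, matching the "400 full-HD movies" claim

# Implied HBM3E bandwidth if HBM4 is 60% faster
hbm3e_gb_per_s = bandwidth_gb_per_s / 1.6
print(hbm3e_gb_per_s)       # → 1250.0, i.e. roughly 1.25 TB/s per stack
```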
SK hynix said: "The samples were delivered ahead of schedule based on SK hynix's technological edge and production experience that have led the HBM market, and the company is to start the certification process for the customers. SK hynix aims to complete preparations for mass production of 12-layer HBM4 products within the second half of the year, strengthening its position in the next-generation AI memory market".

"The 12-layer HBM4 provided as samples this time feature the industry's best capacity and speed which are essential for AI memory products. The product has implemented bandwidth capable of processing more than 2TB (terabytes) of data per second for the first time. This translates to processing data equivalent to more than 400 full-HD movies (5GB each) in a second, which is more than 60 percent faster than the previous generation, HBM3E"".
"SK hynix also adopted the Advanced MR-MUF process to achieve the capacity of 36GB, which is the highest among 12-layer HBM products. The process, of which competitiveness has been proved through a successful production of the previous generation, helps prevent chip warpage, while maximizing product stability by improving heat dissipation. Following its achievement as the industry's first provider to mass produce HBM3 in 2022, and 8- and 12-high HBM3E in 2024. SK hynix has been leading the AI memory market by developing and supplying HBM products in a timely manner".
Justin Kim, President & Head of AI Infra at SK hynix said: "We have enhanced our position as a front-runner in the AI ecosystem following years of consistent efforts to overcome technological challenges in accordance with customer demands. We are now ready to smoothly proceed with the performance certification and preparatory works for mass production, taking advantage of the experience we have built as the industry's largest HBM provider".