SK hynix is the leader in AI memory with its next-gen HBM4, and the company is promising its even newer HBM4E memory for later this year, with mass production of its new 12-layer HBM4 due at the same time.

Choi Jun-yong, Vice President of HBM (High Bandwidth Memory) Business Planning at SK hynix, recently said: "We will further solidify our HBM leadership by not only mass producing (6th generation HBM) HBM4 12-layer this year, but also timely supplying (7th generation) HBM4E".
In the same interview, published in the SK hynix newsroom, Choi said: "In parallel with the development of new HBM, we will provide optimal solutions to various customer needs through custom HBM that meets the specialized needs of customers".
He added: "We have been preparing for the HBM market patiently and for a long time. As all members silently took on the challenge with a one-team spirit, I believe we have created the opportunity to successfully develop HBM4. We will do our best to preemptively prepare for the market and make optimized business plans".
- Read more: SK hynix ships world's first 12-layer HBM4 samples to customers, ready for NVIDIA Rubin AI GPUs
SK hynix recently began shipping its first 12-layer HBM4 memory chip samples, with the company explaining: "The samples were delivered ahead of schedule based on SK hynix's technological edge and production experience that have led the HBM market, and the company is to start the certification process for the customers. SK hynix aims to complete preparations for mass production of 12-layer HBM4 products within the second half of the year, strengthening its position in the next-generation AI memory market".
"The 12-layer HBM4 provided as samples this time feature the industry's best capacity and speed which are essential for AI memory products. The product has implemented bandwidth capable of processing more than 2TB (terabytes) of data per second for the first time. This translates to processing data equivalent to more than 400 full-HD movies (5GB each) in a second, which is more than 60 percent faster than the previous generation, HBM3E"".
- Read more: SK hynix hits 70% yield on its new HBM4 12-Hi memory: ready for NVIDIA's next-gen Rubin AI GPUs
"SK hynix also adopted the Advanced MR-MUF process to achieve the capacity of 36GB, which is the highest among 12-layer HBM products. The process, of which competitiveness has been proved through a successful production of the previous generation, helps prevent chip warpage, while maximizing product stability by improving heat dissipation. Following its achievement as the industry's first provider to mass produce HBM3 in 2022, and 8- and 12-high HBM3E in 2024. SK hynix has been leading the AI memory market by developing and supplying HBM products in a timely manner".
Justin Kim, President & Head of AI Infra at SK hynix, said: "We have enhanced our position as a front-runner in the AI ecosystem following years of consistent efforts to overcome technological challenges in accordance with customer demands. We are now ready to smoothly proceed with the performance certification and preparatory works for mass production, taking advantage of the experience we have built as the industry's largest HBM provider".