SK hynix CEO announces new vision as Full Stack AI Memory Creator: custom HBM, AI DRAM, AI NAND

SK hynix CEO outlines the company's updated vision of becoming the Full Stack AI Memory Creator, built on custom HBM, AI DRAM (AI-D), and AI NAND (AI-N).

TL;DR: SK hynix CEO Kwak Noh-Jung unveiled the "Full Stack AI Memory Creator" vision at the SK AI Summit 2025, emphasizing collaboration to overcome AI memory challenges. SK hynix aims to lead AI memory innovation with custom HBM, AI-optimized DRAM, and AI NAND, addressing the growing demand for high-performance semiconductor memory in AI computing.

SK hynix CEO Kwak Noh-Jung has just announced the company's new "Full Stack AI Memory Creator" vision at the SK AI Summit 2025 event, hosted in Seoul, South Korea, on November 3.


SK hynix is locked in a fierce memory battle with fellow South Korean memory maker Samsung over supplying the most -- and the best -- HBM for the AI chips of companies like NVIDIA and AMD. SK hynix dominated HBM3 and HBM3E supply to NVIDIA for its Hopper and Blackwell GPUs, but Samsung is catching up fast on next-gen HBM4, so SK hynix is jumping out ahead by highlighting its new Full Stack AI Memory Creator vision.

SK hynix CEO Kwak Noh-Jung said: "SK hynix has been playing the role of a Full Stack Memory Provider by supplying products aligned with customer needs and time. Moving forward, the company aims to exceed customers' expectations by actively collaborating within the ecosystem and by solving the customers' hurdles together. We will become a creator who builds 'Full Stack AI Memory' as a co-architect, partner, and eco-contributor."

The position and importance of semiconductor memory in the AI era

  • As AI adoption accelerates, data traffic is exploding, and the hardware technologies that support it need to develop rapidly.
  • However, memory performance is not keeping pace with processor advancements, and various measures are being applied to overcome this hurdle, known as the "Memory Wall" (a rough illustration follows this list).
  • With the growing importance of semiconductor memory to AI performance, it is evolving from an ordinary component into a "core value product" of the AI industry. Accordingly, the performance required of semiconductor memory has increased significantly, making it difficult to achieve with traditional approaches.
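
To make the Memory Wall concrete, here is a rough roofline-style sketch. All numbers in it (peak_compute_tflops, memory_bandwidth_tbps, the kernel intensities) are assumed placeholders for a hypothetical accelerator, not SK hynix or GPU-vendor figures; the point is only that a kernel performing few FLOPs per byte moved is capped by memory bandwidth long before it reaches the processor's peak.

```python
# Rough roofline-style illustration of the "Memory Wall".
# Every value here is an assumed placeholder for a hypothetical accelerator.

peak_compute_tflops = 1000.0   # assumed peak compute throughput, TFLOPS
memory_bandwidth_tbps = 5.0    # assumed HBM bandwidth, TB/s

def attainable_tflops(intensity_flops_per_byte: float) -> float:
    """Attainable throughput = min(peak compute, bandwidth x arithmetic intensity)."""
    bandwidth_bound = memory_bandwidth_tbps * intensity_flops_per_byte  # TB/s * FLOP/B = TFLOPS
    return min(peak_compute_tflops, bandwidth_bound)

# Break-even arithmetic intensity: below this, the kernel is memory-bound.
break_even = peak_compute_tflops / memory_bandwidth_tbps
print(f"Need ~{break_even:.0f} FLOPs per byte moved to saturate compute")

# A bandwidth-hungry kernel (a few FLOPs per byte, typical of low-batch AI
# inference) sits far below that threshold and runs at a fraction of peak.
for intensity in (1.0, 10.0, 100.0, break_even):
    print(f"{intensity:6.0f} FLOP/B -> {attainable_tflops(intensity):7.1f} TFLOPS attainable")
```

Raising the bandwidth term, or moving work closer to the memory itself, is the lever the product lineup below is aimed at.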

The new vision of SK hynix to prepare for the next AI era

  • Until now, SK hynix has focused on supplying the products customers need with a time-to-market approach. As a result, the company has become a leading global company as a "Full Stack AI Memory Provider".
  • However, as the importance of semiconductor memory grows, SK hynix believes the role of a provider will no longer fulfill the market's needs. The company's new goal is to be a "Full Stack AI Memory Creator".
  • "Creator" means SK hynix will solve customers' current challenges together with them and, furthermore, exceed customers' needs through active collaboration within the ecosystem. To this end, the company will build "Full Stack AI Memory" as a co-architect, partner, and eco-contributor in AI computing.

Full Stack AI Memory lineup

  • While memory solutions to date have been centered on computing, memory will in the future evolve in ways that diversify and expand its role - enabling more efficient use of computing resources and structurally resolving AI inference bottlenecks. New memory solutions may include SK hynix's Custom HBM, AI DRAM (AI-D), and AI NAND (AI-N).
  • (Custom HBM) As the AI market expands from commodity products toward inference efficiency and optimization, HBM is also evolving from conventional products into custom products. Custom HBM integrates certain GPU and ASIC functions into the HBM base die to reflect customer needs. This can maximize GPU and ASIC performance and reduce the power consumed transferring data to and from HBM, thereby enhancing system efficiency (see the sketch after this list).
  • (AI-D) DRAM has developed with a focus on commodity products and compatibility. However, SK hynix is now segmenting DRAM further to prepare memory solutions best suited to the needs of each segment.
  • First, the company is preparing "AI-D O (Optimization)", a low-power, high-performance DRAM that helps reduce total cost of ownership and improve operational efficiency. Second, to overcome the Memory Wall, it is developing "AI-D B (Breakthrough)", a solution featuring ultra-high-capacity memory with flexible memory allocation. Finally, from the perspective of expanding applications, it is preparing "AI-D E (Expansion)" to extend DRAM use cases into fields including robotics, mobility, and industrial automation.
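
As a companion to the Custom HBM item above, here is a minimal sketch of the data-movement-energy argument, under stated assumptions: the traffic volume and picojoule-per-bit figures are hypothetical placeholders (not SK hynix, JEDEC, or customer numbers), and transfer_energy_joules is a helper introduced only for this illustration. It shows how shaving per-bit interface energy, for example by folding some SoC-side interface functions into the HBM base die, scales into system-level savings.

```python
# Minimal sketch of the data-movement-energy argument for custom HBM.
# Every numeric value is an assumed placeholder, not a vendor figure.

BITS_PER_GB = 8e9

def transfer_energy_joules(traffic_gb: float, pj_per_bit: float) -> float:
    """Energy to move `traffic_gb` gigabytes across the memory interface."""
    return traffic_gb * BITS_PER_GB * pj_per_bit * 1e-12

traffic_gb = 80.0            # hypothetical data moved per inference step
baseline_pj_per_bit = 4.0    # assumed: conventional interface, all logic on the SoC side
custom_pj_per_bit = 2.5      # assumed: some interface functions folded into the base die

baseline_j = transfer_energy_joules(traffic_gb, baseline_pj_per_bit)
custom_j = transfer_energy_joules(traffic_gb, custom_pj_per_bit)
saving_pct = 100.0 * (1.0 - custom_j / baseline_j)
print(f"Baseline: {baseline_j:.2f} J/step, custom HBM: {custom_j:.2f} J/step "
      f"({saving_pct:.0f}% less interface energy under these assumptions)")
```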