SK hynix has announced volume production of its new 12-layer HBM3E memory, with up to 36GB capacities and speeds of 9.6Gbps.
The South Korean memory leader announced it has started mass production of the world's first 12-layer HBM3E memory at 36GB, the largest capacity of any HBM to date. SK hynix plans to supply the mass-produced 12-layer HBM3E chips to customers (including NVIDIA) within the year, just six months after it became the first in the industry to deliver 8-layer HBM3E to customers in March 2024.
SK hynix is a key player in the world of AI chips, with NVIDIA using its HBM3 and HBM3E memory inside its Hopper H100 and H200 AI GPUs, and HBM3E also featuring in its new Blackwell AI GPUs. SK hynix has been leading the industry with HBM, and its new 12-layer HBM3E chips run at up to 9.6Gbps per pin, the highest memory speed on the market.
SK hynix says that at 9.6Gbps, a single AI GPU fitted with four of its new 12-layer HBM3E stacks could read all 70 billion parameters of a Llama 3 70B LLM 35 times in a single second. Not too shabby at all, SK hynix.
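That "35 reads per second" figure checks out with some quick back-of-envelope math. Here is a sketch of the calculation, assuming a standard 1024-bit HBM3E interface per stack and 16-bit (FP16/BF16) weights, neither of which is stated in the announcement:

```python
# Back-of-envelope check of SK hynix's Llama 3 70B claim.
PIN_SPEED_GBPS = 9.6    # per-pin data rate from the announcement
PINS_PER_STACK = 1024   # standard HBM3E interface width (assumption)
STACKS = 4              # four HBM3E products on one GPU, per the claim
PARAMS = 70e9           # Llama 3 70B parameter count
BYTES_PER_PARAM = 2     # assuming FP16/BF16 weights (assumption)

# Total memory bandwidth in bytes per second across all four stacks
bandwidth_bytes = PIN_SPEED_GBPS * 1e9 / 8 * PINS_PER_STACK * STACKS

# How many times the full parameter set can be streamed per second
reads_per_second = bandwidth_bytes / (PARAMS * BYTES_PER_PARAM)
print(f"{reads_per_second:.1f} full-model reads per second")  # → 35.1
```

Four stacks at roughly 1.2TB/s each gives about 4.9TB/s of aggregate bandwidth, which lines up neatly with the company's 35-reads-per-second claim.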
Justin Kim, President of AI Infra at SK hynix, said: "SK hynix has once again broken through technological limits demonstrating our industry leadership in AI memory. We will continue our position as the No.1 global AI memory provider as we steadily prepare next-generation memory products to overcome the challenges of the AI era".