SK hynix showed off its new HBM3E 12-Hi memory alongside NVIDIA products at the OCP Global Summit last week, teasing a range of AI memory semiconductor technologies and products as the company aims to lead the future semiconductor market.

During the event, SK hynix displayed some of its AI memory products, including its new HBM3E 12-Hi stack memory, which it started mass-producing in September, marking a significant milestone in the evolution of High Bandwidth Memory.
SK hynix's new HBM3E 12-Hi stack was shown off alongside NVIDIA's new H200 AI GPU and GB200 Grace Blackwell Superchip, with a huge 36GB capacity achieved by making the DRAM chips 40% thinner. This innovation has paved the way for a 12-stack HBM3E memory configuration at the same thickness as the previous HBM3E 8-Hi stack.
- Read more: SK hynix HBM3E chip yield hits 80%, helping cut mass production times down by 50%
- Read more: SK hynix to spend $74.6B on memory business, $58B on AI, semiconductors
- Read more: NVIDIA, TSMC, SK hynix form 'triangular alliance' for next-gen AI GPUs + HBM4 memory
- Read more: SK hynix: HBM3E expected to make up over half of HBM shipments in 2024
- Read more: SK hynix: most of its HBM for 2025 sold out already, 16-Hi HBM4 coming in 2028
Not only that, but the beefed-up HBM3E 12-Hi stack's data processing speed has been increased to 9.6Gbps. SK hynix applied its core technology, the advanced MR-MUF process, to further improve heat dissipation by 10%, which Business Korea notes earns SK hynix's new HBM3E 12-Hi stack memory the title of the world's best in speed, capacity, and stability.