Micron has updated the industry on its next-generation HBM4 and HBM4E memory, with mass production expected to start in 2026, ready for next-gen AI GPUs like the NVIDIA Rubin R100.

The AI memory market has only a few players: SK hynix leads the HBM market, while Samsung and Micron have been spooling up production of their HBM3, HBM3E, and future-gen HBM4 and HBM4E memory to meet the insatiable demands of the AI industry. It's mostly NVIDIA vacuuming up all of the HBM memory chips being made.
Sanjay Mehrotra, President and Chief Executive Officer of Micron, said: "Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron's HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E. We expect HBM4 to ramp in high volume for the industry in calendar 2026".
Micron added: "HBM4E will introduce a paradigm shift in the memory business by incorporating an option to customize the logic base die for certain customers using an advanced logic foundry manufacturing process from TSMC. We expect this customization capability to drive improved financial performance for Micron".