Micron is chasing a big HBM memory chip business opportunity, according to new reports: the US-based company plans to secure 20-25% of the HBM3E market by August 2025.
HBM is one of the hottest markets right now thanks to the insatiable demand for AI GPU computing power, with HBM makers SK hynix, Samsung, and Micron all fighting to come out on top. As it stands, SK hynix is far out in the lead, but Micron is making progress, as is Samsung, which is now working with AMD to provide HBM3E for the new Instinct MI325X AI accelerator sporting 288GB of HBM3E memory.
Micron attributes the progress of its HBM3E to its advanced packaging and design capabilities, as well as the integration of its own processes. Micron is also working on next-generation HBM4 memory, which we'll see debut in 2025 and beyond for next-gen AI GPUs.
Micron is also expanding across the world, investing $5.1 billion in a new DRAM memory chip fab in Japan. The company's HBM capacity for this year is completely sold out, which is expected to add hundreds of millions of dollars in revenue.
- Read more: Micron samples 32Gbps GDDR7 for next-gen GPUs: over 1.5TB/sec bandwidth
- Read more: Micron to invest $5.1 billion in new DRAM memory chip fab in Japan by 2027
- Read more: Micron to get $6.1 billion in US funding for chip plants in New York, Idaho
- Read more: Micron's entire HBM supply sold out for 2024, and a majority of 2025 supply already allocated
At Computex 2024 this week, Micron also announced it is now sampling its new 32Gbps GDDR7 memory for next-gen GPUs, capable of over 1.5TB/sec of memory bandwidth.