HBM is key to today's AI chips and, even more so, tomorrow's, with HBM production capacity stretched to its limits as AI chip makers like NVIDIA scoop it all up.
SK hynix, Samsung, and Micron lead the HBM memory chip side, with HBM3 and HBM3E orders at all three major DRAM manufacturers completely sold out through the end of 2025. We've been reporting on that side of the industry heavily, but now all three HBM makers are expanding their HBM production capacity to keep up with the continuously growing demand of the AI market.
This means that demand for equipment and components that Applied Materials outsources on behalf of the three leading DRAM makers has now been booked into next year, reports UDN, with analysts pointing out that new customer orders from major semiconductor equipment manufacturers have pushed order visibility into Q2 2025. Performance is therefore expected to be strong throughout this year, and easily into 2025.
SK hynix, Samsung, and Micron have all started expanding their respective HBM production capacity to keep up with strong order demand from HPC customers, while placing large equipment orders with US company Applied Materials, a supplier of etching and deposition equipment.
- Read more: SK hynix speeds up HBM development: HBM4 in 2025 and HBM4E now coming in 2026
- Read more: Micron expands HBM production with new lines in the US, also considers Malaysia HBM production
- Read more: SK hynix says its ultra-next-gen HBM4E in 2026, ready for the world of next-gen AI GPUs
- Read more: NVIDIA's next-gen R100 AI GPU: TSMC 3nm with CoWoS-L packaging, HBM4 in Q4 2025
- Read more: SK hynix says most of its HBM for 2025 is sold out already, 16-Hi HBM4 coming in 2028
- Read more: SK hynix and TSMC to work together with HBM4, next-gen semiconductor packaging tech
- Read more: Samsung and SK hynix to use new 1c DRAM on next-gen HBM4 memory for future-gen AI GPUs
Applied Materials will reportedly expand its outsourcing of related equipment manufacturing to Jingding, with UDN reporting that Jingding's order-taking momentum already extends into Q2 2025.
NVIDIA's current Hopper H100 AI GPU uses HBM3 memory, while its beefed-up H200 AI GPU rocks ultra-fast HBM3E, as do its upcoming Blackwell B100 and B200 AI GPUs. However, NVIDIA has already teased its next-generation Rubin R100 AI GPU, which will roll out with future-gen HBM4 memory, and we'll be seeing that in 2025.
HBM4 offers even more performance with massive power savings compared to HBM3 and HBM3E memory, something that SK hynix, Samsung, and Micron are all full-steam ahead on right now. SK hynix is the HBM leader, and there seems to be nothing that will change that over the next couple of years, especially with HBM4 in the works.
We should expect some big things from Rubin R100, because Blackwell B100 and B200 are already absolute AI GPU monsters compared to Hopper H100 and H200. HBM4 is going to be a huge driving force behind that, and we can't wait to see NVIDIA unleash its monster Rubin R100 AI GPU in 2025.