SK hynix, Samsung, Micron are expanding HBM production, order strength to last throughout 2025

SK hynix, Samsung, and Micron are placing large orders with US equipment vendor Applied Materials to boost HBM production, with 'order strength' seen lasting throughout 2025.


HBM is the key to today's AI chips and, even more so, tomorrow's, with HBM production capacity stretched to its limits as AI chip makers like NVIDIA scoop it all up.


SK hynix, Samsung, and Micron lead the HBM memory market, with HBM3 and HBM3E output from all three major DRAM manufacturers completely sold out until the end of 2025. We've been reporting on that side of the industry heavily, and now all three HBM makers -- SK hynix, Samsung, and Micron -- are expanding HBM production capacity to keep up with the continuously growing demand of the AI market.

This means that demand for Applied Materials' equipment and components from the three leading DRAM makers is booked well into next year, reports UDN, with analysts pointing out that multiple new orders from major customers have pushed the equipment maker's order visibility into Q2 2025. Performance is expected to be strong throughout this year, and easily into 2025.

SK hynix, Samsung, and Micron have all started expanding their respective HBM production capacity to keep up with strong order demand from HPC customers, while at the same time placing large equipment orders with US company Applied Materials, a supplier of etching and deposition equipment.

Applied Materials will reportedly soon expand its outsourcing of related equipment to OEM partner Jingding, with UDN reporting that Jingding's order momentum already extends into the second quarter of next year.

NVIDIA's current Hopper H100 AI GPU uses HBM3 memory, its beefed-up H200 AI GPU rocks the ultra-fast HBM3E memory, and its upcoming Blackwell B100 and B200 AI GPUs also use HBM3E memory. However, NVIDIA has already teased its next-generation Rubin R100 AI GPU, which will roll out with future-gen HBM4 memory, and we'll be seeing that in 2025.

HBM4 offers even more performance with massive power savings compared to HBM3 and HBM3E memory, something that SK hynix, Samsung, and Micron are full-steam ahead on. SK hynix is the HBM leader right now, and there seems to be nothing that will change that over the next couple of years, especially with HBM4 in the works.

We should expect some big things from Rubin R100, because Blackwell B100 and B200 are already absolute AI GPU monsters compared to Hopper H100 and H200. HBM4 is going to be a huge driving force behind that, and we can't wait to see NVIDIA unleash its monster Rubin R100 AI GPU in 2025.


Anthony joined the TweakTown team in 2010 and has since reviewed 100s of graphics cards. Anthony is a long time PC enthusiast with a passion of hate for games built around consoles. FPS gaming since the pre-Quake days, where you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering and has recently taken a keen interest in artificial intelligence (AI) hardware.
