Samsung forms dedicated 'HBM team' to boost AI memory chip production to beat SK hynix

Samsung forms a dedicated HBM team to improve production yields for its next-gen AI memory chips as it develops HBM4 memory and its next-gen Mach-1 AI accelerator.


Samsung has set up a dedicated HBM (High Bandwidth Memory) team inside its memory chip division. The new team is tasked with increasing production yields as the South Korean giant continues developing its sixth-generation AI memory, HBM4, and its new Mach-1 AI accelerator.


According to a new report from KED Global citing industry sources, the new HBM team will be in charge of the development and sales of DRAM and NAND flash memory. Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung, will lead the new team.

Kyung Kye-hyun, head of Samsung's semiconductor business, said in a note posted on social media: "Customers who want to develop customized HBM4 will work with us. HBM leadership is coming to us thanks to the dedicated team's efforts."

South Korean rival SK hynix has been dominating the HBM memory market, and things haven't been going so well for Samsung, which disbanded its previous HBM team after concluding at the time that the HBM market wasn't going to grow significantly. Oh, how wrong Samsung was. Now, major course corrections are underway.

Samsung will pursue a "two-track" strategy, simultaneously developing two types of cutting-edge AI chips: HBM memory and its Mach-1 accelerator. The company plans to have its new HBM3E memory in mass production in the second half of 2024, with next-gen HBM4 memory planned for 2025.


At Memcon 2024, which gathered chipmakers from around the world in San Jose, California last week, Hwang said Samsung expects to increase its HBM chip production by 2.9x this year compared to 2023.

As for Samsung's new Mach-1, it's a system-on-chip (SoC) designed to reduce the bottleneck between GPUs and HBM memory chips, while a next-generation model of the inference-dedicated AI accelerator, Mach-2, is already in development. Kyung said on Friday: "We need to accelerate the development of Mach-2, for which clients are showing strong interest."

NEWS SOURCE: kedglobal.com

