Samsung has set up a dedicated HBM (High Bandwidth Memory) team inside its memory chip division. The new HBM team is tasked with increasing production yields as the South Korean giant continues developing its sixth-generation AI memory, HBM4, and its new Mach-1 AI accelerator.

According to a new report from KED Global, citing industry sources, the new HBM team will oversee the development and sales of DRAM and NAND flash memory. Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung, will lead the new HBM team.
Kyung Kye-hyun, head of Samsung's semiconductor business, said in a note posted on social media: "Customers who want to develop customized HBM4 will work with us. HBM leadership is coming to us thanks to the dedicated team's efforts."
South Korean rival SK hynix has been dominating the HBM memory market, while Samsung has lagged behind after disbanding its previous HBM team, having concluded at the time that the HBM market wasn't going to grow significantly. Oh, how wrong Samsung was. Now, major course corrections are underway.
Samsung will pursue a "two-track" strategy of simultaneously developing two types of cutting-edge AI chips: its HBM memory and its Mach-1 accelerator. Samsung plans to have its new HBM3E memory in mass production in the second half of 2024, with next-gen HBM4 memory planned for 2025.

At Memcon 2024, held in San Jose, California last week, where chipmakers from around the world gathered, Hwang said that Samsung expects to increase its HBM chip production by 2.9x this year compared to 2023.
As for Samsung's new Mach-1, it's a system-on-chip (SoC) that reduces the bottleneck between GPUs and HBM memory chips, while the next-generation model of the inference-focused AI accelerator -- Mach-2 -- is in development. Kyung said on Friday: "We need to accelerate the development of Mach-2, for which clients are showing strong interest."
