TSMC and Samsung co-developing a bufferless HBM4 memory chip, their first partnership in AI

Samsung is partnering with TSMC to jointly develop next-generation bufferless HBM4 memory chips.


Samsung announced at the SEMICON Taiwan 2024 forum on Thursday that it is partnering with TSMC to co-develop bufferless HBM4 memory chips for future AI processors.


Samsung, the world's largest memory chipmaker, is partnering with TSMC, the world's largest contract chip manufacturer, on bufferless HBM4 memory, with the South Korean and Taiwanese semiconductor giants working together to strengthen their positions in the constantly evolving AI chip market.

Dan Kochpatcharin, head of Ecosystem and Alliance Management at TSMC, said during SEMICON Taiwan 2024 that the two companies are developing a bufferless HBM4 chip. Samsung makes its own HBM4, while TSMC has already formed a "triangular alliance" with SK hynix and NVIDIA on future HBM and AI designs. SK hynix, also based in South Korea, is second to Samsung in memory, which makes this new tie-up between Samsung and TSMC all the more interesting.

Samsung is the world's largest memory chipmaker and the second-largest semiconductor company behind only TSMC, and now it's partnering with the Taiwanese foundry on the future of AI memory: HBM4. During SEMICON Taiwan 2024, Lee Jung-bae, corporate president and head of Samsung's memory business, said that "to maximize the performance of AI chips, customized HBM is the best choice. We are working with other foundry players to offer more than 20 customized solutions".

HBM4 is manufactured a little differently from HBM3 and HBM3E: the logic die -- acting as the brain of the HBM chip -- will be produced by foundry companies, whereas memory manufacturers normally handle it themselves. Samsung is manufacturing the logic dies for its in-house HBM4 memory on its own 4nm process node.

Samsung wants to offer a turnkey service: from DRAM production to logic die production and advanced packaging, the South Korean giant can do it all. However, Samsung is considering using TSMC's advanced technologies, as there are customers who prefer TSMC-produced logic dies, according to KED sources.

NEWS SOURCE: kedglobal.com
