
TSMC and Samsung co-developing a bufferless HBM4 memory chip, their first partnership in AI

Samsung is partnering with TSMC to co-develop bufferless HBM4 memory chips, the next generation of high-bandwidth memory for AI.


Samsung announced at the SEMICON Taiwan 2024 forum on Thursday that it is partnering with TSMC to co-develop bufferless HBM4 memory chips for future AI processors.


Samsung is the world's largest memory chipmaker and TSMC the world's largest contract chip manufacturer, and the South Korean and Taiwanese semiconductor giants are working together on bufferless HBM4 memory to strengthen their positions in the constantly evolving AI chip market.

Dan Kochpatcharin, head of Ecosystem and Alliance Management at TSMC, said during SEMICON Taiwan 2024 that the two companies are developing a bufferless HBM4 chip. Samsung makes its own HBM4, while TSMC has already formed a "triangular alliance" with SK hynix and NVIDIA on future HBM and AI designs. SK hynix, also based in South Korea, sits second to Samsung in the memory market, which makes this new tie-up between Samsung and TSMC all the more interesting.

Samsung is the world's largest memory chipmaker, but only the second-largest semiconductor company behind TSMC, and now it's partnering with the company on the future of AI memory: HBM4. During SEMICON Taiwan 2024, Lee Jung-bae, corporate president and head of Samsung's memory business, said that "to maximize the performance of AI chips, customized HBM is the best choice. We are working with other foundry players to offer more than 20 customized solutions."

HBM4 is a little different from HBM3 and HBM3E: its manufacturing process differs from previous-gen HBM designs in that the logic die, which acts as the brain of the HBM chip, will be produced by foundry companies rather than by the memory manufacturers that normally handle it. Samsung is manufacturing the logic dies for its in-house HBM4 memory on its own 4nm process node.

Samsung wants to offer a turnkey service: from DRAM production to logic die production and advanced packaging, the South Korean giant can do it all. But Samsung is also considering using TSMC's advanced technologies, as there are customers who prefer TSMC-produced logic dies, according to KED sources.

NEWS SOURCE: kedglobal.com

