SK hynix investing a further $1 billion to lead in HBM memory for future-gen AI GPUs

SK hynix says the first 50 years of the semiconductor industry were about the front-end, while the next 50 years will be about the back-end, aka packaging.


SK hynix is reportedly increasing its spending on advanced chip packaging as it looks to maintain its leadership in High Bandwidth Memory, or HBM, the memory driving AI development.

HBM market share between SK hynix, Micron, and Samsung (source: Bloomberg)

According to a new report from Bloomberg, the South Korean giant is investing more than $1 billion in South Korea this year to expand and improve the final steps of its chip manufacturing. Lee Kang-Wook, a former Samsung engineer who now leads packaging development at SK hynix, said in a recent interview: "The first 50 years of the semiconductor industry has been about the front-end. But the next 50 years is going to be all about the back-end," or packaging.

Lee specializes in advanced ways of combining and connecting semiconductors, techniques that have become the bedrock of the AI industry over the last few years. His career began in 2000, when he earned his PhD in 3D integration technology for micro-systems from Japan's Tohoku University under Mitsumasa Koyanagi, the inventor of the stacked capacitor DRAM used in smartphones.

In 2002, Lee joined Samsung's memory division as a principal engineer, where he led the development of Through-Silicon Via (TSV)-based 3D packaging technologies. This work would later become the foundation for HBM, high-bandwidth memory that stacks chips on top of one another and connects them with TSVs for faster, and much more energy-efficient, data processing.
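
To put that bandwidth advantage in rough perspective, here is a minimal sketch in Python. The figures used are ballpark, publicly quoted assumptions for illustration, not numbers from this report: a stacked HBM device exposes a very wide but relatively slow-per-pin interface, while a conventional graphics DRAM chip is narrow but fast per pin.

```python
# Illustrative comparison of peak per-device bandwidth.
# All figures below are assumptions for the sake of the example,
# not specifications taken from the article.

def bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits * per-pin rate in Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# Assumed: one HBM3E stack with a 1024-bit bus at ~9 Gb/s per pin,
# versus a GDDR6-class chip with a 32-bit bus at ~16 Gb/s per pin.
hbm_stack = bandwidth_gbs(1024, 9.0)   # ~1150 GB/s per stack
gddr_chip = bandwidth_gbs(32, 16.0)    # ~64 GB/s per chip

print(f"HBM3E stack (assumed): {hbm_stack:.0f} GB/s")
print(f"GDDR6 chip  (assumed): {gddr_chip:.0f} GB/s")
```

The wide, short TSV connections through the stack are what make that width practical, which is why the packaging side of the business has become so valuable.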

NVIDIA uses SK hynix's latest HBM3 and HBM3E memory for its current, refreshed, and next-gen AI GPUs: the Hopper H100, the soon-to-launch Hopper H200 (the first AI GPU with HBM3E memory), and the next-gen Blackwell B100.

SK hynix shares have risen 45% over the last 6 months alone and currently sit at record highs on the back of the AI GPU boom, making the memory giant the second most valuable company in South Korea, behind only Samsung.

Sanjeev Rana, an analyst at CLSA Securities Korea, said: "SK Hynix's management had better insights into where this industry is headed and they were well prepared. When the opportunity came their way, they grabbed it with both hands." As for Samsung, the analyst said "they were caught napping."

This announcement from SK hynix comes on the heels of HBM competitor Samsung unveiling its new 36GB HBM3E 12-Hi stack, which itself followed SK hynix announcing that its 24GB HBM3E 8-Hi stack had entered mass production.

NEWS SOURCE: bnnbloomberg.ca

