SK hynix announces next-gen HBM4 memory development kicks off in 2024

SK hynix confirms it will begin development of its next-generation HBM4 memory in 2024, which will power the next generation of AI GPUs and data centers.


SK hynix has published a new blog post, announcing that the company will kick off the development of its next-generation HBM4 high-bandwidth memory in 2024.

DRAM suppliers roadmap for HBM memory solutions (source: TrendForce)

Micron and Samsung will also be developing HBM4 memory in 2024, with both companies expecting it to be ready in 2025-2026. SK hynix explained that it plans to commence development of its next-gen HBM4 memory in 2024, while Kim Wang-soo added that SK hynix will also push out new HBM3E memory in 2024 with boosted speeds and larger capacities.

GSM team leader Kim Wang-soo explained: "With mass production and sales of HBM3E planned for next year, our market dominance will be maximized once again. As development of HBM4, the follow-up product, is also scheduled to begin in earnest, SK hynix's HBM will enter a new phase next year. It will be a year where we celebrate".

We should expect HBM4 memory to be powering the next-gen AI GPUs of the future, with TrendForce sharing a roadmap of DRAM suppliers through to 2026 in the image above. The company expects the first HBM4 samples to offer up to 36GB capacities per stack, with the full specification expected between the second half of 2024 and 2025. The first customer sampling and availability are pegged for 2026, so we will have to wait for next-gen AI GPU and data center designs to see HBM4 fully unleashed.

36GB stacks would allow up to 288GB of HBM4 capacity on a single GPU, with higher capacities planned... and compared to HBM3E, which maxes out at 9.8Gbps per pin, HBM4 should push well past 10Gbps. NVIDIA's next-gen B100 "Blackwell" AI GPU will be using HBM3E memory, so we should expect the next-gen GPU after that -- codenamed "Vera Rubin" -- to feature HBM4. NVIDIA could also surprise with an upgraded Blackwell GPU using HBM4.
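As a rough sketch of the arithmetic behind those numbers (assuming an 8-stack GPU configuration and a rumored 2048-bit HBM4 interface versus HBM3E's 1024-bit interface -- neither is confirmed in the article):

```python
# Back-of-the-envelope HBM capacity and bandwidth math.
# Assumptions (not from the article): 8 stacks per GPU,
# HBM3E uses a 1024-bit interface, HBM4 is rumored at 2048-bit.

def total_capacity_gb(stacks: int, gb_per_stack: int) -> int:
    """Total memory capacity across all stacks, in GB."""
    return stacks * gb_per_stack

def stack_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Per-stack bandwidth in GB/s: pin speed (Gb/s) x bus width / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

print(total_capacity_gb(8, 36))           # 8 x 36GB stacks -> 288 GB
print(stack_bandwidth_gbs(9.8, 1024))     # HBM3E at 9.8Gbps/pin -> 1254.4 GB/s per stack
print(stack_bandwidth_gbs(10.0, 2048))    # hypothetical HBM4 at 10Gbps/pin -> 2560.0 GB/s per stack
```

Even at a similar per-pin speed, a doubled interface width is what would give HBM4 its big per-stack bandwidth jump.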

NVIDIA's upcoming refreshed H200 AI GPU uses HBM3e memory, compared to the regular HBM3 memory on the current H100 AI GPU. NVIDIA is expected to use HBM3e memory on its upcoming B100 "Blackwell" AI GPU, so we could expect an upgraded B200 "Blackwell" AI GPU powered with faster HBM4 memory in 2025.

Super exciting stuff, that's for sure... HBM4 memory is going to provide a huge injection of performance over HBM3 and HBM3e memory in 2024 and beyond.

NEWS SOURCE: wccftech.com

Anthony joined the TweakTown team in 2010 and has since reviewed hundreds of graphics cards. Anthony is a long-time PC enthusiast with a passionate dislike of games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted for using a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
