JEDEC chills on next-gen HBM4 thickness: 16-Hi stacks with current bonding tech allowed

JEDEC has reportedly eased HBM4 memory configuration rules to help HBM manufacturers, relaxing the maximum package thickness of HBM4 to 775 micrometers for both 12-layer and 16-layer stacks.


HBM3E memory is about to be unleashed with NVIDIA's upcoming beefed-up H200 AI GPU, but now JEDEC has reportedly relaxed the rules for HBM4 memory configurations.


JEDEC has reportedly relaxed the maximum package thickness of HBM4 to 775 micrometers for both 12-layer and 16-layer stacks. Squeezing more layers into a thinner package gets more complex, so the looser limit makes production easier, especially as HBM makers race to meet insatiable demand for AI GPUs (now, and into the future with HBM4-powered chips).

HBM manufacturers, including SK hynix, Micron, and Samsung, were poised to adopt hybrid bonding, a newer packaging technology that directly bonds the chip and wafer, to reduce the package thickness of HBM4. However, with HBM4 being a new technology, hybrid bonding would increase pricing, making future HBM4-powered AI GPUs even more expensive.

In a new report from ZDNet Korea, sources said that HBM manufacturers would take advantage of the "relaxation" laid out by JEDEC. We know that SK hynix will be mass-producing next-generation HBM4 memory in 2026, with the first batch of samples expected to boast 36GB per stack.

SK hynix, Micron, and Samsung are participants in the evolution of HBM4 memory technology, while NVIDIA, AMD, and Intel are "participating in the consultation," reports ZDNet Korea. These companies "failed to produce results in the first and second negotiations" because some "participating companies have expressed opposition to relaxing the HBM4 standard to 775 micrometers."

However, during the recent third consultation, a thickness of 775 micrometers was agreed upon for both 12-layer stacked HBM4 and 16-layer stacked HBM4. This is thanks to memory companies actively asserting that maintaining the current 720-micrometer thickness has "reached its limit." NVIDIA, AMD, and others are "also said to have positively accepted the proposal in order to smoothly receive HBM from the memory companies."
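To see why the extra 55 micrometers matters, here is a rough back-of-the-envelope sketch of the per-die thickness budget under each package cap. This is purely illustrative: a real HBM stack also includes a base logic die, bonding layers, and mold compound, which this simple division ignores.

```python
# Rough per-die thickness budget under a given HBM package cap.
# Illustrative only: real stacks also budget for a base logic die,
# bonding interfaces, and mold, which this simple division ignores.
def per_die_budget_um(package_um: float, layers: int) -> float:
    """Naive package thickness divided evenly across DRAM layers."""
    return package_um / layers

for cap in (720, 775):
    for layers in (12, 16):
        budget = per_die_budget_um(cap, layers)
        print(f"{cap} um cap, {layers}-Hi: ~{budget:.1f} um per die")
```

Even on this crude model, a 16-Hi stack at the old 720-micrometer cap leaves only about 45 micrometers per die, while 775 micrometers loosens that to roughly 48, which is the breathing room memory makers were asking for.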


Anthony joined the TweakTown team in 2010 and has since reviewed hundreds of graphics cards. Anthony is a long-time PC enthusiast with a passion of hate for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted for using a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.

