Micron HBM3E for NVIDIA's beefed-up H200 AI GPU has shocked HBM competitors like SK hynix

Micron's partnership with NVIDIA on the upcoming H200 AI GPU and its upgraded HBM3E memory has left HBM competitors like SK hynix and others 'shocked'.


Micron was the first to announce mass production of its new ultra-fast HBM3E memory in February 2024, putting the company ahead of HBM rivals SK hynix and Samsung... and leaving those competitors shocked.


The US memory company announced it will provide HBM3E memory chips for NVIDIA's upcoming beefed-up H200 AI GPU, which moves to HBM3E, unlike its predecessor, the H100 AI GPU, which used HBM3 memory.

Micron will make its new HBM3E memory chips on its 1b-nanometer DRAM node, comparable to a 12nm node and the same class that HBM leader SK hynix uses for its HBM. According to Korea JoongAng Daily, Micron is "technically ahead" of HBM competitor Samsung, which is still using 1a-nanometer technology, the equivalent of 14nm.

Micron's new 24GB 8-Hi HBM3E memory will be the heart of NVIDIA's upcoming H200 AI GPU, with Micron's superiority in the HBM process being a key part of its agreement with NVIDIA. Micron explains its new HBM3E memory:

  • Superior Performance: With pin speed greater than 9.2 gigabits per second (Gb/s), Micron's HBM3E delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth, enabling lightning-fast data access for AI accelerators, supercomputers, and data centers.
  • Exceptional Efficiency: HBM3E leads the industry with ~30% lower power consumption compared to competitive offerings. To support increasing demand and usage of AI, HBM3E offers maximum throughput with the lowest levels of power consumption to improve important data center operational expense metrics.
  • Seamless Scalability: With 24 GB of capacity today, HBM3E allows data centers to seamlessly scale their AI applications. Whether for training massive neural networks or accelerating inferencing tasks, Micron's solution provides the necessary memory bandwidth.
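The headline bandwidth figure in Micron's list follows from the pin speed and the interface width. As a rough sketch (assuming the standard 1024-bit-per-stack HBM interface, which is not stated in the article itself):

```python
# Back-of-the-envelope check of Micron's HBM3E bandwidth claim.
# Assumption: 1024-bit interface per HBM stack (standard for HBM3-class memory);
# the >9.2 Gb/s pin speed is Micron's own figure quoted above.
PIN_SPEED_GBPS = 9.2          # gigabits per second, per pin
INTERFACE_WIDTH_BITS = 1024   # pins (data lines) per HBM stack

bandwidth_gbits = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS  # gigabits/s per stack
bandwidth_tbs = bandwidth_gbits / 8 / 1000               # convert to terabytes/s

print(f"~{bandwidth_tbs:.2f} TB/s per stack")  # ≈1.18 TB/s at exactly 9.2 Gb/s
```

At exactly 9.2 Gb/s this works out to roughly 1.18 TB/s per stack, so pin speeds "greater than 9.2 Gb/s" are what push the figure past the 1.2 TB/s Micron quotes.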

Chip expert Jeon In-seong, author of "The Future of the Semiconductor Empire," said: "It's proven that Micron's manufacturing method is more advanced than Samsung Electronics because their HBM3E will be made with 1b nanometer technology. Micron will need some more work in the packaging side, but that should be easier than what they've already achieved with 1b nanometer technology".

Ko Young-min, an analyst at Daol Investment & Securities, said in a new note that SK hynix "will remain ahead of rivals in terms of client credibility and profitability. Even if Micron attempts at the MR-MUF manufacturing technology, it will take some time to catch up with SK hynix considering the time and effort needed".

Anthony joined the TweakTown team in 2010 and has since reviewed hundreds of graphics cards. Anthony is a long-time PC enthusiast with a passion of hate for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
