Micron announces HBM3e memory enters volume production, ready for NVIDIA's new H200 AI GPU

Micron has begun volume production of its new HBM3E memory, which will ship as part of NVIDIA's beefed-up H200 AI GPU coming soon.


Micron has just announced it has started volume production of its bleeding-edge HBM3e memory, with the company's HBM3e known good stack dies (KGSDs) shipping as part of NVIDIA's upcoming H200 AI GPU.


NVIDIA's new beefed-up H200 AI GPU will feature up to 141GB of ultra-fast HBM3E memory from Micron, built from the company's mass-produced 24GB 8-Hi HBM3E stacks, with pin speeds greater than 9.2Gb/s and peak memory bandwidth of over 1.2TB/sec per stack. Across the full GPU, that adds up to roughly 43% more memory bandwidth than the H100 and its regular HBM3 memory, which is where the H200 gets its extra AI grunt.

The 141GB of HBM3E memory on the NVIDIA H200 AI GPU delivers up to 4.8TB/sec of memory bandwidth, up from the 80GB of HBM3 memory and up to 3.35TB/sec of memory bandwidth on the H100 AI GPU.
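For readers who like to sanity-check the headline numbers, they line up with simple arithmetic. This is a rough sketch only: the 1024-bit-per-stack bus width is an assumption based on the standard HBM interface, not something stated in Micron's release.

```python
# Back-of-the-envelope HBM3E bandwidth math (sketch; the 1024-bit
# per-stack interface width is an assumption from the HBM standard).

PIN_SPEED_GBPS = 9.2    # Gb/s per pin (Micron quotes "greater than 9.2")
BUS_WIDTH_BITS = 1024   # assumed bits per HBM stack interface

# Per-stack bandwidth: pin speed x bus width, converted bits -> bytes,
# then GB/s -> TB/s
per_stack_tbs = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8 / 1000
print(f"Per-stack bandwidth: {per_stack_tbs:.2f} TB/s")  # ~1.18 TB/s

# At the H200's aggregate 4.8 TB/s, sweeping the full 141 GB once
# takes under 30 milliseconds
full_sweep_ms = 141 / 4800 * 1000
print(f"Time to read all 141 GB once: {full_sweep_ms:.1f} ms")
```

At a nominal 9.2Gb/s the per-stack figure lands just under 1.2TB/sec, which matches Micron quoting its pin speed as "greater than" 9.2Gb/s.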

Sumit Sadana, executive vice president and chief business officer at Micron Technology, said in the press release: "Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile".

Sadana continued: "AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications".

HBM3E: Fueling the AI Revolution

As the demand for AI continues to surge, the need for memory solutions to keep pace with expanded workloads is critical. Micron's HBM3E solution addresses this challenge head-on with:

  • Superior Performance: With pin speed greater than 9.2 gigabits per second (Gb/s), Micron's HBM3E delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth, enabling lightning-fast data access for AI accelerators, supercomputers, and data centers.
  • Exceptional Efficiency: Micron's HBM3E leads the industry with ~30% lower power consumption compared to competitive offerings. To support increasing demand and usage of AI, HBM3E offers maximum throughput with the lowest levels of power consumption to improve important data center operational expense metrics.
  • Seamless Scalability: With 24 GB of capacity today, Micron's HBM3E allows data centers to seamlessly scale their AI applications. Whether for training massive neural networks or accelerating inferencing tasks, Micron's solution provides the necessary memory bandwidth.
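The capacity figures work out the same way. As a hedged sketch, assuming each 8-Hi stack is built from eight 24Gb DRAM dies (the configuration Micron describes) and that the H200 carries six such stacks:

```python
# Sketch of the 24 GB 8-Hi stack arithmetic. Assumptions: eight 24 Gb
# dies per stack, six stacks on the H200 package.

DIE_CAPACITY_GBIT = 24   # Gb per DRAM die
STACK_HEIGHT = 8         # "8-Hi" = eight dies stacked vertically

stack_gb = DIE_CAPACITY_GBIT * STACK_HEIGHT / 8   # Gb -> GB
print(f"Per-stack capacity: {stack_gb:.0f} GB")   # 24 GB

# Six stacks give 144 GB raw, of which NVIDIA specifies 141 GB usable
print(f"Six stacks: {6 * stack_gb:.0f} GB raw")
```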
NEWS SOURCE: anandtech.com

Anthony joined the TweakTown team in 2010 and has since reviewed 100s of graphics cards. Anthony is a long time PC enthusiast with a passion of hate for games built around consoles. FPS gaming since the pre-Quake days, where you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering and has recently taken a keen interest in artificial intelligence (AI) hardware.

