Micron samples 12-Hi HBM3E with up to 36GB capacity: 9.2Gbps speeds, 1.2TB/sec memory bandwidth

Micron has started sampling its 'production-ready' HBM3E memory chips, offering up to 36GB capacities in a 12-Hi design and up to 1.2TB/sec of memory bandwidth.


Micron has just announced it is shipping production-capable HBM3E 12-Hi memory in up to 36GB capacities, pushing 1.2TB/sec of memory bandwidth, ready for AI GPUs.


The new Micron HBM3E 12-Hi features an impressive 36GB capacity, a 50% increase over current HBM3E 8-Hi stacks, allowing far larger AI models, like Llama 2 with 70 billion parameters, to run on a single AI processor. The increased 36GB capacity also enables faster time to insight by avoiding CPU offload and GPU-GPU communication delays.
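As a rough sanity check on why capacity matters here, a quick calculation shows the weight footprint of a 70-billion-parameter model and how many HBM stacks of each capacity it would take to hold just the weights. This is a simplified sketch assuming FP16 weights (2 bytes per parameter) and ignoring KV cache, activations, and framework overhead, which real deployments also need:

```python
# Rough weight-memory footprint for an LLM, assuming FP16 (2 bytes/param).
def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Model weight footprint in GB (decimal) for a given parameter count."""
    return params_billion * 1e9 * bytes_per_param / 1e9

llama2_70b = weights_gb(70)     # 140.0 GB of weights alone
stacks_8hi = llama2_70b / 24    # ~5.8 stacks of 24GB HBM3E 8-Hi
stacks_12hi = llama2_70b / 36   # ~3.9 stacks of 36GB HBM3E 12-Hi
print(llama2_70b, round(stacks_8hi, 1), round(stacks_12hi, 1))
```

With the same number of stacks around a processor, the 50% larger 12-Hi stacks leave correspondingly more headroom for the model's working data.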

Micron's new HBM3E 12-Hi 36GB delivers "significantly" lower power consumption than competitors' HBM3E 8-Hi 24GB memory, while pushing over 1.2TB/sec of memory bandwidth at a pin speed of over 9.2Gbps. Together, these benefits give Micron's new HBM3E memory modules maximum throughput with the lowest power consumption, ensuring optimal outcomes for the power-hungry data centers of the future.
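Those two headline numbers are directly linked: an HBM3E stack exposes a 1024-bit interface, so per-stack bandwidth is simply pin speed times interface width. A quick back-of-the-envelope check (decimal TB/s; the 1024-bit width comes from the HBM3 standard, not this announcement):

```python
# Per-stack HBM3E bandwidth: pin speed (Gb/s) x 1024-bit interface, in TB/s.
def hbm_bandwidth_tbps(pin_speed_gbps: float, interface_bits: int = 1024) -> float:
    """Per-stack bandwidth in TB/s (decimal units) for a given pin speed."""
    return pin_speed_gbps * interface_bits / 8 / 1000

print(round(hbm_bandwidth_tbps(9.2), 2))   # ~1.18 TB/s at exactly 9.2 Gb/s
print(round(hbm_bandwidth_tbps(9.4), 2))   # ~1.2 TB/s slightly above it
```

This is why the spec is phrased as "more than 9.2Gbps" delivering "more than 1.2TB/sec": at exactly 9.2Gbps a stack lands just under 1.2TB/sec, and the extra pin speed pushes it past that mark.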

The company also adds that its new HBM3E 12-Hi memory features fully programmable MBIST that can run system-representative traffic at full-spec speed, providing improved test coverage for expedited validation, enabling faster time-to-market (TTM), and enhancing system reliability.


In summary, here are the Micron HBM3E 12-high 36GB highlights:

  • Undergoing multiple customer qualifications: Micron is shipping production-capable 12-high units to key industry partners to enable qualifications across the AI ecosystem.
  • Seamless scalability: With 36GB of capacity (a 50% increase in capacity over current HBM3E offerings), HBM3E 12-high allows data centers to scale their increasing AI workloads seamlessly.
  • Exceptional efficiency: Micron HBM3E 12-high 36GB delivers significantly lower power consumption than the competitive HBM3E 8-high 24GB solution.
  • Superior performance: With pin speed greater than 9.2 gigabits per second (Gb/s), HBM3E 12-high 36GB delivers more than 1.2 TB/s of memory bandwidth, enabling lightning-fast data access for AI accelerators, supercomputers, and data centers.
  • Expedited validation: Fully programmable MBIST capabilities can run at speeds representative of system traffic, providing improved test coverage for expedited validation, enabling faster time to market, and enhancing system reliability.
NEWS SOURCE: wccftech.com

Gaming Editor


Anthony joined the TweakTown team in 2010 and has since reviewed hundreds of graphics cards. Anthony is a long-time PC enthusiast with a passionate hatred for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
