NVIDIA and AMD's best AI GPUs use HBM3 memory, but with the introduction of the H200 AI GPU, NVIDIA will be the first to market with an HBM3E-based AI GPU.
HBM3E will be found inside NVIDIA's upcoming H200 and next-gen B100 AI GPUs, with TrendForce noting that the supply bottleneck from advanced CoWoS packaging and the long production cycle of HBM extends the timeline from wafer start to final product to more than six months.
NVIDIA's current H100 AI GPU uses HBM3 memory primarily supplied by SK hynix, which has caused worldwide stock issues due to the crazy-high demand for AI GPUs. Samsung's entry into NVIDIA's supply chain with its new HBM3 memory in late 2023, while "initially minor, signifies its breakthrough in this segment," reports TrendForce.
The report by TrendForce reads: "Samsung's stride continued as its HBM3 offerings received AMD MI300 series certification by 1Q24, enhancing its standing as a crucial supplier to AMD. This milestone paves the way for an increase in Samsung's HBM3 production distribution starting from 1Q24. It's worth noting that Micron has not yet entered the HBM supply market, leaving SK hynix and Samsung as key players. Samsung, in particular, is poised to rapidly gain market share with AMD's MI300 series distribution scaling up over the subsequent quarters".
- Read more: Samsung to use MR-MUF tech, like SK hynix, for its future-gen HBM products
- Read more: Samsung to unveil world's first HBM3e 12-High memory at NVIDIA GTC 2024
The report continues that a huge shift from HBM3 to HBM3E will occur starting this year, positioning HBM3E as the new mainstream in the HBM market. TrendForce reports that SK hynix leads the way, with its HBM3E validation completed in Q1 2024, closely followed by Micron, whose HBM3E memory will be ready toward the end of Q1 2024 for NVIDIA's new H200 AI GPU, which will drop by the end of Q2 2024.
- Read more: AMD Instinct MI300X: new AI GPU with 192GB of HBM3 at 5.3TB/sec bandwidth
- Read more: NVIDIA H200 AI GPU: up to 141GB of HBM3e memory with 4.8TB/sec bandwidth
Samsung is "slightly behind in sample submissions" but is expected to complete its HBM3E validation by the end of Q1 2024, with shipments of its new HBM3E memory following in Q2 2024. Samsung has "already made significant strides in HBM3, and its HBM3E validation is expected to be completed soon; the company is poised to significantly narrow the market share gap with SK hynix by the end of the year, reshaping the competitive dynamics in the HBM market".