SK hynix has had a stellar 12 months riding the ever-growing AI wave, with the South Korean memory giant expecting over $10 billion in revenue from HBM alone by the end of 2024.
The news comes from South Korean outlet TheElec, which reports that SK hynix has sold out its entire 2024 supply of HBM and is already close to selling out its 2025 supply as well. NVIDIA's current H100 uses HBM3, while its new H200 and next-gen B200 AI GPUs both use HBM3E memory supplied by SK hynix.
SK hynix is staying ahead of its HBM competitors Samsung and Micron, with plans to provide samples of its new 12-stack HBM3E this month. Mass production of the new HBM memory chips is expected in Q3 2024, according to SK hynix CEO Kwak Noh-jung at a press conference on Thursday.
There are concerns that we could see an oversupply of HBM hitting the market, but the SK hynix CEO was quick to shut down that train of thought, reiterating that HBM is different from general-purpose memory like GDDR and DDR, which is produced based on consumer demand. AI GPU demand isn't slowing down, and it won't slow down anytime soon, as one of its key ingredients is ultra-fast memory... which SK hynix just happens to lead in.
- Read more: SK hynix says most of its HBM for 2025 is sold out already
- Read more: SK hynix plans $14.6B on new fab in SK to meet 'soaring demand' of HBM
SK hynix is setting up a new through-silicon via (TSV) line at its M15 fab in Cheongju, while its next-gen M15X fab will produce next-gen DRAM. TSVs are microscopic vertical interconnects etched through DRAM dies so they can be stacked and wired together into HBM packages, so we should expect SK hynix to build its future HBM on this new TSV line.