When it comes to the AI boom, we tend to think of GPUs, data center CPUs, and companies like NVIDIA. Specifically NVIDIA, whose H100 and upcoming Blackwell B200 GPUs and Superchips sit at the heart of the generative AI movement, to the point where AI-related GPU revenue has made it one of the biggest companies in the world.
But there's more to AI than Tensor Cores and high-end silicon from Team Green; high-speed memory and storage are just as essential for building the massive data centers and supercomputers that train complex models like Meta's recent Llama 3.1.
So, it shouldn't come as a surprise that the DRAM and NAND Flash industries are expected to see "significant increases" in revenue this year. According to market analysis and reporting, DRAM (memory) and NAND Flash (storage) revenue for 2024 will increase by 75% and 77%, respectively.
Money-wise, DRAM revenue is projected to reach $90.7 billion in 2024 and then grow another 51% in 2025 to $136.5 billion. Part of this comes from demand for High-Bandwidth Memory (HBM), rising DRAM prices, and the shift toward more complex (read: more expensive) DRAM products like DDR5 memory. For example, even though HBM is projected to make up only 5% of total DRAM shipments this year, it will account for 20% of all DRAM revenue.
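For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python; the dollar figures and shipment/revenue shares are the ones cited above, and everything else is simple arithmetic:

```python
# Back-of-the-envelope check on the reported DRAM figures.
dram_2024 = 90.7   # projected 2024 DRAM revenue, $ billions
dram_2025 = 136.5  # projected 2025 DRAM revenue, $ billions

implied_growth = (dram_2025 / dram_2024 - 1) * 100
print(f"Implied 2024 -> 2025 DRAM growth: {implied_growth:.1f}%")  # ~50.5%, in line with the cited ~51%

# HBM: ~5% of DRAM shipments but ~20% of DRAM revenue in 2024,
# i.e. roughly 4x the revenue per shipment of the average DRAM product.
hbm_revenue_share = 0.20
hbm_shipment_share = 0.05
print(f"HBM revenue per shipment vs. average: ~{hbm_revenue_share / hbm_shipment_share:.0f}x")
```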
NAND Flash revenue, which covers SSD storage devices, is expected to reach $67.4 billion in 2024 and $87 billion in 2025. Growth will be driven by QLC enterprise SSDs used in AI servers, workstations, and data centers. Of course, areas outside of AI will also contribute to NAND Flash growth, one being Apple's plan to adopt QLC storage in its iPhones by 2026.
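The same quick check works for the NAND Flash projections (again, only the figures cited above are assumed):

```python
# Implied growth from the reported NAND Flash revenue projections.
nand_2024 = 67.4  # projected 2024 NAND Flash revenue, $ billions
nand_2025 = 87.0  # projected 2025 NAND Flash revenue, $ billions
print(f"Implied 2024 -> 2025 NAND Flash growth: {(nand_2025 / nand_2024 - 1) * 100:.0f}%")  # ~29%
```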
There's an upside and a downside to all of this. The upside is more cash flow to invest in new technologies and practices; the downside is that increased demand will also strain the supply of raw materials required to manufacture all of this tech.
- Read more: SK hynix exports of SSDs rose 84% in 1H 2024: big increases for AI data centers
- Read more: Micron intros 9550 Gen5 SSD: the world's fastest data center SSD, 14GB/sec for AI workloads
- Read more: HighPoint teases industry's first PCIe 5.0 x16 NVMe storage: up to 2PB capacity, 60GB/sec reads
- Read more: SSDs with 1000-layer memory chips expected in 2027: ultra-fast 20TB NVMe drives for $250