Morgan Stanley has said that the global supply chain will see a huge boost in AI server orders from the industry, with NVIDIA clawing even more market share to dominate the AI industry further.
NVIDIA is expected to ship between 60,000 and 70,000 units of its Blackwell GB200 AI servers, which could bring in up to $210 billion. Each server costs $2 million to $3 million, so at the top end, 70,000 GB200 AI servers at $3 million each = $210 billion. NVIDIA is making the NVL72 and NVL36 GB200 AI servers, as well as standalone B100 and B200 AI GPUs.
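The revenue math above can be sketched quickly. The unit and price figures are the ones reported in the Morgan Stanley estimate; the low end of the range is implied rather than stated outright:

```python
# Revenue estimate for GB200 AI server shipments, using the figures cited
# in the Morgan Stanley estimate above. The low-end product is an implied
# figure, not one quoted in the report.
units_low, units_high = 60_000, 70_000          # expected GB200 server shipments
price_low, price_high = 2_000_000, 3_000_000    # cost per server, in USD

revenue_low = units_low * price_low             # conservative end
revenue_high = units_high * price_high          # top end: 70,000 x $3M

print(f"${revenue_low / 1e9:.0f}B to ${revenue_high / 1e9:.0f}B")  # $120B to $210B
```

The $210 billion headline figure is the top end of that range; the bottom end would be closer to $120 billion.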
Morgan Stanley estimates that if NVIDIA's new NVL36 AI cabinet is the biggest seller by quantity, overall demand for GB200 in 2025 will increase to 60,000 to 70,000 units. The big US cloud companies are the customers lining up to purchase -- or that have already purchased -- NVL36 AI cabinets, but by 2025 we could expect NVL72 AI cabinets to be shipping in higher quantities than NVL36.
On top of that, B200 HGX could start selling more units earlier than expected, as soon as Q4 2024, because it is "easier to upgrade," reports UDN. The report also points out that NVIDIA's new GB200 Bianca computing board reportedly has an overheating problem, which could cause shipping delays for the GB200 Bianca NVL36 server rack.
- Read more: NVIDIA places fresh new orders with TSMC for more Blackwell GB200, B100, B200 AI chips
- Read more: NVIDIA is using Foxconn as the sole supplier of NVLink switches for next-gen GB200 AI servers
- Read more: NVIDIA's new GB200 AI servers led by Foxconn with 40% and Quanta with 30%: ships in Q3 2024
- Read more: NVIDIA's next-gen GB200 AI server cabinets to ship in 'small quantities' in Q4 2024
- Read more: NVIDIA's new GB200 Superchip costs up to $70,000: full B200 NVL72 AI server costs $3 million
Foxconn, which runs the main assembly plant for Bianca, didn't comment on the overheating reports, reiterating that the GB200 AI server rack system wouldn't start shipping until the end of Q3 2024. Overheating may be one of the supply chain issues ahead of mass production, but it's expected to be smoothed over in the next 2-3 months.
Morgan Stanley is also expecting CoWoS production capacity at TSMC to reach 30,000 to 35,000 pieces per month, and to roughly double to 60,000 to 70,000 pieces per month by the end of 2025, which is far higher than the 50,000 pieces per month originally predicted by the market researcher.
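The capacity projection above can be checked against the earlier market forecast; all figures are from the Morgan Stanley estimate, and the doubling is implied by the reported ranges rather than stated as an exact multiplier:

```python
# TSMC CoWoS capacity (pieces per month), per the Morgan Stanley figures above.
near_term = (30_000, 35_000)      # expected near-term monthly capacity
end_2025 = (60_000, 70_000)       # projected capacity by end of 2025
market_forecast = 50_000          # earlier market-researcher prediction

# The end-2025 range is exactly double the near-term range.
assert end_2025 == tuple(2 * x for x in near_term)

# How far the projection exceeds the original market forecast:
print(end_2025[0] - market_forecast, end_2025[1] - market_forecast)  # 10000 20000
```

Even the low end of the end-2025 projection would beat the original 50,000-piece forecast by 10,000 pieces per month.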
- Read more: NVIDIA DGX GB200 AI servers expected to sell 40,000 servers in 2025
- Read more: NVIDIA's next-gen GB200 AI server chips go into mass production in September
- Read more: Quanta to make NVIDIA GB200-based AI servers for Google, Amazon, and Meta
- Read more: NVIDIA GB200 Grace Blackwell Superchip: 864GB HBM3E, 16TB/sec bandwidth
- Read more: NVIDIA's full-spec B200 AI GPU uses 1200W of power, Hopper H100 uses 700W
TSMC is pushing out as much CoWoS advanced packaging capacity as it can right now, with NVIDIA expected to purchase 340,000 pieces of CoWoS production capacity from TSMC, and the remaining 40,000 pieces from other foundries, for its Blackwell AI GPUs.