NVIDIA's shortage of Hopper H100 AI GPUs is easing, with the previous four-month wait now down to 8-12 weeks.
It was just a few months ago that we reported NVIDIA AI GPU shipments had been "greatly accelerated," according to analysts, with waiting times for AI GPU deliveries dropping from 8-11 months to just 3-4 months. Now that 3-4 month wait has shrunk to 2-3 months.
In a new report from TrendForce, Dell is reportedly capitalizing on AI, with Dell Taiwan's General Manager saying on April 9 that the company is seeing stronger server orders and demand in the Taiwanese market, a surge driven by AI needs within Taiwan's own corporate sector.
NVIDIA has faced challenges with the complexity of its latest AI GPUs, which require TSMC's advanced CoWoS packaging technology; the Taiwanese contract chip manufacturer is quickly expanding its CoWoS capacity to keep up. Since the start of 2024, lead times for NVIDIA's Hopper H100 AI GPU have been falling steadily, and the trend continues to accelerate: from 11 months down to 2-3 months today.
- Read more: NVIDIA AI GPU customers 'offloading' chips, selling hard-to-buy excess AI GPU hardware
- Read more: NVIDIA's supply chain has 'massive improvement' in AI GPU deliveries, quicker deliveries
- Read more: Elon Musk says training next-gen Grok 3 will require 100,000 NVIDIA H100 AI GPUs
We've even reported that some NVIDIA AI GPU customers are offloading the expensive chips, and AWS (Amazon Web Services) has made it easier to rent NVIDIA H100 AI GPUs through the cloud, which helps take some of the strain off H100 supply elsewhere.
NVIDIA needs to get more and more H100 AI GPUs into the wild, but it also has its beefed-up H200 AI GPU with ultra-fast HBM3E right around the corner, while its next-generation Blackwell B200 AI GPU was just unveiled and will ramp into production later this year. Elon Musk recently said that training next-gen Grok 3 will require 100,000 NVIDIA H100 AI GPUs, which is... uh, a lot.