The next NVIDIA GPU shortage might arrive due to AI models like ChatGPT

ChatGPT was trained on thousands of NVIDIA GPUs, and with its popularity exploding, it may require many more to keep up.


The rising popularity of the AI-based language and text tool ChatGPT has been one of the most talked-about tech stories of 2023, and it's indicative of the boom in AI business and investment. AI tools like ChatGPT rely on GPU-accelerated processing - a key part of NVIDIA's portfolio, even in the consumer and enthusiast GPU space.

NVIDIA Ada Lovelace architecture

This is where GeForce RTX graphics cards, like the new GeForce RTX 4090 and RTX 4080, currently sit. NVIDIA's stock has recently seen massive gains in the region of 40%, driven by increased demand for AI powerhouses like the Hopper H100 and Ampere A100 data-center GPUs.

The explosion of interest in ChatGPT, in particular, is an interesting case, as the model was trained on NVIDIA GPUs - reports indicate it took around 10,000 of them to train the version we see today - and it will need to scale rapidly to keep up with demand for the tool. On this subject, Fierce Electronics asked ChatGPT directly whether hardware from other vendors would factor into its growth, and got the following response.

"It is possible that ChatGPT or other deep learning models could be trained or run on GPUs from other vendors in the future. However, currently, NVIDIA GPUs are widely used in the deep learning community due to their high performance and CUDA support. CUDA is a parallel computing platform and programming model developed by NVIDIA that allows for efficient computation on NVIDIA GPUs. Many deep learning libraries and frameworks, such as TensorFlow and PyTorch, have built-in support for CUDA and are optimized for NVIDIA GPUs."

With Microsoft integrating ChatGPT into its Bing search engine and Google looking to follow suit, Forbes offered a "fun thought experiment" estimating that this would require an additional 4,102,568 A100 GPUs, to the tune of over USD 100 billion - a big chunk of which would go to NVIDIA. Of course, scaling brings improvements to efficiency and processes, so it's not a 1:1 scenario requiring countless new GPUs.
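As a rough sanity check on that headline figure, the arithmetic lines up if you assume a unit price of roughly USD 25,000 per A100 - an assumption for illustration, not a number from the Forbes piece:

    # Back-of-envelope check on the Forbes thought experiment.
    # Only the GPU count comes from the article; the per-unit price is assumed.
    additional_a100_gpus = 4_102_568
    assumed_price_per_a100_usd = 25_000  # hypothetical, for illustration only

    total_cost_usd = additional_a100_gpus * assumed_price_per_a100_usd
    print(f"~${total_cost_usd / 1e9:.1f} billion")  # prints ~$102.6 billion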

Regardless, as pointed out by Wccftech, this could see NVIDIA prioritizing its AI GPU business in the near future, and limited supply could lead to shortages of enthusiast and consumer GPU products from Big Green. Cards like the NVIDIA GeForce RTX 4090 could offer AI firms and researchers a comparatively low-cost alternative to data-center hardware, driving prices up as demand exceeds supply.

So, basically, the mining boom, except now it's AI.


Kosta is a veteran gaming journalist who cut his teeth on well-respected Aussie publications like PC PowerPlay and HYPER back when articles were printed on paper. A lifelong gamer since the 8-bit Nintendo era, it was the CD-ROM-powered 90s that cemented his love for all things games and technology - from point-and-click adventure games to RTS games with full-motion video cut-scenes and FPS titles referred to as Doom clones, genres he still loves to this day. Kosta is also a musician, releasing dreamy electronic jams under the name Kbit.
