NVIDIA and OpenAI are reportedly aiming to create a technological marvel by combining the power of more than a million NVIDIA AI GPUs with software developed by OpenAI that would link them all together.

For those who don't know, AI systems such as OpenAI's ChatGPT are powered by thousands of NVIDIA AI GPUs. According to reports, NVIDIA has supplied approximately 20,000 of its AI GPUs to OpenAI, but this won't be enough if the company wishes to keep up with its ever-expanding language models, which require more and more computing power.
According to Wang Xiaochuan, a businessman and founder of the Chinese search engine Sogou, OpenAI is already developing an even more advanced AI computing system with a total capacity of 10 million AI GPUs. While 10 million NVIDIA AI GPUs sounds astronomical (and it is), actually reaching that number in physical hardware is another matter entirely.
According to reports, NVIDIA is only able to produce around one million of its AI GPUs every year, which means that at the current rate it would take 10 years to reach the total capacity of OpenAI's new advanced AI system.
However, even if NVIDIA is able to produce a fraction of what OpenAI needs, the sheer computing power will be hard to fathom, especially considering NVIDIA's interconnect technology, which links all of the GPUs together into one eye-wateringly powerful system.
In other artificial intelligence news, the developers behind ChatGPT have pulled their AI classifier tool from service. For more information on this story, check out the link below.