Oracle has announced it is spending over $100 billion on more than 2,000 new data centers, expanding on the 160 it already operates, with NVIDIA reportedly taking around 40% of that spend for AI hardware. On top of that, not one, not two, but three nuclear reactors could power the new Blackwell AI GPU supercluster.
NVIDIA and Oracle will launch zettascale OCI Superclusters with over 100,000 AI GPUs, along with new infrastructure to accelerate the training and deployment of generative AI models. NVIDIA GB200 liquid-cooled bare-metal instances for large-scale AI applications will also be introduced, and Oracle will offer NVIDIA HGX H200 Tensor Core GPU instances connecting up to 65,536 AI GPUs for real-time inference.
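As a rough sanity check on the "zettascale" label, a back-of-envelope calculation shows how the GPU count gets there. The per-GPU throughput figure below is an assumption for illustration (roughly what Blackwell-class GPUs deliver at low precision), not a number from the announcement:

```python
# Back-of-envelope check of the "zettascale" claim, assuming (hypothetically)
# ~10 petaFLOPS of low-precision AI throughput per Blackwell-class GPU.
# Neither figure comes from Oracle's announcement; this is only a sketch.

PETA = 1e15
ZETTA = 1e21

gpus = 100_000             # "over 100,000 AI GPUs" per the announcement
flops_per_gpu = 10 * PETA  # assumed low-precision (e.g. FP8/FP4) peak per GPU

aggregate = gpus * flops_per_gpu
print(f"Aggregate peak: {aggregate / ZETTA:.1f} zettaFLOPS")
```

At those assumed figures, the cluster lands right at one zettaFLOPS of aggregate peak compute, which is where the "zettascale" branding comes from.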
Oracle chairman and CTO Larry Ellison said: "So we're in the middle of designing a data center that's north of a gigawatt -- but we found the location and the power place, and when we look at it, they've already got building permits for 3 nuclear reactors. These are the small modular nuclear reactors to power the data center. This is how crazy it's getting. This is what's going on".
- Read more: Elon Musk's new Memphis Supercluster uses gigantic portable power generators, grid isn't enough
Larry Ellison also talked about AI market growth, where he said: "I mean these AI models, these frontier models are going to -- the entry price for a real frontier model from someone who wants to compete in that area is about $100 billion. Let me repeat, around $100 billion. That's over the next 4, 5 years for anyone who wants to play in that game. That's a lot of money. And it doesn't get easier. So there are not going to be a lot of those. I mean, the list of who can actually build one of these frontier models is not a long one".
"But in addition to that, there are going to be a lot of very, very specialized models. I can tell you things that I'm personally involved in, which are using computers to look at biopsy slides or CAT scans to discover cancer. There are also blood tests for discovering cancer. Those tend to be very specialized models. They tend not to use the foundational models, the Groks and the ChatGPTs and the Geminis; they tend to be highly specialized models trained on image recognition on certain data -- I mean, literally millions of biopsy slides, for example -- and not much other training data is helpful".
He added: "So that goes on, and we'll see more and more applications look at that. So I wouldn't -- if your horizon is over the next 5 years, maybe even the next 10 years, I wouldn't worry about, hey, we've now trained all the models we need and all we need to do is inferencing. I think this is an ongoing battle for technical supremacy that will be fought by a handful of companies and maybe one nation state over the next 5 years at least, but probably more like 10. So this business is just growing larger and larger and larger. There's no slowdown or shift coming".