AMD's new Instinct MI300X is designed for generative AI and is a big play for the company to compete with NVIDIA's H100 and Grace Hopper in high-performance computing. And it's a beast: the MI300X is a pure GPU built on the new CDNA 3 architecture, while its sibling, the Instinct MI300A, is an APU that pairs CDNA 3 with up to 24 'Zen 4' CPU cores.
It uses AMD's groundbreaking chiplet design, with eight GPU chiplets and eight memory stacks built on 5nm and 6nm processes, and it supports up to 192 GB of HBM3 memory. But it will also be one of the most power-hungry GPU releases we've seen: footnotes on AMD's slides rate a single fully decked-out module at 750W.
Comparatively speaking, the NVIDIA H100 SXM GPU goes up to 700W, which was already big news for a power-hungry data center chip. However, NVIDIA also offers a 350W PCIe version of its chip with traditional air cooling, while AMD is set to offer the lower-specced MI300A.
With NVIDIA and AMD measuring power draw using different methods, this might not be a like-for-like comparison, but it looks like the AMD Instinct MI300X will suck a lot of juice when it's doing its AI thing. And with the AMD Instinct Platform set to feature eight MI300Xs, that's a potential 6 kW for the accelerators alone.
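That 6 kW figure is just the per-module rating multiplied out; a quick back-of-envelope sketch (the 750W rating and eight-module count come from AMD's materials, the script itself is purely illustrative arithmetic):

```python
# Back-of-envelope: aggregate accelerator power for an eight-GPU
# AMD Instinct Platform, using the rated figures from AMD's slides.
MI300X_TDP_W = 750          # per-module peak rating (AMD slide footnote)
MODULES_PER_PLATFORM = 8    # the Instinct Platform packs eight MI300Xs

platform_power_kw = MI300X_TDP_W * MODULES_PER_PLATFORM / 1000
print(f"Accelerators alone: {platform_power_kw} kW")  # → 6.0 kW
```

Note this counts only the accelerator modules; host CPUs, memory, networking, and cooling push the real system draw higher still.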
The MI300X also means we're inching closer to single GPUs consuming 1 kW of power as new generations boost specs and performance. Compared to the previous-generation AMD Instinct MI250X, which is based on the CDNA 2 GPU architecture, you're looking at a jump from 560W to 750W for the new Instinct MI300X. This isn't a bad thing per se, as AMD is touting an 8X performance increase for AI workloads, which is impressive, and makes the chip more power-efficient than the previous generation once you factor that in.
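Taking AMD's touted 8X figure at face value, the efficiency math works out roughly like this (a rough sketch; the 8X speedup is AMD's marketing claim for AI workloads, not an independent benchmark):

```python
# Rough generation-over-generation perf-per-watt comparison,
# using the rated TDPs and AMD's claimed AI speedup.
MI250X_TDP_W = 560   # previous-gen CDNA 2 accelerator
MI300X_TDP_W = 750   # new CDNA 3 accelerator
PERF_GAIN = 8.0      # AMD's claimed AI-workload speedup vs. MI250X

power_ratio = MI300X_TDP_W / MI250X_TDP_W      # ~1.34x more power
perf_per_watt_gain = PERF_GAIN / power_ratio   # ~6x better perf/W
print(f"Power up {power_ratio:.2f}x, perf/W up {perf_per_watt_gain:.1f}x")
```

So even at 750W, the claimed numbers imply roughly a 6X improvement in performance per watt over the MI250X.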
Plus, with 192 GB of HBM3 memory, you can do more with fewer GPUs, which could reduce the cost of running AI models on the MI300X versus older hardware. Still, once you see the 750W number, you realize why so many companies are investing in new data center cooling and power delivery methods.