AI startup: AMD Instinct MI300X is 'far superior' option to NVIDIA Hopper H100 AI GPU

CEO of AI startup TensorWave says that AMD's new Instinct MI300X AI accelerator is a far better option for AI workloads than NVIDIA's H100 AI GPU.


AI startup TensorWave is one of the first companies to publicly deploy a setup powered by AMD Instinct MI300X AI accelerators, and its CEO says they're a far better option than NVIDIA's dominant Hopper H100 AI GPU.

TensorWave has started racking up AI systems powered by AMD's just-released Instinct MI300X AI accelerator, and it plans to lease the MI300X chips at a fraction of the cost of NVIDIA's Hopper H100 AI GPU. TensorWave plans to have 20,000 of AMD's new Instinct MI300X AI accelerators deployed across two facilities before the end of the year, and aims to bring liquid-cooled systems online in 2025.

Jeff Tatarchuk said of AMD's new Instinct MI300X AI GPU that in "just raw specs, the MI300X dominates H100," and he's not wrong. But that comparison is against the original H100 with 80GB of HBM3, while the beefed-up H200 packs a much larger 141GB of ultra-fast HBM3e memory with up to 4.8TB/sec of memory bandwidth. And once Blackwell B200 is here, NVIDIA owns the AI GPU game with 192GB of HBM3e at 8TB/sec of memory bandwidth on the B200 AI GPU.

  • AMD Instinct MI300X: 192GB HBM3 @ up to 5.3TB/sec
  • NVIDIA H100: 80GB HBM3 @ up to 3.35TB/sec
  • NVIDIA H200: 141GB of HBM3e @ up to 4.8TB/sec
  • NVIDIA B200: 192GB of HBM3e @ up to 8TB/sec

AMD has the most VRAM on an AI GPU so far, with NVIDIA lagging behind at 80GB for the H100 -- unless you're in China, with access to 96GB H100 models -- and even the upcoming H200 tops out at 141GB of HBM3e. Yes, it's HBM3e and has more memory bandwidth than the H100, but it still offers less memory, and slower memory, than AMD's Instinct MI300X.
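To put those capacity numbers in context, here's a minimal back-of-the-envelope sketch (the 180B-parameter model and the helper function are hypothetical illustrations, not anything from TensorWave or AMD) showing how per-card memory translates into the number of accelerators needed just to hold a model's FP16 weights:

```python
import math

# Back-of-the-envelope sketch: how many cards are needed just to hold a model's
# weights, using each accelerator's on-package memory from the list above.
# Assumes 2 bytes per parameter (FP16/BF16) and ignores KV cache, activations,
# and framework overhead, so real deployments need extra headroom.

GPU_MEMORY_GB = {
    "AMD Instinct MI300X": 192,
    "NVIDIA H100": 80,
    "NVIDIA H200": 141,
    "NVIDIA B200": 192,
}

def min_gpus_for_weights(params_billions: float, mem_gb: float, bytes_per_param: float = 2.0) -> int:
    """Smallest number of cards whose combined memory fits the raw weights."""
    weight_gb = params_billions * bytes_per_param  # 1B params at 2 bytes/param ~= 2GB
    return max(1, math.ceil(weight_gb / mem_gb))

# Hypothetical 180B-parameter model in FP16 (~360GB of weights alone):
for name, mem in GPU_MEMORY_GB.items():
    print(f"{name}: {min_gpus_for_weights(180, mem)} card(s) for weights alone")
```

By that rough measure, a model with ~360GB of FP16 weights fits on two MI300X (or B200) cards but needs five 80GB H100s before any KV cache or activation memory is counted, which is why per-card capacity matters so much for serving large models.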

But it's not just about raw hardware and VRAM for AI workloads; the accelerators also need to deliver the same real-world performance as NVIDIA's dominant H100 AI GPUs. Tatarchuk says there's a lot of enthusiasm around AMD's new Instinct MI300X AI accelerators as a great alternative to NVIDIA, but customers aren't sure whether they'll get the same performance.

He said: "there's also a lot of 'we're not 100% sure if it's going to be as great as what we're currently used to on NVIDIA,'" which is true.


Anthony joined the TweakTown team in 2010 and has since reviewed hundreds of graphics cards. Anthony is a long-time PC enthusiast with a passion of hate for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
