AMD's next-gen Instinct MI450X 'forced' NVIDIA to increase TGP, memory bandwidth on Rubin GPUs

AMD's next-gen Instinct MI450X AI accelerator has reportedly forced NVIDIA to make changes to its next-gen VR200 Rubin AI GPUs, with more power and more memory bandwidth.

TL;DR: AMD's next-gen Instinct MI450X AI accelerator has pushed NVIDIA to enhance its Rubin VR200 AI GPU, increasing memory bandwidth to 20TB/sec and power to 2300W TGP. Both companies are advancing high-performance AI GPUs with massive HBM4 memory, intensifying competition in the 2026 AI accelerator market.

AMD's next-gen Instinct MI450X AI accelerator has reportedly "forced" NVIDIA to make changes to its Rubin VR200 AI GPU.

In a new post on X, SemiAnalysis reports rumors that in order for NVIDIA's new Rubin AI GPUs to maintain a lead over AMD's upcoming Instinct MI450X series AI chips, VR200 Rubin had its HBM4 memory bandwidth increased to 20TB/sec per GPU (up from 13TB/sec per GPU). That takes Rubin from sitting 5TB/sec per GPU behind the MI450X in memory bandwidth to just ahead, with 0.4TB/sec per GPU more bandwidth.

Not only that, VR200 Rubin was previously an 1800W TGP design, but two months ago it was bumped up to 2300W TGP, closer to the Instinct MI450X and its even higher 2500W TGP. These new AI GPUs are thirsty... very, very thirsty.

We did hear rumors of NVIDIA's new Rubin AI GPUs being delayed last month (more on that in the story below), with reports saying that after Rubin taped out in June 2025, NVIDIA was redesigning the chip to better match AMD's new Instinct MI450 AI accelerators. The reports said: "we think the Rubin chip will likely hit 2000W per chip, compared to 1800W when it was announced earlier".

AMD's new Instinct MI400 series AI accelerators will feature up to a massive 432GB of next-gen HBM4 memory, compared to NVIDIA's upcoming Rubin R100 with 384GB of HBM4, while its beefier Rubin Ultra AI GPU offering will have 576GB of HBM4 memory. SK hynix, Samsung, and Micron would be loving all of the HBM4 used on next-gen AI chips, as there are only a handful of HBM manufacturers on the planet, and 2026 is truly the year of HBM(4).

Gaming Editor


Anthony joined TweakTown in 2010 and has since reviewed hundreds of tech products. Anthony is a long-time PC enthusiast with a passionate hatred of games built around consoles. FPS gaming since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
