AMD's next-gen Instinct MI400 GPU confirmed: rocks 432GB of HBM4 at 19.6TB/sec ready for 2026

AMD confirms its next-gen Instinct MI400 AI GPU will double the compute performance of the MI350 series, pack a huge 432GB of next-gen HBM4 at 19.6TB/sec, and launch in 2026.

Gaming Editor
TL;DR: AMD's upcoming Instinct MI400 AI accelerator, launching in 2026, will double AI compute performance over the MI350 series, featuring 432GB of next-gen HBM4 memory and 19.6TB/sec bandwidth. With up to 4 XCDs and enhanced interconnects, it aims to compete strongly with NVIDIA's Rubin R100 GPU.

AMD has just teased its next-gen Instinct MI400 AI accelerator, which will double the AI compute performance of the just-announced MI350 series, offer 50% more memory, and deliver close to 2.5x the memory bandwidth thanks to next-gen HBM4 memory.


The company shared some fresh details on its next-gen Instinct MI400 series AI accelerator: 40 PFLOPs of FP4 and 20 PFLOPs of FP8 compute, double the AI compute of the new Instinct MI350 that launched today. The MI400 will also boast 50% more memory capacity than the MI350: where the MI350 carries 288GB of HBM3E, the MI400 packs a huge 432GB of HBM4 memory.

AMD's adoption of the new HBM4 standard brings the company to full competitiveness with NVIDIA, which will use HBM4 on its upcoming Rubin R100 AI GPU. The MI400's 432GB of HBM4 offers a huge 19.6TB/sec of memory bandwidth, up from just 8TB/sec on the MI350 series, and the new AI GPU will also sport 300GB/sec of scale-out bandwidth per GPU. We should expect big things from the MI400 in 2026 as it battles Rubin R100 in the HBM4-powered AI fight.
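The generational gains quoted above can be sanity-checked with a few lines of arithmetic (a quick sketch using the announced figures; the MI350's 20 PFLOPs FP4 number is inferred from the "double the compute" claim, not stated directly by AMD):

```python
# Announced (or inferred) specs for AMD's MI350 and MI400 AI accelerators
mi350 = {"fp4_pflops": 20, "memory_gb": 288, "bandwidth_tbs": 8.0}
mi400 = {"fp4_pflops": 40, "memory_gb": 432, "bandwidth_tbs": 19.6}

# Generation-over-generation ratios
compute_gain = mi400["fp4_pflops"] / mi350["fp4_pflops"]        # 2.0x compute
memory_gain = mi400["memory_gb"] / mi350["memory_gb"]            # 1.5x (50% more memory)
bandwidth_gain = mi400["bandwidth_tbs"] / mi350["bandwidth_tbs"]  # 2.45x bandwidth

print(f"Compute: {compute_gain:.2f}x, memory: {memory_gain:.2f}x, "
      f"bandwidth: {bandwidth_gain:.2f}x")
```

The bandwidth ratio of 19.6 / 8 = 2.45x is where the article's "close to 2.5x" figure comes from.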

AMD's new Instinct MI400 AI accelerator will feature up to 4 XCDs (Accelerated Compute Dies) per AID, double the MI300's 2 XCDs per AID. We should also expect two AIDs (Active Interposer Dies) on the Instinct MI400 series, along with separate multimedia and I/O dies.

Each AID pairs with a dedicated MID tile, enabling more efficient communication between the compute units and I/O interfaces than on AMD's previous-gen Instinct AI accelerators.

AMD will launch its next-gen Instinct MI400 series AI accelerators in 2026 and we'll be keeping an eye on them until then.

News Source: wccftech.com



Anthony joined TweakTown in 2010 and has since reviewed hundreds of tech products. Anthony is a long-time PC enthusiast with a passionate hatred for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
