History Lesson on Fiji -- The Last 5 Years of Radeon
We've seen AMD move away from its head-first dive into the world of High Bandwidth Memory (HBM): Fiji used first-generation HBM, Vega moved to HBM2, and then from Vega to Navi the company shifted to GDDR6 memory. This has been a big deal for AMD, alongside the huge change from the GCN (Graphics Core Next) architecture to the RDNA architecture that powers Navi (and the PlayStation 5 and Xbox Series X consoles).
I exclusively reported back in June 2018 that AMD would be powering the next-gen Sony PlayStation 5 console with a semi-custom design based on the Zen CPU and Navi GPU architectures. Since then, AMD has really turned things around -- but first, a history lesson up until that point.
AMD's CPU range has gone from 'not really worth bothering with, Intel is so much superior' to 'uh yeah, so which Ryzen CPU did you buy this time'. Oh, how life has changed for AMD -- it now commands over 50% of premium CPU sales worldwide, and it's not just Ryzen: Threadripper and EPYC are knocking down door after door at Intel.
Radeon, on the other hand? This is where the idea of re-looking at the Radeon R9 Fury X (my review here) came from. It has been nearly 5 years to the day since AMD released the Radeon R9 Fury X -- and the world has changed (in so many other ways, too). NVIDIA has been completely dominant since 2015 -- hell, more dominant than ever.
NVIDIA has gone from strength to strength with its graphics card launches, no matter the hype and the anger over the pricing of the Turing-based GeForce RTX 2080 Ti (my review here). They flew off shelves because gamers were, and will always be, thirsty for the best of the best. AMD hasn't had a best-of-the-best graphics card for a very long time -- and at the time, the Radeon R9 Fury X was meant to be just that: the best of the best.
And we all know how that played out.
The Radeon R9 Fury X was a $649 mess, saddled with a pre-installed AIO cooler because the card ran so hot -- and a million other issues. It was a good performer, but it didn't keep NVIDIA up at night, either. NVIDIA responded, and kept responding, pushing AMD into Vega.
We all know how Vega went.
AMD released the Radeon RX Vega 56 and Radeon RX Vega 64 (my review here), and the launch was so-so at best. They weren't popular, and really only dominated in cryptocurrency mining -- which, at the time, was absolutely exploding. The crazy HBM2 tech on the Vega GPU was never really utilized well, and then GDDR5X and eventually GDDR6 came along and knocked its doors down.
NVIDIA kept responding, and kept responding -- pushing down harder on AMD until the company nailed the next-gen console contracts. Those semi-custom design wins allowed AMD to flex some of its muscle with RDNA and the Navi GPU architecture. The shift to GDDR6 was a big win for graphics card fans, enthusiasts, and gamers alike, as it will be the lifeblood of the next-gen consoles and the next few waves of graphics cards.
NVIDIA is already pushing the boundaries of near 20Gbps on current GDDR6 tech, and we should expect AMD to get far closer to that with RDNA2-based Navi 2X graphics cards later this year.
But at the time, AMD pushed and pushed that HBM was going to be the savior of everything -- and that HBCC would knock our socks off.
Once again, we all know how that went -- it was all marketing BS.
Or was it?
HBCC has no benefits for PC gamers outside of a few benchmarks and resolutions where it provides a few percent more performance, but now we're seeing something very similar play out on next-gen consoles. The mix of PCIe 4.0 connectivity, a super-charged NVMe SSD, and super-fast GDDR6 acting as huge chunks of ridiculously fast cache? That sounds like HBCC... or at least something close to it.
NVIDIA is also rumored to unveil something called NVCache -- which sounds much like HBCC -- with its next-gen Ampere GPU-based GeForce RTX 3080 Ti and other RTX 3000 series cards, when it unveils the new family of graphics cards in September 2020.
After that, AMD shed most of its Radeon Technologies Group -- a team created to operate semi-independently from AMD as a whole. The team disbanded after Vega failed to make the waves it promised following Fiji (and the Radeon R9 Fury X that I'm re-looking at today).
RTG disbanded and many of its people went off to work for Intel -- the most notable of them being GPU architect Raja Koduri -- on its new adventures into discrete GPUs with its Xe architecture. Others left AMD for different companies in the months that followed, and now we have a new team at AMD that hasn't really had the band together -- and won't get the chance because of COVID-19.
AMD will have a mystery release later this year for RDNA2 (more on that here), which should eventuate in the new Radeon RX 6000 series cards. My sources have told me there will be no media event for this release -- the first time in over a decade (or more) that no media event will be held for a new family of graphics cards.
Without further ado, let's move into the rest of our re-look at the Radeon R9 Fury X graphics card and see how it keeps up in 2020.
Last updated: Jun 8, 2020 at 01:46 am CDT