I've already benchmarked Cyberpunk 2077 at the gloriously beautiful 8K resolution on NVIDIA's flagship GeForce RTX 3090 graphics card, with its 24GB of GDDR6X memory.
In those benchmarks I ran 8K with ray tracing both enabled and disabled, as well as DLSS (Deep Learning Super Sampling) both on and off. The GeForce RTX 3090 delivered some great results, hitting 31FPS average at 8K with DLSS set to "Ultra Performance" mode.
DLSS harnesses the power of AI to super-boost in-game performance, letting you run resolutions like 4K at 60FPS or even 120FPS in the recent DLSS 2.0-powered update for Call of Duty: Warzone. At 8K you well and truly need the help of DLSS in Cyberpunk 2077... but what about AMD? What about Radeon RX gamers?
Well, that's why we're here today -- DLSS is NVIDIA's secret-sauce GPU black magic, and AMD doesn't yet have a DLSS competitor, although the rumored "FSR" or FidelityFX Super Resolution is coming later this year. For now, Cyberpunk 2077 has FidelityFX CAS (Contrast Adaptive Sharpening) with both "Dynamic" and "Static" presets.
It's not as good as DLSS, but it definitely helps performance at 4K and 8K in Cyberpunk 2077 on AMD Radeon RX 6000 series graphics cards. NVIDIA is well and truly ahead with its AI upscaling technology, especially in DLSS 2.0-enabled games like Call of Duty: Warzone and Cyberpunk 2077, where it makes NVIDIA the clear leader at ultra resolutions like 8K in bleeding-edge titles.
I ran into far more problems using CAS on my Radeon RX 6900 XT than I did on the GeForce RTX 3090 with DLSS enabled. There were visual artifacts at times, slowdowns, crashes, and a few other bugs that created headaches during my re-testing of Cyberpunk 2077 at 8K.
- Read more: Metro Exodus: Enhanced Edition Benchmarked at 8K
- Read more: Call of Duty: Warzone DLSS Benchmarked: 8K 60FPS on GeForce RTX 3090
- Read more: Cyberpunk 2077 Benchmarked at 8K: Future GPU Technology Required
- Read more: Death Stranding Benchmarked at 8K: DLSS 2.0 = GPU Cheat Codes
- Read more: Microsoft Flight Simulator Benchmarked at 8K: The New Crysis
- Read more: Gears 5 Benchmarked at 8K: eats up $2499 NVIDIA TITAN RTX
I will go into the issues I had later in the article. For this testing I used the ultra-enthusiast flagship Radeon RX 6900 XT graphics card (AMD reference design) against the tweaked MSI GeForce RTX 3090 SUPRIM X graphics card, which is barely a smidge (1-2FPS at 8K) faster than NVIDIA's own GeForce RTX 3090 Founders Edition.
But it's not only the big GPU battle between NVIDIA's DLSS and AMD's CAS technologies... it's also a battle for ray tracing dominance.
Cyberpunk 2077 recently received a patch that enabled ray tracing on Radeon graphics cards. But man, does ray tracing performance suck on Radeon right now -- NVIDIA has the clear upper hand with its Ampere GPU architecture and far superior RT performance, and even RDNA 2 doesn't come close.
Not even with CAS enabled and cranked to maximum does the Radeon RX 6900 XT compare to the GeForce RTX 3090 when they're running at 7680 x 4320... 8K is hard work for any GPU, but with ray tracing enabled too? Sheesh. GPU overload.
Enough with the prelude, let's dive right into Cyberpunk 2077 benchmarked at 8K with DLSS or CAS enabled, as well as ray tracing on both the flagship AMD Radeon RX 6900 XT and NVIDIA GeForce RTX 3090 graphics cards.
Cyberpunk 2077 Graphics Settings + Enabling DLSS or CAS
I had every graphical bell and whistle enabled in Cyberpunk 2077, using a custom preset with everything turned up to max. That means texture quality at High, field of view at 100, and motion blur disabled, of course. All of the shadow, fog, cloud, decal, ambient occlusion, and level of detail settings were maxed out at High or Ultra across the board.
On the ray tracing side of things in Cyberpunk 2077, I enabled ray tracing and then turned on ray-traced reflections and ray-traced shadows, and set ray-traced lighting to the "Psycho" preset.
Cyberpunk 2077 supports AMD FidelityFX CAS in both Dynamic and Static forms. Dynamic mode gives you a dynamic resolution that you set a minimum and maximum for -- the minimum setting is 50%, while the maximum is 100%.
There's also Static FidelityFX CAS which, when enabled, disables Dynamic FidelityFX CAS and gives you a resolution scaling slider you can tweak between 50% and 100%. I opted for Static FidelityFX CAS and ran it disabled, at 75%, and at 50%, with some great results coming out of it.
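To put those percentages in concrete terms, here's a rough sketch of how a resolution-scale slider maps to the internal render resolution at 8K. Note that exactly how Cyberpunk 2077 applies the scale internally isn't documented, so treat the per-axis linear scaling here as an assumption for illustration:

```python
# Sketch: internal render resolution for a given resolution-scale setting.
# Assumes the slider scales each axis linearly before the image is
# upscaled back to the 8K output -- an assumption, not the game's documented behavior.

def internal_resolution(scale_pct, width=7680, height=4320):
    """Return the (width, height) rendered internally at a given scale %."""
    factor = scale_pct / 100.0
    return int(width * factor), int(height * factor)

for pct in (100, 75, 50):
    w, h = internal_resolution(pct)
    print(f"{pct}% scale -> {w} x {h} ({w * h / 1e6:.1f} MP per frame)")
```

Under that assumption, the 50% setting would have the GPU rendering roughly a 4K image (3840 x 2160) and sharpening it up to 8K, which is why the performance gains are so large.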
Here are the graphics settings that I used:
Test System Specs
After MSI asked if I wanted to do some testing with their GeForce RTX 30 series cards and the Intel Core i9-10900K and MSI MEG Z490 Unify motherboard, I asked if they would like me to use other MSI products in the article.
They agreed, so the CPU cooler, PSU and case were on their way and the upgrade began.
MSI sent over their Z490 Unify motherboard, which is the heart and soul here -- taking in our Core i9-10900K processor, super-huge Sabrent Rocket Q 8TB NVMe SSD, and G.SKILL Trident Z Royal RAM.
I think I was most excited for MSI to send me their huge MPG Sekira 500X case, as what started as "hey, do you want an MSI Z490 Unify motherboard to do some testing on?" turned into a full-on MSI-exclusive system. The MSI MPG Sekira 500X is a freaking hefty beast of a case, but man does it keep the system nice and cool -- and quiet -- oh, and it looks dope, too.
I've got a hefty MSI MAG CORELIQUID 360R AIO liquid cooler keeping the Intel Core i9-10900K nice and chill, installed onto the MSI Z490 Unify motherboard. I have to say this is my first time using an MSI AIO cooler, and I love it -- the design really looks great inside the machine.
MSI wanted me to use their MPG A850GF power supply, which is another first for me -- I've never used an MSI PSU before, and I'm sure neither have you. MSI has only just entered the PSU market, so I've been using this 850W unit for all of the testing on this new MSI Z490 Unify + Core i9-10900K system.
On the graphics side of things I'm using the GeForce RTX 3090/3080 Founders Edition cards as well as the MSI GeForce RTX 3090 SUPRIM X -- some of the fastest Ampere cards on the market.
One of the most insane parts of this new MSI system is the 8TB Rocket Q NVMe SSD that our good friends at Sabrent sent over -- yeah, a huge 8TB super-fast NVMe SSD that pumps away at 3GB/sec.
- Read more: Sabrent Rocket Q NVMe 8TB SSD Review
G.SKILL helped in a big way by sending over some super-fast, and super-stylish Trident Z Royal DDR4 RAM. Installed into the MSI Z490 Unify motherboard is 16GB (8GB x 2) of G.SKILL Trident Z Royal DDR4-3600 memory (CL16-16-16-36).
Another important part of this rig is the flagship Intel Core i9-10900K processor, a beastly 10-core/20-thread CPU that runs at around 5GHz under gaming loads. That's plenty for the testing we'll be doing with it -- and the 8K testing on this new system should push that 10900K to the limit.
- Read more: Intel Core i9 10900K CPU Review
Benchmarks - 8K
As you can see, the GeForce RTX 3090 kills it at 8K in Cyberpunk 2077: with ray tracing enabled and DLSS set to Ultra Performance mode it hits 31FPS average, compared to 28FPS average on the Radeon RX 6900 XT with Static FidelityFX CAS set to 50%. That's not a bad result at all from a performance perspective.
Setting DLSS to Performance mode and Static FidelityFX CAS to 75%, performance is just as neck-and-neck as the Ultra Performance and Static FidelityFX CAS @ 50% results. The GeForce RTX 3090 is ahead again with 18.8FPS average, while the Radeon RX 6900 XT sits behind at 16.8FPS average.
Disabling DLSS and CAS completely sees performance dive off a cliff, but much more so for the GeForce RTX 3090 with just 6.6FPS average, while the Radeon RX 6900 XT fares noticeably better at 10.6FPS average. Either way, Cyberpunk 2077 is totally unplayable at native 8K without DLSS or CAS enabled.
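Average FPS can hide how far apart these results really are; converting the averages above into frame times (milliseconds per frame) makes the gaps clearer. A quick sketch using the figures measured in this article:

```python
# Convert the benchmark averages from this article into frame times.
# FPS -> ms/frame is simply 1000 / FPS.

results = {
    "RTX 3090, DLSS Ultra Performance": 31.0,
    "RX 6900 XT, Static CAS 50%":       28.0,
    "RTX 3090, DLSS Performance":       18.8,
    "RX 6900 XT, Static CAS 75%":       16.8,
    "RX 6900 XT, no CAS":               10.6,
    "RTX 3090, no DLSS":                 6.6,
}

for name, fps in results.items():
    frame_time_ms = 1000.0 / fps
    print(f"{name}: {fps:.1f} FPS = {frame_time_ms:.1f} ms/frame")
```

At 31FPS the RTX 3090 is turning frames around in roughly 32ms, while at 6.6FPS without DLSS each frame takes over 150ms -- which is exactly why native 8K feels like a slideshow.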
It's Not All Scores & Numbers
I'm a performance guy, I love numbers -- I want to see each GPU release crushing the other with features, technologies, and most of all PERFORMANCE. And on paper, AMD's flagship Radeon RX 6900 XT puts up somewhat decent numbers with CAS enabled: 28FPS average against the GeForce RTX 3090's 31FPS. It doesn't seem like the flagship Big Navi GPU is that far behind the flagship Ampere GPU.
But there are issues with CAS + ray tracing + Radeon GPUs: I experienced flickering all over the place, and performance would tank for 3-5 seconds or more before Cyberpunk 2077 settled down and benchmarking could begin. The visual glitches were way too much for me, to the point where I simply wouldn't play the game like that... it's just too annoying.
NVIDIA with DLSS enabled, on the other hand, is just fine.
Cyberpunk 2077 with ray tracing enabled is an ultra-gorgeous game at even 4K, let alone 8K -- alongside the likes of Microsoft Flight Simulator, it truly is the new Crysis.
If you want the ultimate experience playing Cyberpunk 2077 and own an 8K monitor or new HDMI 2.1-powered 8K TV then you will want to only play it on NVIDIA's flagship GeForce RTX 3090 graphics card. The GeForce RTX 3090 truly is the ultimate 8K gaming GPU.
I don't think we'll be playing Cyberpunk 2077 at 8K 60FPS until we see next-gen GPUs from NVIDIA in 2022, which is a pity because it is one of the best-looking games ever made. CD PROJEKT RED probably never made the game with the thought that people would be running it at 7680 x 4320 and rendering those insane 33 million pixels per frame... unless we wait until the year 2077 for truly next-gen GPUs.
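The pixel arithmetic shows just how brutal 8K is. A quick back-of-the-envelope sketch:

```python
# 8K pixel arithmetic: why 7680 x 4320 is such hard work for any GPU.

width, height = 7680, 4320
pixels_per_frame = width * height              # 33,177,600 -- ~33.2 million
pixels_4k = 3840 * 2160                        # ~8.3 million at 4K

print(f"{pixels_per_frame:,} pixels per frame at 8K")
print(f"{pixels_per_frame // pixels_4k}x the pixels of 4K")

# At a hypothetical 60FPS, the GPU would need to shade roughly 2 billion
# pixels every second -- before ray tracing adds its own cost on top.
print(f"{pixels_per_frame * 60 / 1e9:.2f} billion pixels per second at 60FPS")
```

That 4x jump over 4K, multiplied by per-pixel ray tracing work, is why even the fastest Ampere and RDNA 2 cards need DLSS or CAS rendering at a lower internal resolution to survive.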
AMD finally steps up and into the ray tracing world with Cyberpunk 2077, with the flagship Radeon RX 6900 XT falling just behind NVIDIA's flagship GeForce RTX 3090 at 8K with CAS enabled. Team Red's experience might be glitchier than Team Green's, but it shows AMD is nearly ready for ray tracing primetime... which should hopefully arrive with RDNA 3.