Watch Dogs: Legion has just launched with some pretty great graphics for an open-world game, and in between my reviews of NVIDIA's new GeForce RTX 30 series graphics cards and my testing of the RDNA 2-based Radeon RX 6800 and Radeon RX 6800 XT graphics cards, I thought I'd benchmark Watch Dogs: Legion.
I've got a stack of cards tested, including AMD's new Radeon RX 6800 XT and Radeon RX 6800 graphics cards as well as NVIDIA's new GeForce RTX 3090, GeForce RTX 3080, and GeForce RTX 3070 graphics cards. I've thrown in a few older cards from both Team Red and Team Green, all at 1080p, 1440p, and 4K on Ultra graphics.
The game itself weighs in at 41GB, but there is an HD texture pack you can download along with it, coming in at an additional 15GB. I've used this pack in the benchmarking below.
Ubisoft is using an optimized Disrupt engine, something that was born from parts of the AnvilNext and Dunia engines -- the engines behind Assassin's Creed and Far Cry 2 and 3, respectively. For Watch Dogs: Legion, Ubisoft took the open-world action-adventure foundation of the original Disrupt engine from the first Watch Dogs, then folded in the open-world city management from AnvilNext and the AI mechanics and vegetation from the Dunia Engine, resulting in a tweaked Disrupt engine.
The tweaked Disrupt engine uses dynamic simulation to model physical interactions realistically, and offers seamless online connectivity. Ubisoft Montreal also uses these tweaks to enable cross-gen and cross-platform play in Watch Dogs: Legion.
Disrupt supports DirectX 11 and DirectX 12, and is also capable of dynamically simulating particles and clutter that get blown around by the wind. The engine can also simulate electricity -- another great tweak that lets players create city-wide blackouts.
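To give a feel for what "dynamically simulating particles and clutter blown around by wind" means under the hood, here's a minimal, hypothetical sketch in Python -- this is not Ubisoft's code, just the general per-frame idea: every particle gets accelerated toward a wind vector, damped by drag, and moved.

```python
# Hypothetical sketch of wind-driven clutter simulation (not Disrupt's
# actual implementation): each frame, accelerate every particle toward
# the wind vector, apply simple air drag, then integrate position.

def step_particles(particles, wind, dt=1 / 60, drag=0.9):
    """Advance a list of ((px, py), (vx, vy)) pairs one frame under wind."""
    out = []
    for (px, py), (vx, vy) in particles:
        # Push velocity toward the wind, then damp it with drag.
        vx = (vx + wind[0] * dt) * drag
        vy = (vy + wind[1] * dt) * drag
        # Integrate position with the updated velocity.
        out.append(((px + vx * dt, py + vy * dt), (vx, vy)))
    return out

# Three pieces of clutter at rest, blown for ten frames by an easterly wind.
particles = [((0.0, 0.0), (0.0, 0.0)) for _ in range(3)]
for _ in range(10):
    particles = step_particles(particles, wind=(5.0, 0.0))
```

A real engine does this on the GPU for thousands of instances at once, but the core loop -- force, drag, integrate -- is the same shape.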
Test System Specs
I've recently upgraded the storage inside of my GPU test bed ahead of my testing of NVIDIA's Ampere graphics cards and AMD's RDNA 2 graphics cards. Sabrent helped out, sending over a slew of crazy-fast Rocket NVMe PCIe M.2 2280 SSDs -- including the huge Rocket Q 8TB, which is now my Games install SSD inside of my main test bed. Thanks to Sabrent, I've got 2TB of super-fast M.2 NVMe SSD storage in the GPU test bed as the new Games Storage drive, since games are so damn big now.
Anthony's GPU Test System Specifications
I've recently upgraded my GPU test bed -- at least for now, until AMD's new Ryzen 9 5950X processor is unleashed, at which point the final update for 2020 will happen and we'll be all set for RDNA 2 and future Ampere GPU releases. You can read my article here: TweakTown GPU Test Bed Upgrade for 2021, But Then Zen 3 Was Announced.
- CPU: AMD Ryzen 7 3800X (buy from Amazon)
- Motherboard: ASUS ROG X570 Crosshair VIII HERO (buy from Amazon)
- Cooler: CoolerMaster MasterLiquid ML360R RGB (buy from Amazon)
- RAM: G.SKILL Trident Z NEO RGB 32GB (4x8GB) (F4-3600C18Q-32GTZN) (buy from Amazon)
- SSD: Sabrent 2TB Rocket NVMe PCIe 4.0 M.2 2280 (buy from Amazon)
- PSU: be quiet! Dark Power Pro 11 1200W (buy from Amazon)
- Case: InWin X-Frame 2.0
- OS: Microsoft Windows 10 Professional x64 (buy from Amazon)
Benchmarks - 1080p
1080p - Performance Thoughts
Right out of the gate we have AMD throwing down the performance gauntlet, with its not-even-flagship Radeon RX 6800 XT graphics card destroying the competition in Watch Dogs: Legion at 1080p. It hits 102FPS average, compared to 89FPS from its direct competitor, NVIDIA's GeForce RTX 3080 -- hell, it even beats the $1499 GeForce RTX 3090.
You can get 60FPS average at 1080p on everything between the RTX 2060 SUPER and GTX 1080 Ti, while the older Radeon RX Vega 64 and GeForce GTX 1080 drop below 50FPS. With some detail tweaks you can hit 60FPS pretty easily in Watch Dogs: Legion at 1080p.
Benchmarks - 1440p
1440p - Performance Thoughts
Cranking things up to 1440p, the GeForce RTX 3090 claws its way to the top with 84FPS average, followed by the Radeon RX 6800 XT with 79FPS -- and the RTX 3080 behind that with 76FPS average.
Keeping above 60FPS (at least with Ultra details on) will require at least the GeForce RTX 3070, while with some tweaking of in-game details you'll probably hit 60FPS on the RTX 2060 SUPER, GTX 1080 Ti, Radeon RX 5700 XT, and the RTX 2070 SUPER.
Benchmarks - 4K
4K - Performance Thoughts
NVIDIA comes out on top at 4K in a big way, with the GeForce RTX 3090 bursting out ahead with 57FPS average -- getting close to that 4K @ 60FPS mark you want in a game like this. The GeForce RTX 3080 sits at 51FPS, while the Radeon RX 6800 XT only manages 47FPS here.
Half the stack of cards can't handle 4K at 60FPS here, but you can easily knock some of those details down and get 30-60FPS on some of the lower-end cards on the market in Watch Dogs: Legion.
The original Watch Dogs looked great before it was watered down for release -- and with mods it looked even better. Now Watch Dogs: Legion does a great job of building a great-looking world for an open-world game.
If you want to play the game at 4K, then on the Team Red side of things you're going to need at minimum the Radeon RX 6800 XT. On the NVIDIA side, you're going to want the GeForce RTX 3080 or GeForce RTX 3090 -- but also remember you've got DLSS technology that you can turn on (I've got some numbers on that coming in a follow-up piece, still benching as I type).
Watch Dogs: Legion is much easier to run above 60FPS at 2560 x 1440 than at 4K, while 1080p is easier still -- and the entry point for 30FPS at both 1440p and 1080p is something like the GeForce GTX 1080.
The game looks great, and runs on a PC that could be a few years old without a problem -- you might need to knock down the resolution to 1080p and details to Medium, but Watch Dogs: Legion will run. But I do wonder how it'll run at 8K... maybe I will try that once I'm finished with this growing stack of graphics cards.