
WTF is wrong with multi-GPU support in games these days?!

By: Anthony Garreffa | Editorials in Gaming | Posted: Apr 5, 2016 12:16 am

GPUs Are Already Insanely Fast


The GPU technology we have right now, without even considering the next-gen GPUs launching in the coming months, is insane. Even cards like the Radeon R9 290 and GeForce GTX 970 are great candidates for a multi-GPU setup.




Ramp up to something like the GTX 980 Ti in SLI or the R9 Fury in CrossFire, and you have a setup that can easily handle 4K gaming at 60FPS and beyond. But the technology itself isn't the crux of the problem.



Thinking along the lines of 'can the technology get better?' or 'wait until next-gen'? You'd be wrong. The technology isn't the problem. NVIDIA and AMD are mostly not at fault here - while they do have a seat at the table, they aren't the biggest contributing factor to why multi-GPU support in games sucks. We'll get to that in a bit. I'm not done establishing the basis for my 'multi-GPU gaming sucks' rage-a-thon yet.



GPU Technology Is About To Go Next-Gen


So above, I said that one train of thought would be to 'wait until next-gen', but that won't fix anything. If anything, it'll exacerbate the issue. We'll have far faster GPUs thanks to NVIDIA's Pascal architecture and AMD's much-touted Polaris architecture, but the multi-GPU shit we put up with in games will still be there - in fact, it'll be worse.




You would have to plonk down $1000+ for two next-gen, enthusiast video cards from either AMD or NVIDIA. When you get those two cards home (or delivered), you're going to want to put them through their paces - but how do you do that?


3DMark? Heaven? Wouldn't you want to throw your SLI'd or CrossFire'd next-gen GPUs into a game like The Division, Far Cry Primal, Hitman, or Quantum Break (just to name a select few) and enjoy the unrivalled performance the box, or the marketing, promises? Yeah, you're shit out of luck there.


Most of the time, you're better off - and sometimes advised by the game developer, or sites like TweakTown - to take one of your video cards out and run a single GPU for now. Do you know how much it pains me to write a news article telling you to pull a $200, $300, $500 or even more expensive video card out of your machine because a game with a budget of $20 million, or sometimes more, has no multi-GPU support? Ugh.



Even Gaming Monitors Are At Their Peak


Heck, gaming monitors totally exploded in 2015, with 2560x1440 @ 144Hz becoming the new "enthusiast/professional" gaming standard - while ASUS pushed boundaries by releasing its PG279Q monitor with a 2560x1440 native resolution and an insane 165Hz refresh rate, all with NVIDIA's G-Sync technology on top.


For UltraWide fans like myself, the Acer Predator X34 became my gaming monitor of choice - with its beautiful native resolution of 3440x1440. But it was the 100Hz refresh rate that got me, alongside NVIDIA's excellent G-Sync technology, that made it my new monitor for, well, everything - gaming, working, and everything in between.

