Intel multi-GPU tech: if done right, could hurt AMD and NVIDIA

Intel Xe multi-GPU could use your Intel CPU for boosted performance.

Gaming Editor

An interesting rumor is now circulating that excites the multi-GPU enthusiast in me, with Phoronix reporting that changes Intel has made in the Linux 5.5 kernel tease multi-GPU improvements.


The new section of code handles the discrete GPU and the integrated GPU on your Intel processor together, combining their power so the iGPU on your CPU -- which usually sits idle while the discrete graphics card is at work -- contributes additional performance.

It might not seem like much, but with the improvements Intel continuously makes to its integrated graphics -- including the upcoming Xe GPU architecture (Gen12) -- this could be a big win when Intel launches its discrete Xe-based graphics cards. Intel Xe is rumored for an unveiling in mid-2020, just in time for Computex 2020.

Intel could see future Xe graphics card owners gaining another 10-20% (or more, or less) performance in GPU-accelerated tasks -- possibly not just games, though games would be the big target here -- by using whatever is available from the integrated GPU on your Intel CPU.
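To put that rumored uplift in perspective, here's a quick back-of-envelope sketch. The 60 FPS baseline and the uplift fractions are purely hypothetical numbers for illustration, not benchmarks:

```python
def combined_fps(dgpu_fps: float, igpu_uplift: float) -> float:
    """Estimate frame rate if the otherwise-idle iGPU adds an extra fraction of work."""
    return dgpu_fps * (1 + igpu_uplift)

# Hypothetical dGPU-only baseline of 60 FPS with the rumored 10-20% assist
base = 60.0
low = combined_fps(base, 0.10)
high = combined_fps(base, 0.20)
print(f"{low:.0f}-{high:.0f} FPS")  # prints 66-72 FPS
```

That few-frames-per-second difference is exactly the kind of margin that separates competing mid-range cards in reviews.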

This could give Intel an edge over its competitors, as an AMD CPU + GPU combination doesn't work together to give you more performance. NVIDIA, on the other hand, doesn't even make x86 desktop CPUs like its GPU competitors Intel and AMD, so it's impossible for Team Green.

The worlds of multi-GPU gaming through SLI (and now NVLink), as well as CrossFire, have been dimming for years. I'm still a multi-GPU enthusiast, but there are far too many issues with multi-GPU gaming for me to bother (for the most part).

I can see a world where a mid-range Intel Xe graphics card (let's say RX 5700/RTX 2060 performance) gains another 5/10/15/20% or more performance from the integrated GPU on an Intel CPU. That could help it beat the RX 5700 XT and even the RTX 2060 SUPER/RTX 2070.

I'd want to see multiple Intel Xe graphics cards used with 100% scaling and no micro-stuttering... and if we see a move towards PCIe 5.0 then using 2, or even 3-4 of Intel's upcoming Xe graphics cards in a single rig could be awesome.

This is on Linux right now and not Windows, but if Intel flexes its muscle and moves towards multi-GPU -- and gets it working really well -- it could be a big win for Intel when Xe launches in 2020.

Here's what we know about the SKUs so far:

DG1 HW

  • iDG1LPDEV = "Intel(R) UHD Graphics, Gen12 LP DG1" "gfx-driver-ci-master-2624"

DG2 HW

  • iDG2HP512 = "Intel(R) UHD Graphics, Gen12 HP DG2" "gfx-driver-ci-master-2624"
  • iDG2HP256 = "Intel(R) UHD Graphics, Gen12 HP DG2" "gfx-driver-ci-master-2624"
  • iDG2HP128 = "Intel(R) UHD Graphics, Gen12 HP DG2" "gfx-driver-ci-master-2624"
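Those driver ID strings follow a readable pattern: family (DG1/DG2), a segment tag (LP/HP), and a trailing token -- speculated, not confirmed by Intel, to relate to configuration size on the DG2 parts. A quick sketch of pulling those fields apart:

```python
import re

# SKU codenames as they appear in the Linux graphics driver strings above
skus = ["iDG1LPDEV", "iDG2HP512", "iDG2HP256", "iDG2HP128"]

# family (DG1/DG2), segment (LP = low power, HP = high performance), trailing token
pattern = re.compile(r"i(DG\d)(LP|HP)(\w+)")

for sku in skus:
    family, segment, token = pattern.match(sku).groups()
    print(family, segment, token)
```

Note that three DG2 HP variants (512/256/128) appear against a single DG1 LP entry, which suggests a tiered product stack rather than a single card.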
NEWS SOURCE: phoronix.com


Anthony joined the TweakTown team in 2010 and has since reviewed 100s of graphics cards. Anthony is a long time PC enthusiast with a passion of hate for games built around consoles. FPS gaming since the pre-Quake days, where you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering and has recently taken a keen interest in artificial intelligence (AI) hardware.
