Graphics Cards
Stay updated with expert analysis on the latest GPU and graphics card news, covering NVIDIA GeForce, AMD Radeon, Intel Arc, performance benchmarks, gaming, AI acceleration, and releases.
AMD FSR SDK 2.1 with FSR Redstone technology available now for developers
AMD's FSR 'Redstone' update is now available for RDNA 4 owners to play with, improving image fidelity and performance across a wide range of games, including Call of Duty: Black Ops 7, which also features the brand-new FSR Ray Regeneration technology that improves ray-tracing lighting effects. Alongside the debut of FSR 'Redstone,' AMD has also released the AMD FSR SDK 2.1, now available on GPUOpen for all game developers.
AMD FSR SDK 2.1 includes the full FSR 'Redstone' suite of neural rendering technologies: AMD FSR Upscaling 4.0.3, AMD FSR Frame Generation 4.0, AMD FSR Ray Regeneration 1.0, and AMD FSR Radiance Caching (Preview).
As introduced with AMD FidelityFX SDK 2.0 (AMD has renamed all FidelityFX tech to the simple AMD FSR), once a game adds new FSR 'Redstone' technologies, future AMD Software: Adrenalin Edition driver releases for Radeon GPUs can automatically update the version of ML or AI-based technologies used.
AMD Radeon Adrenalin 25.12.1 Drivers with FSR Redstone for Radeon RX 9000 Series is here
AMD Software: Adrenalin Edition 25.12.1 is here, and it adds official support for FSR 'Redstone,' which launched earlier today. For a breakdown on everything you need to know about FSR 'Redstone,' check out our main story on the technology, which includes the new AI-powered FSR Upscaling, FSR Frame Generation, FSR Ray Regeneration, and FSR Radiance Caching - which now fall under the AMD FSR banner.
As we've seen with the launch of FSR 4 earlier this year, which has been renamed to FSR Upscaling (ML), the FSR 'Redstone' suite is exclusive to the latest RDNA 4 and Radeon RX 9000 Series graphics cards. With over 200 games set to add FSR 'Redstone' by the end of the year, AMD FSR just got its most significant update since the technology first debuted.
The new AMD Software: Adrenalin Edition 25.12.1 drivers for Radeon GPUs are available for all RDNA gamers, as they also include some fixes for existing issues, alongside adding support for new workstation GPUs, the Radeon AI PRO R9600D and Radeon AI PRO R9700S. Here are the full Release Notes.
NVIDIA celebrates 5th anniversary of Cyberpunk 2077 with RTX 2080 Ti GPU signed by Jensen Huang
NVIDIA is celebrating the 5th anniversary of Cyberpunk 2077 by giving away an ultra-rare GeForce RTX 2080 Ti Cyberpunk 2077 Edition graphics card personally signed by NVIDIA CEO Jensen Huang.
This is all part of NVIDIA's huge GeForce Holiday 2025 giveaway event, which lists not one but two custom GeForce RTX 2080 Ti Cyberpunk 2077 Edition graphics cards, one of them signed by Jensen. Entries are made through NVIDIA's various GeForce social channels, with NVIDIA running anniversary and CES-themed prompts.
CD PROJEKT RED launched the PC version of Cyberpunk 2077 five years ago, if you can believe it, and at launch it was one of the key showcases of NVIDIA RTX technologies, featuring four ray-traced effects - shadows, reflections, ambient occlusion, and diffuse illumination - as well as DLSS 2.0 upscaling.
AMD's FSR 3 and FSR 4 are now both called FSR Upscaling, but there's a catch
AMD lifted the lid and officially launched its new suite of AMD FSR 'Redstone' technologies today, and with that, it has decided to simplify the naming. FSR originally stood for FidelityFX Super Resolution, an upscaling technology created in response to NVIDIA's AI-powered DLSS Super Resolution. With FSR 'Redstone' introducing new AI-powered technologies and features, AMD FSR now refers to all FSR technologies - past, present, and future.
And when it comes to Super Resolution, FSR 3 and FSR 4 now fall under the same branding - FSR Upscaling. Simple. Well, not exactly, as the new AI-powered FSR 'Redstone' technologies are exclusive to the latest RDNA 4 generation of GPUs - the Radeon RX 9060 XT, RX 9070, and flagship RX 9070 XT. This means there are two versions of FSR Upscaling: the newer RDNA 4-exclusive FSR Upscaling (ML) and the older FSR Upscaling (Analytical).
It's a change that, on paper, makes sense because AMD's Adrenalin drivers and software, or a game with native FSR support, will automatically choose the version depending on the Radeon GPU you have. In a way, it's reminiscent of how NVIDIA has handled its new upgraded Transformer model for DLSS Super Resolution, except the Transformer and older CNN model work across all GeForce RTX graphics cards.
AMD FSR Redstone is here, AI-powered Upscaling and Frame Generation for RDNA 4
AMD's FSR has entered a new era with the arrival of FSR 'Redstone' for the RDNA 4 generation of graphics cards, led by the flagship Radeon RX 9070 XT. By leveraging machine learning (ML) and AI, FSR 'Redstone' is a suite of four new technologies designed to enhance the performance and fidelity of PC gaming, beginning with the arrival of FSR 4's impressive AI-powered Super Resolution (FSR Upscaling) earlier this year.
And with that, FSR 4 is now called AMD FSR Upscaling (ML), powered by a new ML-based algorithm that delivers a dramatic improvement in image quality when upscaling from lower resolutions to 1080p, 1440p, or 4K. We've already covered this extensively in our reviews of various RDNA 4 GPUs like the Radeon RX 9060 XT, RX 9070, and RX 9070 XT, and it's a game-changer when compared to previous FSR versions like FSR 2 and FSR 3.
As part of AMD's big FSR 'Redstone' reveal, the company has confirmed that FSR Upscaling (ML) is exclusive to RDNA 4, and that previous Radeon generations, including RDNA 2 and RDNA 3, will use the older non-AI version, now called FSR Upscaling (Analytical). This brings us to the next FSR 'Redstone' technology making its debut with today's announcement - AMD FSR Frame Generation (ML).
Intel Arc B770, aka Big Battlemage, could be a 300W GPU
The Intel Arc B770, also referred to as Big Battlemage, has been quietly rumored to be on the way for almost a year now. Powered by the BMG-G31 chip with 32 Xe2 cores and 16GB of GDDR6 memory, it's expected that the Arc B770 would be a mid-range alternative to something like the GeForce RTX 4070 or RTX 5070 in much the same way the Arc B580 is a mainstream alternative to the GeForce RTX 4060.
In addition to Intel confirming the BMG-G31 chip in a recent software update, rumors about the Intel Arc B770 are ramping up, with new custom listings suggesting the mid-range Battlemage GPU could feature a TDP as high as 300W. This would be surprising, as the Intel Arc A770 featured a 225W TDP, and the Intel Arc B770's GPU would use a more efficient 5nm process.
Now, the custom listings refer to a shipment of GPU brackets, with similar codes and naming that were spotted before the launch of the Intel Arc B580. So even though "N38341-001 TASDK 300W GPU BRKT 0.8" doesn't specifically mention the Arc B770 or BMG-G31 GPU, it does match the previous naming utilized for desktop Arc products.
Intel confirms that the more powerful 'Big Battlemage' GPU exists, Intel Arc B770 coming soon?
The Intel Arc B770, powered by the BMG-G31 GPU, is commonly referred to as "Big Battlemage" and seen as the flagship mid-range desktop companion to the mainstream-focused Intel Arc B580 and Arc B570. After rumors of a launch earlier this year, we've now got an indication from the company that BMG-G31 is finally ready.
As part of the latest update to the company's Intel VTune Profiler performance analysis tool for both Windows and Linux, Intel confirms support for new hardware - specifically "Intel Arc Battlemage (BMG-G31) and Intel Core Ultra 3 Processors (Panther Lake)." BMG-G31 is the larger desktop GPU variant of the BMG-G21, which powers the Intel Arc B580 and Arc B570. But even so, Intel has yet to announce or confirm the Intel Arc B770 formally.
As the flagship desktop variant of the 'Battlemage' lineup and architecture, BMG-G31 features up to 32 Xe2 cores and 16GB of GDDR6 memory on a 256-bit bus, with up to four SKUs reportedly in the works. Although it's great to see the BMG-G31 GPU pop up like this, the current DRAM shortage would make it a strange time to release a new 16GB GPU for PC gaming and AI.
NVIDIA restores 32-bit PhysX support for GeForce RTX 50 Series graphics cards
When NVIDIA launched the RTX Blackwell-powered GeForce RTX 50 Series earlier this year, it was discovered that the company quietly dropped 32-bit support for CUDA. This meant that older titles that used the company's PhysX acceleration for in-game physics offloaded the physics calculations to the CPU, significantly impacting performance.
At the time, some users noted that playing a game like Borderlands 2 with PhysX enabled saw performance drop to below 60 FPS in 4K on the GeForce RTX 5090, when the RTX 4090 could easily push 120 FPS. Well, there's some good news on this front: NVIDIA has reinstated 32-bit GPU-accelerated PhysX support for select games on GeForce RTX 50 Series GPUs with the latest GeForce Game Ready 591.44 WHQL driver release.
"We heard the feedback from the community, and with the launch of our new driver today, we are adding custom support for GeForce gamers' most played PhysX-accelerated games, enabling full performance on GeForce RTX 50 Series GPUs, in line with our existing PhysX support on prior-generation GPUs," NVIDIA writes, confirming that it's reinstating support one game at a time, with a focus on the most enduring titles from the 32-bit PhysX era.
GeForce Game Ready Driver 591.44 WHQL released, fixes the recent Windows 11 performance bug
GeForce Game Ready Driver 591.44 WHQL is here, adding new game support for the Battlefield 6: Winter Offensive update and Call of Duty: Black Ops 7, which both feature DLSS 4 with Multi Frame Generation. Although Call of Duty: Black Ops 7 support was included in the previous driver release, this update improves the fidelity of DLSS Ray Reconstruction when ray-traced lighting is enabled.
But even if you're not playing either of these games, it's a critical driver release for Windows 11 users as it fixes a Microsoft bug that tanks performance in some games by up to 50%. This issue was recently resolved in a hot fix driver release and has now been included in the official WHQL branch available for download from NVIDIA's main GeForce driver hub or via the NVIDIA App.
Per the full release notes, this new driver fixes the issue where "Users running R580 branch drivers (58x.xx) or newer may observe lower performance in some games after updating to Windows 11 October 2025 KB5066835." The new driver fixes several issues affecting game performance and reinstates 32-bit GPU-accelerated PhysX support for the new GeForce RTX 50 Series.
Radeon RX 9000 Series price increases confirmed, second price increase coming January 2026
According to a new report from Tom's Hardware, citing an industry source, AMD has increased the prices of its Radeon graphics cards for the US market. AMD's AIB partners are now spending $10 more per 8GB of VRAM for GPU and memory kits, with these costs expected to be passed on to consumers.
For AMD's current RDNA 4-powered Radeon RX 9000 Series, this could see the MSRP for most models increase by $20. Under the new pricing, the flagship Radeon RX 9070 XT 16GB goes from $599 to $619, the Radeon RX 9070 16GB from $549 to $569, the Radeon RX 9060 XT 16GB from $349 to $369, and the Radeon RX 9060 XT 8GB from $299 to $309.
AMD's Radeon graphics cards all use GDDR6 memory, and with the current crisis, the cost of obtaining modules for its consumer-focused GPUs has increased dramatically. Although the report doesn't specify a reason for the price increase, at this point it's almost self-evident.
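The reported per-gigabyte math is easy to sanity-check. A minimal sketch, using the $10-per-8GB figure and the MSRPs from the report (the dictionary layout here is just for illustration):

```python
# Sketch of the reported Radeon RX 9000 Series price bump: AIB partners
# reportedly pay $10 more per 8GB of VRAM, so a card's MSRP increase
# scales with its memory capacity.

INCREASE_PER_8GB = 10  # USD, per the report

cards = {
    "Radeon RX 9070 XT 16GB": (599, 16),
    "Radeon RX 9070 16GB":    (549, 16),
    "Radeon RX 9060 XT 16GB": (349, 16),
    "Radeon RX 9060 XT 8GB":  (299, 8),
}

for name, (msrp, vram_gb) in cards.items():
    bump = (vram_gb // 8) * INCREASE_PER_8GB
    print(f"{name}: ${msrp} -> ${msrp + bump}")
```

Run it and the 16GB cards come out $20 higher while the 8GB card rises $10, matching the new MSRPs listed above.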
Redditor orders RTX 5080 for $1200, gets a box of rocks instead: Best Buy won't refund it
A Reddit user says he purchased a new ASUS TUF GeForce RTX 5080 graphics card through Best Buy on November 25, but received a box of rocks instead... and even worse, Best Buy has fumbled its response and "investigation," denying a refund or replacement of the $1,200 order.
In a new post on Reddit, u/GnarDead explains that he ordered his RTX 5080 on 11/25, and when it arrived a few days later, he was "blown away by how irresponsibly this thing was shipped." The shipping labels were "just slapped" onto the retail packaging with no generic brown box to conceal the item, and the seal itself was "clearly tampered with." Inside were four rocks where his new RTX 5080 GPU should have been.
The Redditor filed a claim through Best Buy customer service within an hour of receiving his $1,200 box of rocks, and was told by Best Buy staff that he would be getting a replacement. But fast forward to December 2 - a week later - and he received an email from Best Buy saying it wouldn't be replacing or refunding the order after its "investigation."
$4,000 ASUS ROG Matrix RTX 5090 'supreme' GPU reportedly delayed due to a quality issue
The ROG Matrix RTX 5090 from ASUS is a limited-edition graphics card that "stakes its claim for GPU supremacy" according to the board maker, but those who've paid a ton of money for the card are reportedly facing a delay in it being shipped.
It's undoubtedly a powerful graphics card, of course, but some buyers who have ordered the ROG Matrix RTX 5090, which has officially arrived - but is currently sold out - are wondering where their prized GPU is.
As VideoCardz reports, on the Republic of Gamers forum, one person who ordered the flagship GPU (pre-orders went live on November 19) noted they haven't got it yet, and wanted to hear from others who had actually received their card.
Is there hope for Intel's desktop GPUs yet? Arc graphics cards hit market share milestone
Intel's Arc desktop graphics cards have secured 1% of the discrete GPU market according to the latest statistics from an analytics firm.
As Tech PowerUp reports (via VideoCardz), the figure comes from Jon Peddie Research (JPR), which keeps close tabs on GPU shipments for the global discrete market (and I'll come back to that in a moment) - and while 1% might not sound like much of an achievement, it's a genuine move of the needle away from zero.
Previously, Intel held a sub-1% market share according to JPR's statistics, hovering around the half-a-percent mark, but a 0.4-point gain in Q3 2025 brings Arc discrete graphics cards to a full 1%.
AMD will reportedly increase the price of 8GB Radeon GPUs by $20 and 16GB GPUs by $40
With memory prices surging due to shortages and unprecedented demand from the AI and data center markets, it has been widely rumored that desktop graphics card prices for PC gaming are set to increase. And now, with the latest report from the Board Channels forum in China, we've got word that AMD is set to increase the pricing for all of its Radeon GPUs based on VRAM capacity.
Board Channels regularly breaks news relating to pricing and stock levels from AMD and NVIDIA's partners, so chances are this is accurate information. According to the post, the "first wave" of Radeon GPU price increases will see 8GB GPUs cost $20 more, and 16GB GPUs cost $40 more. This refers to the GPU bundles (chips and memory) that AMD sells to its partners for packaging and pairing with boards and coolers.
The flow-on effect will see retail prices for AMD's Radeon GPUs increase by around 300 RMB (around $40 USD) and 600 RMB (around $85 USD) by the end of the year. Although no specific Radeon GPU models are mentioned in the post, it is expected to apply to AMD's full RDNA 4 lineup, which includes the entry-level Radeon RX 9060 XT 8GB and the flagship Radeon RX 9070 XT 16GB GPUs.
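As a rough check on those currency conversions - the exchange rate below is an assumption (roughly 7.1 RMB to the dollar at the time of writing), not a figure from the report:

```python
# Rough conversion of the reported retail price increases from RMB to USD.
RMB_PER_USD = 7.1  # assumed exchange rate, not from the report

for rmb in (300, 600):
    usd = rmb / RMB_PER_USD
    print(f"{rmb} RMB ~= ${usd:.0f} USD")
```

That puts 300 RMB in the low-$40s and 600 RMB in the mid-$80s, in line with the figures above.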
NVIDIA is giving away a custom ARC Raiders themed GeForce RTX 5090
The GeForce RTX 5090 is not only the most powerful gaming GPU on the market, but also the most premium, with 32GB of fast GDDR7 memory. Looking at current retail prices for NVIDIA's GeForce RTX 50 Series, the GeForce RTX 5090 is essentially triple the price of the RTX 5080, with prices at around $3,000. An eye-watering amount, and we're only pointing this out because it makes any GeForce RTX 5090 giveaway worth checking out.
As part of its holiday-themed Season of RTX event, NVIDIA has partnered with Embark Studios to give away a custom ARC Raiders-themed GeForce RTX 5090 Founders Edition graphics card. One of the most popular games of 2025, ARC Raiders has struck a chord with gamers across platforms thanks to its blend of exploration, shooting, and extraction mechanics in an immersive sci-fi setting.
For your chance to win the ARC Raiders GeForce RTX 5090 giveaway, simply head to one of NVIDIA's GeForce Facebook, Instagram, or X social media pages, find the relevant post, and follow the steps. This giveaway arrives alongside the company's latest DLSS roundup, which sees the AI suite of technologies arrive in two more games.
GeForce RTX 5070 hits new milestone, it's now one of the Top 10 gaming GPUs on Steam
Valve's Steam Hardware & Software Survey results for November 2025 are in, and when it comes to discrete gaming GPUs, NVIDIA's GeForce lineup continues to dominate the field. However, when it comes to the company's new GeForce RTX 50 Series, the GeForce RTX 5070 is once again proving to be the most popular current-gen graphics card as it surpasses another milestone.
The GeForce RTX 5070 is now the tenth-most-popular discrete gaming GPU according to the latest Steam Hardware & Software Survey results, cracking the Top 10, so to speak. Interestingly enough, it knocked out the previous generation's GeForce RTX 4070 to claim the tenth spot, recording one of the most significant market share jumps for the month.
And if it continues to grow, we could see the RTX 5070 surpass the RTX 3070 - one of NVIDIA's most popular and enduring 70-class GeForce graphics cards - in next month's results.
Linus Torvalds's 'perfect Linux PC' has Intel B580, not AMD GPU - but neither were first choice
Linus Torvalds threw a bit of a curveball when building his 'perfect Linux PC' by choosing an Intel Arc discrete GPU.
As VideoCardz reports, this was a feature run by Linus Tech Tips (LTT) - yes, this interview is Linus squared, essentially - with the YouTuber revealing an Intel Arc B580 when it came time to pick the graphics card for the Linux computer.
As Torvalds observes, he is 'famously' not a big fan of NVIDIA, so there was never going to be a GeForce GPU inside the PC.
Der8auer pulls over 750W on 12V-2x6 cable on ASUS ROG MATRIX RTX 5090: no BIOS or shunt mods
Overclocker "Der8auer" has shown how to run more than 600W of power through the 16-pin 12V-2x6 power connector on the ASUS ROG MATRIX GeForce RTX 5090 graphics card, without the need for BIOS flashes or shunt mods. Check it out:
Der8auer used a modified version of the BTF power adapter included with the ROG MATRIX RTX 5090, focusing his efforts on the MATRIX power configuration; the GPU ships with an 800W BIOS that requires both the 12V-2x6 power connector and the ASUS BTF edge power connector.
The overclocker tried both approaches, first using the 12V-2x6 power connector, which limits the card to its 600W board power - the same limitation applies with the GC-HWPR adapter. Der8auer then used a riser and a cut-down test bench, probed the BTF adapter's contact pins to identify the shorter "presence" pin and a ground pin, and bridged those on the adapter.
ASUS's monster 2002W XOC BIOS leaks, but this bad boy isn't for everyone
ASUS has a monster 2002W custom XOC BIOS that was made for its ROG Astral GeForce RTX 5090D graphics card, which has now leaked online and you can use it... but we wouldn't recommend it.
In a new post on the Overclock3D.net forums, user "Carillo" has posted the ASUS 2002W XOC BIOS file, and some users are already playing around with it on their expensive GeForce RTX 5090 graphics cards. The BIOS was originally made for ASUS's ROG Astral RTX 5090D card for the Chinese market (which has already been discontinued), and it's intended for extreme overclockers rather than the mass market, as there are incredible risks in using it.
An RTX 5090 already draws up to 600W of power as it is, and even with this custom XOC BIOS, you won't automatically be chowing down on 2000W. Previously, we've seen YouTuber "JayzTwoCents" flash this 2000W XOC BIOS onto his GIGABYTE AORUS Master RTX 5090, with the GPU drawing close to 900W of power and delivering only around 10% more performance.
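Those numbers make the efficiency trade-off easy to quantify. A quick sketch, using the 600W stock limit and the ~900W/~+10% figures above (the baseline performance of 100 is just a normalization):

```python
# Perf-per-watt comparison: stock 600W power limit vs. the 2000W XOC BIOS
# drawing ~900W for roughly 10% more performance.
stock_watts, stock_perf = 600, 100.0   # normalized baseline
xoc_watts, xoc_perf = 900, 110.0       # ~900W draw, ~+10% performance

stock_eff = stock_perf / stock_watts   # perf per watt at stock
xoc_eff = xoc_perf / xoc_watts         # perf per watt on the XOC BIOS

print(f"Efficiency drop: {(1 - xoc_eff / stock_eff) * 100:.0f}%")
```

In other words, you burn 50% more power for a 10% gain, losing roughly a quarter of your performance-per-watt - which is why this BIOS only makes sense for extreme overclocking, not daily use.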
NVIDIA stops bundling VRAM chips with GPU dies: tells AIBs to source their own GDDR chips
NVIDIA is reportedly telling its AIB partners to source their own GDDR memory chips, as the company will only supply GPU dies without VRAM moving forward - a change with some dire consequences for board makers.
NVIDIA sources its VRAM memory modules from SK hynix, Samsung, and Micron, but those DRAM manufacturers are also scrambling to fulfill memory demand from the unstoppable AI boom. That demand has sent RAM prices skyrocketing, and has now spiraled to the point where NVIDIA can't secure enough GDDR (GDDR7 on the RTX 50 Series) memory chips to bundle with its GPU dies - so it's forcing its AIB partners to source the chips themselves.
In a new post, leaker "Golden Pig Upgrade" reports that NVIDIA has stopped bundling GDDR memory chips with GPU dies. Per the (machine-translated) post, AICs previously received the GPU core and memory together as a bundle, but now only the core is supplied and each AIC must source its own video memory - and smaller AICs with no existing relationships with memory suppliers are effectively being squeezed out of the graphics card business altogether.