Computer & Tech Guides, Tips and How To's from TweakTown's Tweakipedia - Page 3
For the last three days, I have probably run the Grand Theft Auto V benchmark at least 300 times, without a problem. I've been sitting here in my lab, running GTA V at 1080p, 1440p and 4K.
Rockstar has powered GTA V with its custom RAGE engine, which might be impressive on the consoles, but it's something else entirely on the PC. The studio has added PC-exclusive improvements to the engine, including higher resolutions than the consoles are capable of, improved graphical detail, denser traffic and pedestrians, better AI, new and improved weather and damage effects, and much more.
AMD and NVIDIA were quick to release their respective Grand Theft Auto V ready drivers, and during our testing we found that both sides of the VGA fence performed well in single-GPU configurations. But our SLI and Crossfire testing found NVIDIA coming out on top.
Grand Theft Auto V has been out on consoles for close to 18 months now, but PC gamers have finally gotten that beautiful taste of Rockstar Games' open-world title, and now we can benchmark the game and go nuts with our hardware setups to see what you'll need to run it at specific resolutions and frame rates.
What do you need to run the game at 4K 60FPS? Not much, actually. We tested three of our VGA cards: the SAPPHIRE Radeon R9 290X 8GB Vapor-X, the GeForce GTX 980 and GeForce GTX Titan X (both reference) with surprising results. Each card was capable of 4K 60FPS in GTA V.
After a few weeks of playing around with our triple Acer XB280HK 28-inch 4K monitors on our various NVIDIA GPUs, such as the GTX 980s in SLI, the Titan X on its own and again with Titan X in SLI, we're now back with some AMD numbers to show you.
We had some reader feedback that wanted to see how the dual-GPU Radeon R9 295X2 would go in the ring with the GeForce GTX 980s in SLI and the Titan X in SLI at 6480x3840, and we're glad to report that for a card that is close to a year old, it did surprisingly well.
We've been getting some of our rigs up to speed in the last few months, throwing our VGA cards into different battles. We've been playing around with Titan X versus GTX 980, Titan X SLI versus GTX 980 SLI and everything in between.
The articles are only going to expand from here; this time we wanted to see how two AMD Radeon R9 290X cards in Crossfire would perform against the Maxwell-powered cards from NVIDIA. We are using two of SAPPHIRE's Radeon R9 290X 8GB Tri-X cards, which are factory overclocked and feature an impressive aftermarket cooler.
Comparing the stock Radeon R9 290X against the R9 290X 8GB Tri-X cards from SAPPHIRE, we have the stock card with a GPU clock of 1000MHz versus the 1030MHz on the SAPPHIRE card. The company has overclocked the GDDR5 RAM from 1250MHz to 1375MHz. Not only that, but thanks to the better cooling setup, we can squeeze some more performance out of the SAPPHIRE Radeon R9 290X 8GB Tri-X, all while it continues to operate at decent temperatures.
When I first got the review sample of the GeForce GTX Titan X from NVIDIA, I was more excited than I had been in quite some time. I've only been the VGA Editor for TweakTown for a little over three months now, but this was my first true beast of a video card to test. I had some interesting plans for it, and was immediately asking NVIDIA for a second card.
Just as NVIDIA's GPU Technology Conference was coming to an end, I was handed a second Titan X to take home for some truly mind-bending tests. Well, here we are back with our 4K Surround testing, where we're going to be pushing the boundaries of what the GM200 core can do at 6480x3840.
A few days ago we gave you a peek into what we've been working on here in the TweakTown VGA labs: NVIDIA's 4K Surround, or ~6K - a huge, GPU-busting resolution of 6480x3840. It's not easy to push this many pixels, but that's what we do here at TweakTown - keep pushing the boundaries that no one else wants to touch.
Sure, there are only a handful of people in the world that would be gaming on a 4K Surround setup, but that's what we do here - test the bleeding edge of 'real-world' gaming. Anyone with the money to spend on 4K Surround can go out and buy it, but more importantly, what type of GPU setup do you need to secure yourself around a 60FPS average frame rate? This is what we're here for.
We have been waiting quite a while for the stars to align to bring you this article, and many that will follow - NVIDIA's 4K Surround. We have 3 x Acer XB280HK monitors in a triple portrait setup - all running 3840x2160, or 4K. This provides a native resolution of 6480x3840.
Let's clarify that: 6480x3840. At 60FPS, this means we're rendering 1,492,992,000 pixels per second - nearly 1.5 billion pixels, every second. Compare this to 1920x1080 (Full HD, or 1080p), which at 60FPS renders 124,416,000, or 124 million pixels per second - the 4K Surround system is pushing 12x the pixel load of 1080p.
Instead of just writing about how many pixels are being rendered, we've put them into a chart so you can better understand just how many pixels we're driving here today. Right now, the 'next-gen' consoles are rendering games at around 720p - 900p. If they were running at 60Hz (or 60FPS) - which most of the time they aren't, it's usually 30FPS or so - they would be rendering around 55 million pixels per second at 720p.
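The pixel-throughput figures above are simple arithmetic: width times height times frame rate. A quick back-of-the-envelope sketch in Python (the function name is ours, purely for illustration) reproduces the numbers quoted here:

```python
# Sanity-check the pixel-throughput figures quoted above.
# Pixels per second = width * height * frames per second.

def pixels_per_second(width, height, fps=60):
    """Pixels a GPU must render each second at a given resolution and frame rate."""
    return width * height * fps

surround_4k = pixels_per_second(6480, 3840)  # triple 4K portrait Surround
full_hd = pixels_per_second(1920, 1080)      # 1080p
console_720p = pixels_per_second(1280, 720)  # 720p, common on consoles

print(surround_4k)             # 1492992000 - nearly 1.5 billion pixels/sec
print(full_hd)                 # 124416000 - ~124 million pixels/sec
print(surround_4k / full_hd)   # 12.0 - 4K Surround is 12x the 1080p load
print(console_720p)            # 55296000 - ~55 million pixels/sec at 60FPS
```

Note that the console figure assumes 60FPS; at the 30FPS many console titles actually run, the load halves to roughly 27.6 million pixels per second.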
Our review sample of the GeForce GTX Titan X is sitting behind me as I type, and while we can't test it just yet because we don't have our hands on drivers, we're seeing a really nice sneak peek at the performance, and it is mind-blowing to say the least.
Before our review of the Titan X goes live, I thought I would run our GeForce GTX 980s in SLI through our benchmark suite and update our numbers. This will give us a better look at how the new Maxwell-based Titan X fares against the Maxwell-based 980s in SLI.
Seeing as we can expect around 30-50% more performance over a single GTX 980, the Titan X should do quite well against even the GTX 980s in SLI. In VRAM-limited situations (which we're going to get into in another article), the Titan X will trounce the competition thanks to its 12GB framebuffer.
Our review sample of the new NVIDIA GeForce GTX Titan X is here, so we decided to snap some high-res shots of the entire card to give you the best look at it yet.
If you didn't see our post on it, NVIDIA's founder and CEO, Jen-Hsun Huang, unveiled the card during Epic Games' GDC 2015 event. The company hosted its own 'Made to Game' event, but kept the Titan X under wraps for a few more hours. Even with GTC 2015 right around the corner, the company has shipped out Titan X samples to various people in the media, including TweakTown.
We've been playing around with our various GPUs here in the TweakTown VGA labs in the past few weeks, throwing them into battle against each other in various combinations. First, we started playing around by using an AMD-powered system with the AMD FX-9590 processor. We thought we'd play around with some Radeon GPUs, so we tested out the Radeon R9 290X 8GBs in Crossfire against the Radeon R9 295X2 dual-GPU card.
For our second test with the AMD CPU powering the system, we wanted to see how two NVIDIA GeForce GTX 980s in SLI would go against two Radeon R9 290X cards in Crossfire. Both of these tests were performed at 4K, so where can we go from here? Before we begin our testing with the Intel Core i7-5960X when it arrives in the coming weeks, we thought we would see how things go when we really push the in-game details up to their maximum, pitting all three setups against each other. Oh yes.