The last time we stretched the legs on our SAPPHIRE Radeon R9 290X Tri-X GPUs in CrossFire was when we tested them at 4K. That was on our old LGA 1150 setup, but now we have a flashy new, super high-end LGA 2011 setup, and we're going to run them through their paces again.
The system in question has been provided by many of our closest partners: SanDisk, SAPPHIRE, Corsair, InWin and ASUS. Without them, this couldn't have been possible. The specs:
I have been a busy, busy man over the last couple of months - my wife and I have just had a second healthy baby girl, on top of the whirlwind that is our two-year-old daughter, and I've been busy reorganizing and building a new Tweakipedia test bed, on top of my daily news duties and everything else I do around the site, Facebook page, etc.
Now that I have some spare time, I'd like to introduce you to our new test bed here at Tweakipedia. We had some help from some of our closest friends; without them, this would not have been possible at all. The old test bed can be checked out here, but the new test bed is quite the beast.
And for good measure - we all know technology upgrades at an ever-faster rate - I'll tease you now: this test bed is only good until around June or so, when it will be upgraded yet again with an even more bleeding-edge set of components.
Starting with the heart and soul of the new Tweakipedia test bed, we have an Intel Core i7 4930K CPU. This is something I personally purchased - it's not a cherry-picked CPU at all, as I wanted an off-the-shelf chip rather than a hand-selected sample that overclocks better than what consumers can actually buy.
A couple of days ago we did a preview piece on AMD's Mantle technology, using our trusty Haswell-based system and Star Swarm as our benchmark. Well, our system has been changed around a little - well, a lot - with a fresh installation of Windows and an updated version of Star Swarm.
This is why we used the word "Preview" in our article - right there in the headline. AMD's Mantle is a constantly evolving piece of technology that relies heavily on the software (games) side of things to work well.
An update of Star Swarm was pushed out between us writing the original piece and now - so we're up to date, and back with a better system - an AMD system - to test it with.
Bitcoin mining is big business these days, with people able to 'mine' a digital currency in their own homes. Before Bitcoin had its big take off, mining Bitcoin at home was actually lucrative - if you mined your coins, kept them, and sold them at their peak.
These days, 24 hours of mining will net you 0.00001 coins, which is just cents per day. Keep in mind that it costs you money to run your machine each day, so electricity costs have to be thrown into the mix, too. We're not going to go into that right away, but what we are going to do is skim over it a little, and give you our thoughts on a few weeks of mining digital coins - an experience I'll never forget.
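The economics above are simple to sketch out: daily revenue from the coins you mine, minus the electricity your rig burns through. A minimal sketch follows - the coin price, rig wattage, and electricity rate are illustrative assumptions, not figures from our testing:

```python
# Rough daily mining profit: coin revenue minus electricity cost.
# All inputs here are illustrative, not measured values.
def daily_profit(coins_per_day, coin_price_usd, rig_watts, usd_per_kwh):
    revenue = coins_per_day * coin_price_usd
    # watts -> kilowatts, times 24 hours, times price per kWh
    electricity = rig_watts / 1000 * 24 * usd_per_kwh
    return revenue - electricity

# 0.00001 coins/day (as above), with an assumed $800/coin price,
# a 1050W rig, and $0.25/kWh electricity
print(daily_profit(0.00001, 800, 1050, 0.25))  # well into the negative
```

Plug in almost any realistic home-rig numbers and the electricity term dwarfs the revenue term, which is exactly why home mining stopped being lucrative.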
I'm not usually one to struggle getting something working on my PC, so it annoyed me that I couldn't get Bitcoin mining going right away. I started off trying to use GUIMiner, but in the end it just wouldn't work for me, so I swapped it out for the Java-based BitMinter, which worked first shot.
We've already tested our SAPPHIRE Radeon R9 290X Tri-X GPUs in Crossfire, with some truly insane power consumption numbers thanks to there being virtually no limit on overclocking, voltage, or power consumption. We saw 1050W of total draw for our Hawaii-based dual-GPU setup, but what about NVIDIA's GPUs?
I only have a couple of NVIDIA GeForce GTX 780s here in my office, so I decided to throw them on my test bed and see how their power consumption numbers stack up against our R9 290X GPUs in CrossFire. The results are very surprising, to say the least.
Not too long ago we tested our SAPPHIRE Radeon R9 290X Tri-X GPUs in CrossFire at 4K, with some very surprising results. The Tri-X cards are some of the fastest consumer GPUs on the market, with some crazy headroom for overclocking.
But, how does Team Green fare at 3840x2160? That's what we're here for today. We have two reference NVIDIA GeForce GTX 780 GPUs in SLI that we're going to test on our Seiki Digital 39-inch 4K TV.
We've already tested our SAPPHIRE Radeon R9 290X Tri-X GPUs in both 4K at stock, and overclocked, but the numbers when overclocked were not great, considering the mammoth memory bandwidth numbers we saw.
We were not that impressed with the added performance from the overclock, given the increase in audible noise; when those fans spin up, it can get quite loud.
We overclocked our already-overclocked R9 290X past its Tri-X settings, to an insane 6.6GHz effective memory clock (from 5GHz on the reference R9 290X). This resulted in some absolute benchmark-busting memory bandwidth, increasing from 320GB/sec on the reference R9 290X to a huge 422.4GB/sec on our overclocked Tri-X GPU.
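Those bandwidth figures fall straight out of the usual formula: bus width in bytes multiplied by the effective memory clock. A quick sketch, using the R9 290X's 512-bit memory bus and the clocks quoted above:

```python
# Memory bandwidth (GB/sec) = bus width in bytes * effective clock in GT/s
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    return bus_width_bits / 8 * effective_clock_ghz

# Reference R9 290X: 512-bit bus at 5.0GHz effective
print(round(bandwidth_gb_s(512, 5.0), 1))  # 320.0
# Overclocked Tri-X: same 512-bit bus at 6.6GHz effective
print(round(bandwidth_gb_s(512, 6.6), 1))  # 422.4
```

Note that GDDR5 clocks are usually quoted as "effective" rates (4x the base clock), which is why a 1650MHz memory overclock reads as 6.6GHz here.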
Rewinding the clock a few years, I didn't use Dropbox all that often - I didn't even use the 'cloud' much at all. I relied mainly on my Google account and my QNAP NAS. When I started here at TweakTown, I needed something that would let me access all of my work, on any machine, on any operating system, anywhere.
I jumped into Dropbox when I purchased my Galaxy S III, as Samsung were handing out 50GB of free Dropbox storage - so I thought I'd use it. From there, I've uploaded tens of gigabytes of data to the cloud, and have it synced across multiple machines, operating systems, and even on my NAS.
Your first step, of course, is to ensure you have a Dropbox account - if not, you can sign up here.
I'm going to presume you're using Windows here, so once you're signed into your Dropbox and have some files synced, you'll have a 'Dropbox' folder in your Users directory. This folder is where everything is stored, and you'll be accessing it constantly.
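If you ever need to point a script at that folder, the default location is easy to build from your user profile path. A small sketch - this assumes the default install location, so it won't match if you picked a custom folder during Dropbox setup:

```python
import os

# Default Dropbox sync folder on Windows: C:\Users\<name>\Dropbox
# (expanduser also works on macOS/Linux, where it resolves to ~/Dropbox).
# If you chose a custom location during setup, this path won't match.
dropbox = os.path.join(os.path.expanduser("~"), "Dropbox")
print(dropbox)

# Only exists once the Dropbox client is installed and has synced
print(os.path.isdir(dropbox))
```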
A couple of days ago I wrote an article testing out two SAPPHIRE Radeon R9 290X Tri-X GPUs in CrossFire at 3840x2160 - or 4K, or Ultra HD. We had some surprising results, but I thought we could overclock them and see what we could find.
Firstly - hot DAMN do these Tri-X GPUs overclock! I went from the stock 1040MHz Core, to a huge 1200MHz. As for the RAM: the SAPPHIRE R9 290X Tri-X has some serious overclocking power! We were able to drive it up from 1200MHz all the way through to 1650MHz, completely stable in all of our tests bar one.
This is just completely incredible, as it provides us with an insane 422.4GB/sec of memory bandwidth. This is an all-important step of overclocking, and provides some much needed headroom when we're gaming and testing at an insane 3840x2160.
My family is enjoying some food and each other's company, and I'm enjoying a nice alcoholic beverage while I sit here in my shorts on a nice, sunny day here in Australia. What better time to test out some Radeon R9 290X GPUs?
I've played with Radeon HD 7970s, and while they perform well, the new Radeon R9 290X GPUs, based on the Hawaii architecture, play much better at higher resolutions such as 3840x2160, or 4K, and no longer require a physical CrossFire connection.
We're still testing them on our Haswell setup, but we're soon moving to a Sandy Bridge-E rig, where we'll re-run the benchmarks we've already run, and are running today. This will be coming in the next month or so; we have nearly everything, and we're just waiting on a few more parts.