Our last Tweakipedia entry was one of our most popular yet, where we pitted the budget AMD FX-8350 processor against the likes of Intel's Core i7-4770K and Core i7-4930K on both the mainstream LGA 1150 platform and the high-end enthusiast LGA 2011 platform.
Well, after this article went out, I had a few people ask me to update my AMD hardware - but I didn't have anything lying around at the time (apart from the hardware used in the test). I reached out to my friends at GIGABYTE and AMD who were more than happy to provide some swanky new hardware to put to work.
It wasn't long ago that we tested our two Intel systems at 4K, comparing the LGA 2011 and LGA 1150 sockets against each other. Now we're going to do it all over again, but this time we're comparing AMD against both of Intel's sockets at 4K.
In the first article, we found that the CPU didn't matter too much, even when it came to gaming at 4K - but does that still hold when we take things down a notch and test with an AMD setup? Let's take a look, shall we? First, we'll provide you with the full specifications of our test beds, so you know exactly what we're comparing.
Welcome to Part 2 of our new BitFenix Prodigy M build guide. If you're just coming into this now, you can go and check out Part 1, which was an introduction to what we're planning to do with this slick new PC.
Now, onto the good stuff - we have a slew of videos showing off the inside of the Prodigy M, giving you a few tricks on how to build your PC inside of this case.
In the above video, we take a look in and around the BitFenix Prodigy M. We took out some of the fans and other parts of the case, which we explain in the clip.
This has been a long time coming, but the final parts have arrived for our new BitFenix Prodigy M build. We're going to do this build as a multi-part series, which will better show off the system itself, as well as our building and testing of it. What better way to show off our new system than taking a video of it through Google Glass? Here is our new system #throughglass.
The new system will be used in a variety of ways, with different benchmarks being run on it. We'll be doing some testing at 4K, and triple 1440p, too. We're hoping to get some more 4K displays, so that we can do some triple 4K, or 12K testing - that's when the fun will really begin.
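To put those resolutions in perspective, here is a quick sketch of the pixel-count math behind them. The dimensions are the standard ones for each resolution; the "12K" label here simply means three 4K panels side by side, as described above.

```python
# Pixel counts a GPU setup has to render per frame at each resolution.
# "Triple" means three panels driven side by side (e.g. via Eyefinity/Surround).

def pixels(width, height, panels=1):
    """Total pixels per frame for a given panel resolution and count."""
    return width * height * panels

single_4k = pixels(3840, 2160)        # 8,294,400 pixels
triple_1440p = pixels(2560, 1440, 3)  # 11,059,200 pixels
triple_4k = pixels(3840, 2160, 3)     # 24,883,200 pixels

print(f"4K:           {single_4k:,} pixels")
print(f"Triple 1440p: {triple_1440p:,} pixels")
print(f"Triple 4K:    {triple_4k:,} pixels")
```

Triple 4K works out to three times the pixels of a single 4K panel, and more than double triple 1440p, which is why that is when the fun will really begin.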
We've tested our glorious setup on both LGA 1150 and LGA 2011, throwing our SAPPHIRE Radeon R9 290X Tri-X GPUs in CrossFire through the high-resolution hoop at 4K. The same test was run on our LGA 1150 setup, something you can check out here.
Before we get into it, let's give you a rundown of the systems we're using here today.
LGA 1150 specs:
The last time we stretched the legs of our SAPPHIRE Radeon R9 290X Tri-X GPUs in CrossFire was when we tested them at 4K. That was on our old LGA 1150 setup, but now we have a flashy new, super high-end LGA 2011 setup, and we're going to put them through their paces again.
The system in question has been provided by many of our closest partners: SanDisk, SAPPHIRE, Corsair, InWin and ASUS. Without them, this couldn't have been possible. The specs:
I have been a busy, busy man over the last couple of months. My wife and I have just had a second, healthy baby girl, on top of the whirlwind that is our two-year-old daughter, and I've been busy reorganizing and building a new Tweakipedia test bed, on top of my daily news duties and everything else I do around the site, Facebook page, etc.
Now that I have some spare time, I'd like to introduce you to our new test bed here at Tweakipedia. We had some help from some of our closest friends, without whom this would not have been possible. The old test bed can be checked out here, but the new test bed is quite the beast.
And for good measure, since we all know technology upgrades at an ever-faster rate, I will tease you that this test bed is only good until around June or so, when it will be upgraded yet again with an even more bleeding-edge set of components.
Starting with the heart and soul of the new Tweakipedia test bed, we have an Intel Core i7-4930K CPU. This is something I personally purchased - it's not a cherry-picked sample at all, as I wanted an off-the-shelf CPU rather than a hand-binned chip that scales better than what consumers can actually buy.
A couple of days ago we did a preview piece on AMD's Mantle technology, using our trusty Haswell-based system and Star Swarm as our benchmark. Well, our system has been changed around a little - well, a lot - with a fresh installation of Windows and an updated version of Star Swarm.
This is why we used the word "Preview" in our article - right there in the headline. AMD's Mantle is a constantly evolving piece of technology that relies heavily on the software (games) side of things to work well.
An update to Star Swarm was pushed out between the original piece and this one, so we're now up to date, and back with a better system - an AMD system - to test it with.
Bitcoin mining is big business these days, with people able to 'mine' a digital currency in their own homes. Before Bitcoin's price took off, mining it at home was actually lucrative - if you mined your coins, kept them, and sold them at their peak.
These days, 24 hours of mining will net you around 0.00001 coins, which is just cents per day. Keep in mind that it also costs money to run your machine each day, so electricity has to be thrown into the mix, too. We're not going to go into that in depth right away, but we will skim over it a little, and give you our thoughts on a few weeks of mining digital coins - an experience I'll never forget.
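The revenue-versus-electricity trade-off above is easy to sketch out. Note that the coin price, rig wattage, and electricity rate below are illustrative assumptions, not figures from this article - plug in your own numbers.

```python
# Back-of-the-envelope mining profitability: daily revenue minus daily
# electricity cost. All inputs except coins_per_day are assumptions.

def daily_profit(coins_per_day, coin_price_usd, watts, usd_per_kwh):
    """Net USD per day from mining: coin revenue minus electricity cost."""
    revenue = coins_per_day * coin_price_usd
    electricity = (watts / 1000) * 24 * usd_per_kwh  # kWh per day * rate
    return revenue - electricity

# 0.00001 BTC/day (the figure quoted above), at an assumed $600/BTC,
# on an assumed 500W rig paying an assumed $0.12/kWh:
profit = daily_profit(0.00001, 600.0, 500, 0.12)
print(f"Net per day: ${profit:+.2f}")
```

With numbers in that ballpark, the revenue is a fraction of a cent while the power bill runs over a dollar a day, which is exactly why home mining no longer pays off.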
I'm not usually one to struggle getting something working on my PC, so it annoyed me that I couldn't get Bitcoin mining going right away. I started off trying to use GUIMiner, but in the end it just wouldn't work for me, so I swapped it out for the Java-based BitMinter, which worked on the first shot.
We've already tested our SAPPHIRE Radeon R9 290X Tri-X GPUs in CrossFire, with some truly insane power consumption numbers thanks to there being virtually no limit on overclocking, voltage, or power draw. We saw 1050W of total draw for our Hawaii-based dual-GPU setup, but what about NVIDIA's GPUs?
I only have a couple of NVIDIA GeForce GTX 780s here in my office, so I decided to throw them on my test bed and see how their power consumption numbers stack up against our R9 290X GPUs in CrossFire. The results are very surprising, to say the least.