Update: I've added 2 more systems and 5 more graphics cards, for a total of 7 systems and 16 graphics cards. That works out to an estimated $1339 per month at current rates, and I'm now securing more hardware and moving into actual mining rigs so I can space out the graphics cards so they don't die in a few months :)
Disclaimer: This is something you should think hard about before you jump in, as the entry hardware cost is high and Ethereum pricing is volatile. It's not a guaranteed thing at all, and this is not a sponsored article. I used all of the hardware that I have lying around, and I'm doing this 'because I can'. I'm using around $10,000 worth of hardware (4 x separate PCs), but this could be done much cheaper, and is something I'll be going into in a future article.
This is just a tease, but I've been testing Ethereum mining out on my PCs and was instantly addicted. Any of you who know me personally, or can see what kind of thing I'm into both personally and professionally, will know that I'm addicted to hardware - especially graphics cards.
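To put the update's numbers in perspective, here's a minimal back-of-the-envelope sketch of how a monthly mining estimate like that comes together. The hashrate and payout-per-MH figures below are illustrative assumptions, not the article's actual rig data, and real returns move with Ethereum's price and network difficulty.

```python
# Hypothetical Ethereum mining revenue estimate - all inputs are assumptions.

def monthly_revenue(num_gpus, mh_per_gpu, usd_per_mh_per_day):
    """Gross monthly revenue for a small farm, ignoring power costs."""
    daily = num_gpus * mh_per_gpu * usd_per_mh_per_day
    return daily * 30  # rough 30-day month

# Example: 16 cards at an assumed ~30 MH/s each and ~$0.09 per MH per day
estimate = monthly_revenue(16, 30.0, 0.09)
print(f"~${estimate:,.0f}/month gross")
```

Note this is gross revenue only; electricity for 16 cards running 24/7 takes a real bite out of that number.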
It was only a few days ago that our MSI GeForce GTX 1080 Ti Gaming X 11G turned up, and we were pretty blown away with the results - especially at 4K. Well, today is a new day, and it brings us another custom GeForce GTX 1080 Ti graphics card, this time from AORUS.
The new AORUS GeForce GTX 1080 Ti 11G is now in the GPU testing labs here at TweakTown, and while we've only just thrown it into benchmarking mayhem, I thought I'd share some photos of the card.
As you can see, AORUS puts some good work into the packaging for the card, with the box highlighting details on the GTX 1080 Ti 11G.
But it's the actual card we're all here for, right? The triple-fan WindForce cooling system looks GREAT, but it is absolutely huge.
Over the last few days I've been testing MSI's new GeForce GTX 1080 Ti Gaming X 11G graphics card, the first custom GTX 1080 Ti that I've received, and man is it a monster.
Before our full review of the card (which is very close now), I thought I'd write up a quick overview of the card - so we can see the 4K performance for now, and the rest of the review in much more detail soon. The card looks awesome, with a revised Twin Frozr VI cooler and a 2.5-slot design that houses a thicker heat sink and cooling system.
Performance wise, MSI has a monster in its hands with the GTX 1080 Ti Gaming X 11G. Just check out these results:
We're doing some new benchmarks here in the GPU labs, and we finally have some results to share - after weeks of setting up new procedures and banking some data.
Battlefield 1 is one of the best-looking games on the market, so we've tested two graphics cards for a new GPU showdown. I've tested the HBM1-based AMD Radeon R9 Fury X, against NVIDIA's newer GeForce GTX 1070. This gives us a good look at the best from AMD in their previous generation, while we have the third-best current-gen Pascal-based card from NVIDIA.
Now that Battlefield 1 is here, we've been testing the game on various hardware and will be writing various articles on the performance of the game between graphics cards, resolutions, and in-game graphics settings. One of those is DirectX 12, where AMD has claimed it has better performance than NVIDIA - and it really does.
We did all of our testing on the drivers available at the time, but now new Battlefield 1 ready drivers have landed (or are about to land) from both AMD and NVIDIA. We will re-run the tests to see how much performance improves on the new drivers, but for now, we used AMD's Radeon Software Crimson Edition 16.10.1 drivers, and NVIDIA's current GeForce 372.20 WHQL drivers.
For the purposes of this benchmark, we ran Battlefield 1 at 1920x1080 and used the 'Medium' and 'Ultra' presets with AA disabled. We then ran Battlefield 1 in DX11, and again in DX12, recording the minimum and average FPS results. We used the AMD Radeon RX 480 8GB reference alongside the NVIDIA GeForce GTX 1060 6GB Founders Edition, both reference/FE cards with stock clocks and no additional overclocking used.
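The minimum and average FPS figures recorded above are typically derived from per-frame render times logged during each run. Here's a minimal sketch of that calculation; the frametime samples are made up for illustration, as real benchmark passes log thousands of frames.

```python
# Deriving min/average FPS from per-frame render times (milliseconds).
# The sample data below is hypothetical, not from our actual test runs.

def fps_stats(frametimes_ms):
    """Return (minimum FPS, average FPS) from a list of frametime samples."""
    min_fps = 1000.0 / max(frametimes_ms)  # slowest frame sets the minimum
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    return min_fps, avg_fps

frametimes = [16.7, 18.2, 15.9, 33.4, 17.1]  # one 33ms spike (a stutter)
minimum, average = fps_stats(frametimes)
print(f"min: {minimum:.1f} FPS, avg: {average:.1f} FPS")
```

Note the average is computed from total frames over total time rather than averaging per-frame FPS values, which would overweight fast frames.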
I've been spending the last week or so benchmarking my life away, cranking away with The Coalition's recently released Gears of War 4 on the PC - with my last article taking a look at running the game at 8K (7680x4320), on various graphics cards, including the Radeon RX 480 from AMD.
After I was finished benchmarking Gears of War 4 at 8K, I moved down to the normal resolutions like 1920x1080, 2560x1440, and 3840x2160 and benchmarked more cards. The game is absolutely gorgeous, so I kept everything on the highest preset and ran the game again at 1080p, 1440p and 4K on the following graphics cards:
Gears of War 4 is one of the many new game releases of October, riding on the gravitational marketing waves of Battlefield 1, Mafia III (which is a massive mess on the PC - no surprise there), Call of Duty: Infinite Warfare (which requires up to 130GB of HDD space), and countless other games - heck, even Star Citizen is shaping up incredibly well, and could be the best PC game ever made.
Well, what better way to celebrate the release of Gears of War 4 from developer The Coalition, than by running the DX12-powered game at the glorious 8K resolution. To refresh your memory, 8K is a mammoth resolution that renders at 7680x4320 - 4x the pixels of 4K, and 16x the pixels of 1080p. It's strenuous, and I love it - I'm addicted to higher resolutions and refresh rates, which is why we're here today running Gears of War 4 in DX12 on Windows 10, at an insane 8K.
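Those pixel-count multipliers check out with some quick arithmetic - the resolutions here are the standard ones, nothing assumed:

```python
# Sanity-checking the pixel counts: 8K is 4x the pixels of 4K, 16x of 1080p.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["8K"])                     # 33,177,600 pixels per frame
print(pixels["8K"] // pixels["4K"])     # 4
print(pixels["8K"] // pixels["1080p"])  # 16
```

That's over 33 million pixels the GPU has to render every single frame, which is exactly why 8K brings even high-end cards to their knees.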
I've also tested Gears of War 4 with Asynchronous Compute both on and off, so we can see if it's adding to the performance, or not. As for the in-game graphics settings, I used a custom set of visual settings, with everything on either Ultra or High.
Just how crazy is 8K? Let's check:
Ubisoft has had its biggest launch ever with The Division, with it selling more copies in its first 24 hours than any other Ubisoft game, ever. Well, thankfully The Division has a built-in benchmark that I've spent the last 4-5 hours inside of, testing it out on my two fastest GPUs - AMD's Radeon R9 Fury X and NVIDIA's GeForce GTX Titan X.
I'm testing out The Division on our VGA testbed for now, but I will be doing some deeper comparisons in the coming weeks between the Intel Core i7-6700K, Core i7-5960X and AMD's FX-8350 processor. We'll do some more testing very soon (hopefully over the weekend) at 11,520 x 2160 (triple 4K) which should stress out all of our video cards, and bring them to their knees.
It was sometime in 2009 that I plunged into the NAS world when I purchased my QNAP TS-639 PRO. At the time, it was a beast. I remember spending around $5000 buying the NAS, 4 x 1TB drives (in RAID5), and 2 x 1.5TB drives. The QNAP TS-639 PRO was a 6-bay, 1GbE NAS that took 6 x SATA 3Gbps drives, and had various functions and features you could use for years to come.
It was expensive, but it was oh-so-worth it. Before that, I was using a small Netbook for my NAS-like storage, with external USB drives and their respective cables flying out of it like an octopus on drugs. It was messy, but it worked. After securing the QNAP TS-639 PRO and taking weeks to get it set up how I wanted, it was worth the investment.
The trusty QNAP TS-639 PRO worked without a hitch, powered on 24/7 and used extensively, right up until 2015, when one of my 1.5TB drives died. After that, the NAS started slowing down, and towards the end of 2015, I was getting around 100-500KB/sec on network transfers, when it should've been more like the 80-100MB/sec that I was used to.
Futuremark has been around for what feels like forever, with the Finnish company being one of the first on the market with a mainstream benchmark that stressed your PC. Everyone remembers 3DMark and the iconic Matrix-like tests from 3DMark 2001... well, they have come a fair way since then!
Fast forward to today, and we have Futuremark on the precipice of releasing a DX12-based 3DMark in the New Year, as well as VRMark. Futuremark recently released the VRMark Preview with a new UI into 3DMark as part of a holiday beta. The 2016 version of 3DMark will include a benchmark for VR headsets like the Oculus Rift and HTC Vive, so the VRMark Preview is great to see out before the end of the year.