The Contenders
Ethereum Mining Introduction
It's about time we cleared up Ethereum mining performance, without judging the market, asking questions, or debating whether ANYONE should get into Ethereum mining. This is a purely data and curiosity driven task, given that my career gives me the resources to do it. Just to be clear, no one prompted me to write this - neither AMD nor NVIDIA asked me to.
A purely unbiased, quick and dirty look at mining performance, from stock drivers and GPU/VRAM clocks to AMD's mining-specific drivers, overclocked VRAM, and reduced power consumption - in some cases, DRAMATICALLY reduced. I'm not going to go into the nitty gritty of how to do all of this, but if that's something that interests you, let me know in the comments below and I'll work on an article covering it.
For now, let's take a look at the stack of cards that I'll be testing. This is the first wave of tests, with the best-performing graphics cards I have in my possession from both camps. The range spans from the Radeon RX 480 up to the new Radeon RX Vega 64 Liquid Cooled Edition on the AMD side, and from the GeForce GTX 1060 through to the GTX 1080 Ti and even the $1199 TITAN Xp on the NVIDIA side.
The full list of cards being tested:
- AMD Radeon RX 480
- AMD Radeon RX 570
- AMD Radeon RX 580
- AMD Radeon R9 Fury X
- AMD Radeon RX Vega 56
- AMD Radeon RX Vega 64
- AMD Radeon RX Vega 64 LCE
- NVIDIA GeForce GTX 1060
- NVIDIA GeForce GTX 1070
- NVIDIA GeForce GTX 1080
- NVIDIA GeForce GTX 1080 Ti
- NVIDIA TITAN Xp
Let's get started on the next page!
Drivers + Claymore v9.8
New Blockchain Optimized Radeon Drivers
AMD might not like miners, at least in its public-facing comments, but the company recently released blockchain optimized drivers for its Radeon GPUs. These new drivers offer some great improvements to Ethereum mining performance, something I thought I'd test fully.
Make Sure To Use The Latest Claymore v9.8
Note: In the days leading up to getting this article onto the site, Claymore v10.0 was released, but it shouldn't bring any performance improvement over v9.8. I will re-test the cards with v10.0 in the coming weeks, as I'm sure we'll see new drivers released by then that could offer improved mining performance.
For all of my testing, I used the latest Claymore v9.8 miner, mining Ethereum into my ETH wallet @ MyEtherWallet (a quick sketch of how I launch it follows the download links below). I've got both stock and overclocked results, with some massive gains across the board - especially for AMD with its new Blockchain Optimized Radeon Drivers... hot damn does AMD kick ass with Vega and Ethereum mining.
Claymore v10.0: download here
AMD's new Radeon Software Crimson ReLive Edition Beta for Blockchain Compute drivers: download here
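If you're curious what that setup looks like in practice, here's a minimal sketch of a Claymore launch. The miner path, pool endpoint, and wallet address are placeholders rather than my actual settings, and I've wrapped the call in Python purely to keep everything in one readable spot:

```python
import subprocess

# Hypothetical install path - point this at wherever you extracted Claymore.
CLAYMORE = r"C:\miners\claymore\EthDcrMiner64.exe"
# Example stratum endpoint and wallet - swap in your own pool and address.
POOL = "stratum+tcp://eu1.ethermine.org:4444"
WALLET = "0xYourEthereumWalletAddressHere"

subprocess.run([
    CLAYMORE,
    "-epool", POOL,    # Ethereum pool to mine on
    "-ewal", WALLET,   # wallet address the payouts go to
    "-epsw", "x",      # most pools accept "x" as the password
])
```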
Mining Performance: AMD Wins
There Is A Clear Winner
This is what we're all here for, right? The battle between NVIDIA and AMD in Ethereum mining performance is an epic one, as NVIDIA continues to dominate the gaming market with its top-to-bottom GeForce GTX 10 series.
AMD, on the other hand, really does rule Ethereum mining, even with its older Hawaii and Fiji GPU architectures. Until the Radeon RX 400 and RX 500 series arrived last year, the Fiji-based Radeon R9 Fury X was AMD's champion single-GPU miner.
Going back to the Hawaii GPU architecture, AMD had the absolute #1 performing card for Ethereum mining: the dual-GPU Radeon R9 295X2. The Radeon R9 295X2 is a freaking Ethereum mining beast.
But AMD's new Radeon RX Vega series graphics cards are even better thanks to their use of HBM2 memory and huge bandwidth numbers; Vega topples the best GPUs from NVIDIA.
AMD Radeon RX Vega 56
The new Radeon RX Vega 56 pumps out a respectable 29-31MH/s, stabilizing somewhere in the 30MH/s range, but what I haven't seen reported online is that once the card gets hot (70C), ETH mining performance comes down dramatically. Once the card reaches 70C, it drops from a height of 31.4MH/s down to 28.9MH/s, and while it can dip to lows of 27.2MH/s, it holds stable at 28.9MH/s at stock fan settings.
AMD ships the Radeon RX Vega 56 with its 8GB of HBM2 clocked at 800MHz, but I found I could overclock the HBM2 up to 955MHz with 24/7 mining performance peaking at 36.9MH/s. This pushed temperatures up to 73-74C again, so I increased the fan speed to 3300RPM, and the temperatures landed back at 65-66C.
Power consumption-wise, the Radeon RX Vega 56 without any tweaks sat at 300W. I decreased the power on the card with varying results, but landed on a -5% power limit in Radeon Settings, which resulted in a 62C maximum GPU temperature and 285W power consumption. A 15W power saving plus an increase of 8MH/s is DAMN GOOD - the quick efficiency math after the list below shows just how big a win that is. This is rock solid stable 24/7.
Maximum OC with 24/7 operation on Radeon RX Vega 56:
- Temps: 62C
- Power draw: 300W
- Fans: 2800RPM
- ETH mining: 37.3MH/s
- HBM2: 955MHz
- Power limit: -18.5% @ 1269MHz
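To put some numbers on why the tweaking matters, here's a quick bit of hashrate-per-watt math using the stock and tuned Vega 56 figures from above (a minimal sketch in Python, nothing more):

```python
# Hashrate-per-watt for the RX Vega 56: stock settings vs my 24/7
# undervolt + HBM2 overclock, using the figures measured above.
stock = {"hashrate_mhs": 28.9, "power_w": 300}  # stock, after thermal throttle
tuned = {"hashrate_mhs": 36.9, "power_w": 285}  # 955MHz HBM2, -5% power limit

for name, cfg in (("stock", stock), ("tuned", tuned)):
    print(f"{name}: {cfg['hashrate_mhs'] / cfg['power_w']:.4f} MH/s per watt")

gain = (tuned["hashrate_mhs"] / tuned["power_w"]) / \
       (stock["hashrate_mhs"] / stock["power_w"]) - 1
print(f"efficiency gain: {gain:.0%}")  # roughly +34%
```

That's around a third more work done for every watt pulled from the wall, which is the whole game in mining.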
AMD Radeon RX Vega 64 + RX Vega 64 LCE
AMD Radeon RX Vega 64 Air-Cooled
The big difference in Ethereum mining performance between the Radeon RX Vega 56 and RX Vega 64 is the clock speed of the 8GB of HBM2: AMD clocks it at 800MHz on the RX Vega 56, while the RX Vega 64's HBM2 runs at 945MHz.
Even the air-cooled version of the Radeon RX Vega 64 can have its HBM2 clocked up to 1100MHz, resulting in a huge 38MH/s at less than 380W of power. But because it's the air-cooled variant, anything above 70C sees the HBM2 clock sag back down to 800MHz, radically reducing ETH mining performance. The RX Vega 64 LCE, on the other hand, can have its fan speed tweaked to keep temps under 60C without a problem, thanks to its massive water cooler.
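A quick back-of-the-envelope calc shows why the HBM2 clock is everything for Ethash: memory bandwidth scales linearly with it. This sketch assumes Vega's 2048-bit HBM2 bus and double data rate memory:

```python
# Memory bandwidth from HBM2 clock: clock (MHz) * 2 (double data rate)
# * bus width in bytes, converted to GB/s. Vega 10 uses a 2048-bit bus.
BUS_WIDTH_BITS = 2048

def hbm2_bandwidth_gbs(clock_mhz: float) -> float:
    return clock_mhz * 2 * (BUS_WIDTH_BITS / 8) / 1000

for label, mhz in [("RX Vega 56 stock", 800),
                   ("RX Vega 64 stock", 945),
                   ("RX Vega 64 OC", 1100)]:
    print(f"{label}: {mhz}MHz -> {hbm2_bandwidth_gbs(mhz):.1f} GB/s")
# 800MHz -> 409.6 GB/s, 945MHz -> 483.8 GB/s, 1100MHz -> 563.2 GB/s
```

The 945MHz result lands right on the 483.8GB/sec AMD quotes for the RX Vega 64, and the 1100MHz overclock pushes that well past 560GB/sec.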
Once the fans spin above 3000RPM, the air-cooled RX Vega 64 can sustain sub-70C temperatures and above 38MH/s, beating out the GeForce GTX 1080 Ti sitting next to it in the machine.
AMD Radeon RX Vega 64 Liquid Cooled Edition
This is where the fun really begins, though the RX Vega 64 isn't as much fun as the RX Vega 56 because it costs $100 more and is far more limited by thermals. AMD clocks the 8GB of HBM2 at 945MHz on both the RX Vega 64 and RX Vega 64 LCE, and both of my samples could be pushed to 1140MHz without a problem - but anything over 1150MHz resulted in crashing.
Out of the box, ETH mining performance on the new AMD Radeon RX Vega 64 air-cooled variant reached 33.4MH/s, but after a few minutes the temperatures spiked to 76C, which saw the card come back down to somewhere between 31.5MH/s and 34MH/s.
But this is where the fun began. The Vega cards accounted for a large part of my two 15-hour days of Ethereum mining performance testing across just four graphics cards. Once the Radeon RX Vega 64 air-cooled card reached 76C, the HBM2 clocks fluctuated as the card was flooded with heat. I had a few options: cranking the fans up (which I did) and increasing the power limit (which I also did).
Once I increased the power limit to +30%, power consumption for the entire system reached a dizzying 400W, much higher than what I had been reading online from other tech media. With the power limit at +30%, Ethereum mining performance peaked at an insane 42.7MH/s, though it did settle back to 39.5MH/s.
With the power limit at +20%, power consumption reached the same 400W while mining performance dropped to 39.2MH/s... but remember, I haven't overclocked the 8GB of HBM2 just yet.
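For a sanity check on those power figures: the power limit slider scales the card's default power cap. Assuming the air-cooled RX Vega 64's rated 295W board power (that figure is AMD's spec, not something I measured here), the math lines up with what I saw at the wall:

```python
# Power cap at various power limit settings, assuming the air-cooled
# RX Vega 64's rated 295W default board power.
BOARD_POWER_W = 295

for limit_pct in (0, 20, 30):
    cap = BOARD_POWER_W * (1 + limit_pct / 100)
    print(f"+{limit_pct}% power limit -> ~{cap:.0f}W board power cap")
# +20% -> ~354W, +30% -> ~384W; add CPU and platform draw on top and the
# ~400W total system consumption I measured makes sense.
```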
NVIDIA TITAN Xp
The $1199 Beast: TITAN Xp
NVIDIA sent over two TITAN Xp graphics cards for some ongoing 8K benchmarking and gaming that I've been doing, which will be the article I write once this one is finished - and boy, are they amazing cards. Just one of them is a gaming and mining powerhouse, but I have to put a note here: you would never, ever use the TITAN Xp in a mining situation... like, ever. Ever, ever, ever. OK?
This is purely a test of NVIDIA silicon and technology against AMD's, with the TITAN Xp being the very best NVIDIA has in the consumer space. The TITAN Xp is an expensive card at $1199, while the Radeon RX Vega 56 is $399, the RX Vega 64 air-cooled is $499, and the RX Vega 64 LCE is $599. The TITAN Xp is double the price of the RX Vega 64 LCE but has nowhere near double the performance.
Under my testing at stock clocks, the TITAN Xp sat comfortably at 62C with its fan spinning at around 1750RPM and 200W of power consumption (this is the most impressive part), while Ethereum mining performance was at 35MH/s - not bad for our first attempt.
I then played with a bunch of settings, reducing and then increasing the power limit on the TITAN Xp. All the way up at the 120% power target, the card was consuming a much larger 300W of power, and mining performance rose to an insane 42MH/s. I settled down to an 85% power target, with ETH mining performance at 41.3MH/s while power consumption dropped to 265W.
NVIDIA has slapped 12GB of GDDR5X onto the TITAN Xp, which is put to really good use in high-res gaming, oh... and Ethereum mining. I was able to overclock the 11.4Gbps GDDR5X up by 800MHz, for a total of 300W of power consumption, but 43MH/s of mining performance.
Get this: the memory bandwidth numbers are off the charts here, with a completely out of this world 614GB/sec of memory bandwidth thanks to the wide 384-bit memory bus on the TITAN Xp.
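If you want to sanity check that bandwidth number: bandwidth is just the effective data rate multiplied by the bus width. Note the 12.8Gbps figure below is back-solved from 614GB/sec rather than an official spec - NVIDIA rates the TITAN Xp's GDDR5X at 11.4Gbps:

```python
# GDDR5X bandwidth: effective data rate (Gbps) * bus width (bits) / 8.
BUS_WIDTH_BITS = 384  # TITAN Xp's memory bus

def gddr5x_bandwidth_gbs(data_rate_gbps: float) -> float:
    return data_rate_gbps * BUS_WIDTH_BITS / 8

print(gddr5x_bandwidth_gbs(11.4))  # 547.2 GB/s at the rated 11.4Gbps
print(gddr5x_bandwidth_gbs(12.8))  # 614.4 GB/s - roughly where the card
                                   # sits with the memory overclock applied
```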
Benchmarks + Final Thoughts
Ethereum Mining Benchmarks
This is the full list of graphics cards we've tested so far, apart from the workstation-grade NVIDIA Quadro P6000 that I recently received. The P6000 is a $4000+ graphics card aimed at rendering workstations, data scientists, and very specific NVIDIA customers - who aren't gaming. Even that $4000+ card only hits 30MH/s of ETH mining performance.
Without further delay, our newly updated Ethereum mining benchmarks:
Not bad, right? Just wait until you see the next chart with the overclocking scores in it. I've separated these, so if you want to share the images around you can show people the difference between stock performance and our OC results.
Final Thoughts
As you can see, NVIDIA takes the cake - but it requires the TITAN Xp, which costs $1199. If we take that out of the lineup (which we should, as it's not a GeForce card), then AMD absolutely dominates.
AMD's new Vega GPU architecture really shines in the compute department, with Ethereum mining performance showing that AMD can not only compete with NVIDIA, but easily beat them when a workload is optimized for their GPU/VRAM architecture and technology. This is proven by the Radeon R9 Fury X still beating out the GeForce GTX 1070, and even the GTX 1080.
NVIDIA's fastest GeForce GTX 1080 Ti is the only card that can begin to compete against Vega, with the GTX 1080 Ti edging out the RX Vega 56... but not the RX Vega 64. And when we bring the liquid-cooled Radeon RX Vega 64 into the fight, it completely dominates. We haven't even gotten to the overclocking results, which were absolutely amazing on Vega.
AMD's new Radeon RX Vega 64 Liquid Cooled Edition smokes everything NVIDIA can throw at Ethereum mining, including the top-of-the-line $1199 TITAN Xp. You could buy two RX Vega 64 LCE cards (if you can find them, and both at $599) and have 84MH/s or more of ETH mining performance for the same money NVIDIA charges for a single TITAN Xp - a card that will only push 42MH/s or so when driven to its absolute limits.
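The cost-per-hashrate math makes the gap brutally obvious. A quick calc using the MSRPs and hashrates above:

```python
# Dollars per MH/s at MSRP, using the hashrates quoted above.
cards = {
    "TITAN Xp":          {"price_usd": 1199,    "eth_mhs": 42},
    "2x RX Vega 64 LCE": {"price_usd": 2 * 599, "eth_mhs": 84},
}
for name, c in cards.items():
    print(f"{name}: ${c['price_usd'] / c['eth_mhs']:.2f} per MH/s")
# TITAN Xp:          ~$28.55 per MH/s
# 2x RX Vega 64 LCE: ~$14.26 per MH/s - half the cost for each MH/s
```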
All in all, AMD dominates Ethereum mining, and continues that trend heavily with Vega. Now if we could just get that same performance edge over NVIDIA in games, without waiting for another generation of cards. Things could get tricky with RTG's Chief Architect and SVP, Raja Koduri, taking a sabbatical from RTG through to December, but until then, we can only applaud AMD for the work they've done with Vega to achieve outright dominance in Ethereum mining.
42MH/s from a single RX Vega 64 LCE! Insanity!