Helping with tech questions - TweakTown's Ask the Experts - Page 3
I currently have a GTX 680 and would like to upgrade to the new MSI GTX 980 Ti Lightning edition, or should I just wait until the 1000 series comes out?
That's a good question. The answer all comes down to your monitor's resolution and refresh rate, and what games you play. I'm assuming you have a semi-decent processor and 8GB+ of RAM, too. If you have a 1080p 60Hz monitor, the GTX 980 Ti Lightning from MSI is going to be overkill, but it'll last until your next monitor upgrade, too.
If you're gaming on a 2560x1440 (1440p) monitor at 60Hz, then the GTX 980 Ti Lightning is going to be a huge upgrade. The same can be said with more emphasis for 4K gaming.
But the big question is, should you wait for the GTX 1000 series? Well, we don't know when that's coming, but I'd bet we'll see something next March, and maybe something else in September - the usual NVIDIA release time frames. As I said, it comes down to your monitor's resolution and refresh rate. The new GTX 980 Ti Lightning from MSI is a bloody beautiful card, and you will not regret buying it, no matter what.
I wouldn't wait for the new GTX 1000 series if it were me, as you have an older card now. You could always buy the MSI GeForce GTX 980 Ti Lightning now, then sell it when the GTX 1000 series comes out, and you'd still recoup a nice chunk of the price you paid for the card.
If it were up to me, I'd upgrade to the MSI GeForce GTX 980 Ti Lightning and keep my ears to the ground here at TweakTown for any news on the GTX 1000 series.
Should I buy the ASUS Z170 Deluxe or GIGABYTE Z170X-Gaming G1 motherboard?
Hey there Ahmed,
To be totally truthful, either motherboard is going to be great. Both of them are based on the same chipset and will have virtually the same feature set (M.2 support for SSDs, multiple PCIe ports, and so much more).
Both motherboards are high-end enthusiast boards that will let you overclock whichever processor you're going to use (I presume a Core i7-6700K) to its maximum potential. Both boards also look great, with similar styling - but if it came down to physical appearance, I think the GIGABYTE Z170X-Gaming G1 looks better.
It really comes down to brand loyalty. I love both brands and have two high-end systems in my office, one built on a GIGABYTE motherboard and one on an ASUS, and I haven't had an issue with either. You can be confident that you'll love whichever one you choose.
Hi guys, I'm planning on doing a major upgrade. The crux is my choice of GPU: do I get SLI GTX 970s, or just a single GTX 980? I want to be able to play at max detail at 1080p, with a view to upgrading to 1440p in the next two years.
Hey there Riyaad,
This is a great question, as it's very valid to countless gamers who are upgrading right now. Both options will be good for 1080p, but since you're not looking at upgrading to a 1440p monitor for at least a year, I would suggest grabbing a single video card for now. This will save you $400 or so, something that can be put into a single, much better card.
NVIDIA is set to launch its GeForce GTX 980 Ti very soon, so instead of grabbing two GeForce GTX 970s, or a single GTX 980, I would put that money into a GTX 980 Ti. It will have 6GB of VRAM, which is more "future proof" than the GTX 980's 4GB, and it will be much faster than a single GTX 970 or GTX 980.
If you didn't want to spend the money on a GTX 980 Ti (as we don't know the price yet), then an overclocked GTX 980 like the ASUS GeForce GTX 980 Matrix Platinum or MSI GeForce GTX 980 Gaming 4G LE would be perfect. 1080p and 1440p are nothing to those cards; they handle both without pressure at maxed-out details, which is what you want.
Hi, I was thinking about upgrading my graphics card right now. Currently I have a 7850, and I was considering adding another for CrossFire. I was wondering, is it worth it, or should I get a single GPU? My desktop is connected to a single 55-inch TV.
This is a great question, especially with AMD's Radeon 300 series right around the corner. This is something you've got to remember, especially when you're looking at setting up a CrossFire system. If the Radeon 300 series wasn't imminent, then I'd have a different recommendation.
There are three ways to take this: buy another Radeon HD 7850, upgrade to something like the Radeon R9 290/290X, or wait for the R9 300 series. I think you're better off waiting for the Radeon R9 3x0 (something like the R9 380 or R9 390) and buying a single card.
Two HD 7850s in CrossFire will chew through some power (500W+ for the full system) and generate a lot of heat inside your case. Then you've got to worry about CrossFire scaling which, to be honest, is not that great. You'll always have more consistent performance with a single card, as you'll get 95-100% of its performance.
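To make the scaling argument concrete, here's a rough Python sketch of how multi-GPU scaling eats into the theoretical doubling. The ~70% scaling figure and the 45 FPS baseline are illustrative assumptions, not numbers measured in this article.

```python
# Hypothetical sketch: estimating the benefit of a second GPU given a
# scaling factor. The 70% CrossFire scaling and 45 FPS baseline are
# assumptions for illustration only.

def effective_fps(single_card_fps, num_cards, scaling=0.70):
    """Rough frames-per-second estimate for a multi-GPU setup.

    The first card contributes 100% of its performance; each extra
    card contributes only `scaling` of it (driver/engine overhead).
    """
    return single_card_fps * (1 + (num_cards - 1) * scaling)

# One HD 7850 vs. two in CrossFire at ~70% scaling:
print(effective_fps(45, 1))  # 45.0 - full performance from one card
print(effective_fps(45, 2))  # ~76.5 - well short of a doubled 90
```

Under these assumptions, the second card adds far less than 100%, which is why a single faster card is usually the cleaner upgrade.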
Our Expert Recommendation: Wait a few weeks, as AMD will be unveiling its new Radeon 300 series. Possibly sell your card, and put the money you would've spent on a second card, on a mid-range R9 300 series card.
What is the downside of using four 4GB sticks for 16GB of RAM [in 4 slots]?
This is a good question, as there's no technical downside to having 4 x 4GB of RAM in a system. It comes down to the chipset: if you were using Intel's latest X99 chipset, you'd have quad-channel RAM, so four sticks is exactly what you want. But if you had a dual-channel chipset, using four sticks is also fine.
If your motherboard has four DIMM slots, which most do, and you're using a dual-channel-capable chipset, then four sticks is fine. The only 'downside' is your upgrade path, which would be maxed out at 16GB. If you instead used 2 x 8GB sticks, you'd leave two slots open for another 16GB, taking your system to 32GB of RAM.
So, to answer your question: no, there are no downsides - you're just limiting your system to 16GB of RAM versus 32GB. And unless you're doing heavy video editing or something similar, 16GB of RAM is more than enough.
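The upgrade-path trade-off above can be sketched in a few lines of Python. The four-slot board and 8GB maximum stick size are assumptions for illustration; check your motherboard's QVL for actual supported capacities.

```python
# Hypothetical sketch of the upgrade-path trade-off on a four-slot,
# dual-channel board. Slot count and the 8GB max stick size are
# illustrative assumptions, not specs from the article.

def upgrade_headroom(sticks, slots=4, max_stick_gb=8):
    """Return (current capacity, max capacity reachable by filling
    the remaining free slots with the largest supported stick)."""
    current = sum(sticks)
    free_slots = slots - len(sticks)
    return current, current + free_slots * max_stick_gb

print(upgrade_headroom([4, 4, 4, 4]))  # (16, 16) - all slots full, no headroom
print(upgrade_headroom([8, 8]))        # (16, 32) - two free slots for another 16GB
```

Same 16GB either way today, but the 2 x 8GB layout leaves room to double it later without discarding any sticks.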
What are some good guidelines to go by when determining how much power you'll need? Is 750 watts enough for modern times or is now the time to upgrade?
Running an AMD FX 8350 eight core processor at 4GHz
16GB DDR3 2400 RAM
MSI R9 270X 2GB video card
Been thinking of upgrading the video card but I'm worried my old 750W PSU won't be able to handle it.
Hi there Frank,
Most of the time, the video card is going to suck up the most power. I have a review coming up soon on the VisionTek Radeon R9 270X 2GB OC, which is nearly identical to the card you're running, and my total system load during a game of Battlefield 4 at 4K was 230W.
For your system, you'd be drawing less than 350-400W even with multiple mechanical HDDs, high-speed fans, and LEDs. I wouldn't worry about upgrading the PSU for its power output. However, if you have a no-name PSU, you could consider buying a branded unit like a Corsair and going for 850W, which should keep you safe from upgrading for quite some time.
As you can see from the chart above, even NVIDIA's GeForce GTX Titan X in SLI only consumes 530W of power. The SAPPHIRE Radeon R9 290X 8GB Tri-X cards in CrossFire chew through much more, at 740W total. I would suggest waiting and using your current PSU - unless it's a no-name unit, in which case you could upgrade, but don't be in any rush. For now, you're fine.
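The rule of thumb behind this advice - add up rough per-component draws, then leave headroom - can be sketched as follows. Every wattage figure here is an illustrative assumption, not a measured number; check your actual components' TDP ratings.

```python
# Hypothetical PSU-sizing sketch: sum rough per-component draw
# estimates, then add a safety margin. All wattages below are
# illustrative assumptions, not measurements from the article.

COMPONENT_DRAW_W = {
    "FX-8350 CPU": 125,        # assumed ~TDP under load
    "R9 270X GPU": 180,        # assumed worst-case board power
    "motherboard/RAM": 50,     # assumed
    "HDDs, fans, LEDs": 45,    # assumed
}

def recommended_psu(components, headroom=0.4):
    """Estimated full load plus a 40% margin, which also keeps the
    PSU in its more efficient mid-load range."""
    load = sum(components.values())
    return load, round(load * (1 + headroom))

load, psu = recommended_psu(COMPONENT_DRAW_W)
print(load, psu)  # 400 560 -> an existing 750W unit has plenty of spare capacity
```

Under these assumed figures, a 750W unit already covers the estimate with room for a bigger GPU later, which matches the "don't rush to upgrade" conclusion above.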
Hey, a big follower of your website, I go through your reviews.
I am upgrading to the MSI Radeon R9 290X Lightning with a Cooler Master Silent Pro 850W PSU.
Will it run ok on this?
- Intel Core i5 2320
- ASROCK B75 Pro3-M
- 8GB DDR3 RAM
- Corsair Carbide Series SPEC-03
Hey there Rahul,
To sum it up easily: yes, your 850W PSU will be fine. First, it's a Cooler Master unit, so you know it's a good-quality PSU. Second, 850W is more than enough for a single card.
We recently did some testing on the SAPPHIRE Radeon R9 290X 8GB Tri-X, which should have similar power consumption to the MSI card you're upgrading to, and it didn't consume anywhere near 850W, even under full 100% load and overclocked.
As you can see from the above shot from our article, you'd be looking at around 450-600W or so under full load. Your 850W will be just fine and I'm sure you're going to more than enjoy that kick-ass MSI Radeon R9 290X Lightning video card!
Is it a good idea to crossfire a R9 290 and R9 290x on a single display unit? If so, what amount of power are we talking about from the PSU here?
Hey there Sohaib,
This is a great question, and something that's easy to answer, but it really depends on the resolution of your display. If you are running a 1080p monitor, a single Radeon R9 290/290X would be enough. For a 1440p display at 60Hz, the same applies. But if it's a 120/144Hz monitor, then a second R9 290/290X would definitely be put to good use depending on the game.
If you're running a 4K monitor, then a second R9 290/290X would definitely be something I'd recommend for performance, but for power consumption and what AMD has coming soon, I would not recommend it. If you've already got a single R9 290/290X and were going to purchase another - I would keep your single card as it will be good enough for your display.
As for power consumption, a system with a single R9 290X draws around 450-500W. A second card can bump this up to ~800W or so, and with overclocking, you could easily reach 1000W or so for a pair of 290Xs in CrossFire. Our last review looked at the SAPPHIRE Radeon R9 290X 8GB Tri-X, which drew 450W in Battlefield 4 and, while overclocked, a huge 625W.
AMD is close to releasing their Radeon R9 390X, which should provide much more performance, all while using similar, if not less power than a R9 290X right now. My advice, would be to stay where you are with the single card and wait it out, or upgrade to the R9 390X when it's released.
Can AMD and Intel processors do the same job, or are there some situations where AMD is better than Intel, and vice versa?
Hi there Hosam,
This is a great question, where everyone is going to have a different answer. In a majority of situations, an Intel processor will beat an AMD processor. Video editing, gaming, virtually anything processor intensive, will have the Intel coming out of the flames holding the winning flag.
- If you were building a video or photo production/editing PC, then I would definitely recommend Intel.
- If you were building a new gaming PC, both options are great.
We've done some testing, and even at 4K the AMD CPU keeps up with, and can actually beat, a high-end, expensive Intel setup.
I would suggest getting what fits your budget - and, if you can, Intel. If you can't afford the Intel setup, an AMD setup is definitely not a bad decision, especially for the budget-minded gamer.
Hi there, I was wondering if three questions could be answered, please? They're centered a bit on speculation.
1: I was wondering what your thoughts and ideas are about DX12, in terms of the hardware that would be needed to support it, and whether Intel and AMD will have to produce new hardware (besides GPUs) to support this new API?
2: I was also wondering your thoughts on how much of Mantle will be used by Microsoft for DX12?
3: Do you think NVIDIA will eventually build their own branded PCs in a few years' time with their ARM technology, once that architecture gets good enough?
Sorry about the questions, I just like sticking my head into that kind of territory and taking off my tinfoil hat once in a while :)
My thoughts on DirectX 12, hey? We're seeing DX12 adopted pretty quickly, with the new GeForce GTX 900 series from NVIDIA having full support for it, and we're on the eve of the Radeon 300 series, which will support DX12, too. We will need to see new hardware, but by the time DX12 is here, we should have plenty of new GPUs to choose from.
Intel and AMD will be right on it, as AMD, Intel, and NVIDIA all have tight partnerships with Microsoft and are part of the DX12 movement. On top of that, they'll all want to push DX12 forward, as it will really differentiate PCs from consoles, and it'll sell boatloads of new hardware.
Mantle is done now; AMD has even said it wants developers to focus their attention on DX12. AMD released Mantle as a way of pushing developers to get "closer to the metal" - that is, to gain more direct access to the GPU itself. DX12 does the same, removing multiple layers of the API and giving developers even more access to the horsepower of GPUs.
As for the last bit, do I think NVIDIA could release their own branded PCs? Well, yes and no. We're seeing the company really expand its Shield line of devices, with the Shield Portable, Shield Tablet, and now the new Shield that is really just a kick-ass Tegra X1-powered console.
Second, NVIDIA just took the wraps off the news that GRID will render games in the cloud and push them down to gamers in select markets. Would they really need to release their own PC if they could render the game and shoot it down to a $199 console at 4K in the future? Or if you were to purchase a Shield VR headset in the future (which I think will happen this year), and have the game rendered in the cloud and blasted down to your HMD?
Sure, NVIDIA could release their own branded PC, but I think we'll see that in the form of a Steam Machine or two, versus a full desktop PC as we know them now.