
Helping with tech questions - TweakTown's Ask the Experts - Page 8


TweakTown's Ask the Experts


Does using the 8x slot for SLI affect performance at all?

Question asked by Ovidiu from Romania | Answered by Anthony Garreffa | Motherboards Content | Posted: Jan 16, 2013 10:18 am

Hello,

 

I have a system with the following setup: AMD 955BE CPU, 4GB RAM, 850W PSU and an ASRock M3A785GXH/128M motherboard that has 3 PCIe slots.

 

One is 16x, the other is 8x and the last is 4x.

 

If I were to do a 2-way SLI setup with a GTX 660 (using the x16 and x8 slots), would the 8x slot impact performance?

 

For the moment I'm using a 1680x1050 monitor and in the future I am not planning on going any higher than 1920x1080.


Hi Ovidiu,

 

There are some benefits to using 16x slots on the motherboard for multi-GPU setups, but the difference is only a couple of percent. You might see a 2-3% increase in performance using 16x/16x, but it's not going to be noticeable for multiple reasons.

 


 

First, 1680x1050 is going to be a huge limitation for any multi-GPU setup, and second, your CPU will also hold things back. I would go ahead with the 8x/8x SLI solution and not worry about it - you'll be fine.
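For a rough sense of what's at stake, here's a back-of-envelope sketch comparing the two slots, assuming the board's PCIe 2.0 lanes and the usual ~500MB/s of usable bandwidth per lane per direction:

```python
# Back-of-envelope PCIe bandwidth comparison. PCIe 2.0 moves roughly
# 500MB/s of usable data per lane per direction after encoding overhead.
PCIE2_MBPS_PER_LANE = 500  # approximate usable MB/s per lane

def slot_bandwidth_mbps(lanes):
    """Approximate one-way bandwidth of a PCIe 2.0 slot with `lanes` lanes."""
    return lanes * PCIE2_MBPS_PER_LANE

x16 = slot_bandwidth_mbps(16)  # ~8000 MB/s
x8 = slot_bandwidth_mbps(8)    # ~4000 MB/s
print(f"x16: ~{x16}MB/s, x8: ~{x8}MB/s")
```

Even at half the bandwidth, a single GTX 660 rarely saturates an x8 link at these resolutions, which is why the measured difference stays in the low single digits.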

Which GPU should I get on a $300 budget?

Question asked by Jordi from United States | Answered by Anthony Garreffa | Video Cards Content | Posted: Jan 15, 2013 5:59 am

I want to buy a new graphics card. I currently have an NVIDIA-powered Gigabyte card and I want to make sure that when I get a new card (or cards), it's the best bang-for-my-buck card I can get. The most I plan to spend is $300. Also, at the moment, which graphics vendor has the most powerful or most recommended card out there, AMD or NVIDIA? Thank you for any help.


Hi Jordi,

 

At $300, you can get quite the powerful GPU these days - but when it comes down to which side you should jump into, AMD or NVIDIA - things can get messy. It all comes down to personal preference, and I could sit here writing 10 pages of arguments on either side.

 


 

I'm a fan of both, and both sides do things well. NVIDIA, in my opinion, have better game support - they seem to have more partnerships with game developers and have fewer problems. The problems that do crop up are usually solved through driver updates, and AMD keeps up quite well there.

 

That's not to say AMD aren't a great choice, because they are - and in some cases their GPUs can far outperform the closest GeForce card in the same price bracket, which opens up an entirely new argument. So we'll skip that, keep this as a prelude, and get onto the recommendations.

 

On the NVIDIA side of things, you could get yourself a GeForce GTX 660 Ti for around $299.99 from Newegg. For this price, you'd be scoring one of the very swish EVGA SuperClocked versions.

 

If we're talking AMD, you could get the Gigabyte Radeon HD 7950 GPU which comes with a triple-fan cooling design that would not only be great for overclocking, but great for temps while gaming at extreme levels - AA, AF cranked, etc. This GPU is just $299.99.

 

I would recommend the Radeon HD 7950 out of those two options!

Which GPU should I get for mostly FPS and RTS gaming?

Question asked by Reuben from Singapore | Answered by Anthony Garreffa | Video Cards Content | Posted: Jan 10, 2013 7:14 am

Hi, I am just starting to learn how to build a PC but am stumped about choosing a graphics card! Living in Singapore, I only have a limited selection within my price range and would love an expert's advice on which to get! I mainly play FPS/RTS games, and action games like Assassin's Creed.

 

Thank you so much!

  • Palit GTX660Ti Jet
  • Palit GTX670
  • MSI N660Ti 2GBD5/OC PCI Express
  • Asus AS GTX660 Ti DCII/2G
  • Powercolor HD7870 2GB GDDR5 EYEFINITY 6 EDITION
  • MSI R7950 Twin Frozr (3GBDDR5)


Hi Reuben,

 

This is pretty easy, as you've got one GPU on that list that really stands out from the rest, and that is MSI's Radeon HD 7950 Twin Frozr. This card is a powerhouse, and it has an excellent cooler that will not only keep your GPU nice and cool, but also give you some incredible overclocking headroom.

 


 

The other GPUs on your list are all great, but the HD 7950 definitely stands out. Apart from that, you could get the Palit GeForce GTX 670, which is another fine GPU, but the Twin Frozr card from MSI still trumps it.

Can you critique my upgrade and give me some suggestions?

Question asked by Israel from Philippines | Answered by Anthony Garreffa | Computer Systems Content | Posted: Jan 9, 2013 4:57 am

Hello, good day... I am the guy who asked about the processor thermal trip warning last year. Recently my computer started shutting itself down after bootup. The worst part is that it now won't switch on at all after I heard a loud bang. So I have decided to replace the Core 2 Quad Q6600 with a better processor. Can you critique my replacements so I can be sure before going to the computer store to buy them in a few days?

 

Here is the list of parts that I will buy:

  • Intel Core i5 3450
  • ASUS P8Z77-V PREMIUM
  • 4 x 8GB 1333MHz G.Skill Ripjaws RAM
  • CM Hyper 212 EVO
  • Arctic Cooling MX-4 thermal paste

 

I am open to suggestions, just in case...

 

Thank You Guys


Hi Israel,

 

It's not good that your PC died, but it does give you the excuse to upgrade to something much better! Everything you've chosen looks great - 32GB of RAM is a bit overkill, but RAM is cheap enough now that you can just buy as much as your motherboard can handle.

 


 

Those parts all look fine to me - you could maybe upgrade to the Core i5 3570K for some overclocking fun if you wanted to spend a few extra dollars. I would definitely suggest getting an SSD of some sort for your OS at least - the performance increase with an SSD is absolutely incredible.

 

If you've got any other questions, ask away!

I have a Z77-based motherboard and want to install 32GB of 2400MHz RAM, should I buy 2 x 16GB kits?

Question asked by Sinclair from United Kingdom (Great Britain) | Answered by Anthony Garreffa | RAM Content | Posted: Jan 8, 2013 3:10 am

I have a Z77 motherboard and an Ivy Bridge i7 processor and would like to make use of its full potential by fitting 32GB of RAM. As the Z77 only uses dual-channel RAM, I am finding it very hard to find a set of 4 x 8GB 2400MHz dual-channel RAM modules. Would it work if I bought two 2 x 8GB 2400MHz dual-channel kits and fitted them both?


Hi Sinclair,

 

Definitely - go for two 16GB kits of 2400MHz DDR3 RAM - that will be the best bet. A single 32GB kit would usually be sold as a quad-channel kit - which would work - but would be more expensive than buying two 16GB kits.
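If you're curious about the numbers behind dual channel, here's a quick sketch of peak theoretical DDR3 bandwidth (transfers per second × 8 bytes per 64-bit channel × number of channels):

```python
# Peak theoretical DDR3 bandwidth: megatransfers/s * 8 bytes per 64-bit
# channel * number of channels, converted to GB/s.
def ddr3_bandwidth_gbs(megatransfers, channels=2):
    """Peak theoretical bandwidth in GB/s for DDR3 at the given data rate."""
    return megatransfers * 8 * channels / 1000

dual_2400 = ddr3_bandwidth_gbs(2400)  # dual channel, as on Z77
print(f"DDR3-2400, dual channel: {dual_2400}GB/s peak")
```

Two matched 2 x 8GB kits populate the same four slots a single 4 x 8GB kit would, so the platform sees the same dual-channel layout either way.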

 


 

You can get something like Corsair's Vengeance 16GB DDR3 2400MHz kit (CMD16GX3M2A2400C10) for £198.29 from Scan Computers.

 

Two of these kits would be some seriously slick RAM for any PC.

What PSU would be needed for two ASUS Radeon HD 7970 DirectCU II GPUs?

Question asked by Vincent from United States | Answered by Anthony Garreffa | Cases, Cooling & PSU Content | Posted: Jan 7, 2013 9:10 am

I have two ASUS 7970 DC-II cards. How large a PSU is needed for CrossFire?


Hi Vincent,

 

Each ASUS Radeon HD 7970 DirectCU II GPU is going to use, at a maximum, around 300W. So using two of them in CrossFire is going to draw around 600W just for the GPUs - without taking the rest of your setup into consideration.

 


 

I would suggest something like an 850W or even 1000W PSU to be safe. You won't need that much power - but it's always better to have more than not enough. Any brand name PSU would do, but I'm a big fan of Corsair's range of PSUs, so you could go for something like the Corsair AX860, or the HX1050.
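To show the kind of maths behind that recommendation, here's a small sketch - the component wattages and the 25% headroom figure are illustrative assumptions, not measured numbers:

```python
# Hypothetical power budget sketch - the component draws and the 25%
# headroom figure below are illustrative assumptions, not measured numbers.
def psu_recommendation(loads_w, headroom=0.25):
    """Sum estimated component draw, add a safety margin, and round up
    to a common PSU size."""
    total = sum(loads_w.values()) * (1 + headroom)
    for size in (650, 750, 850, 1000, 1200):
        if size >= total:
            return size
    return 1500

build = {"gpu_1": 300, "gpu_2": 300, "cpu": 125, "rest": 75}  # assumed draws
print(psu_recommendation(build))  # ~800W of draw plus headroom -> 1000W unit
```

A bigger margin (or planned future upgrades) pushes you up to the next size, which is the logic behind going even larger than strictly necessary.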

 

NewEgg sell the AX860 for $199.99 and the AX1200 will set you back $299.99.

 

If I had to make the decision for you - I'd go with the AX1200, only because I'd want to cover myself for future upgrades. That way, you are pretty much future-proof for any next-gen GPU action.

Should I get an Intel Core i5 3570K now, or wait for the Haswell-based chips to arrive?

Question asked by Bryan from United States | Answered by Anthony Garreffa | CPUs, Chipsets & SoCs Content | Posted: Jan 6, 2013 12:09 am

Hello to everybody.

 

I have a question about whether to buy an Intel Core i5 3570K Ivy Bridge (with an ASRock Z77 Extreme6) or wait for a possible Intel Core i5 4570K Haswell (and a hypothetical Z87 motherboard).

 

I love CPU overclocking and performance, but I don't know which processor to buy.


Hi Bryan,

 

It would really depend on how patient you are and just how much work you're doing with the CPU. If you're a gamer, you're not going to notice a huge change in performance between the current-gen Core i5 and the next-gen Core i5.

 


 

Yes, the Haswell-based processors will definitely be better, but how much that matters depends on how they're used. CPU-intensive applications will definitely show a performance increase, but it also comes down to price/performance ratios.

 

A Haswell upgrade might cost hundreds of dollars more than the current Z77 motherboard & Core i5 3570K processor would today. You'll have to account for that when making your decision. I would suggest getting the current hardware, and letting Haswell unveil itself and show just how much more performance is included.

 

The money saved could go toward an SSD, if you don't have one already, which would make a huge difference in overall performance.

Should I wait for the PlayStation 4, or grab a PS3 now?

Question asked by Saji from Indonesia | Answered by Anthony Garreffa | Gaming Content | Posted: Dec 31, 2012 3:58 am

Should I wait for the PS4 or buy a PS3 now?


Saji,

 

We should hopefully hear about the PlayStation 4 at E3 which is held in June next year. If Sony do announce the PS4 at E3, and release it a few months after E3, we're only 6-9 months away from next-gen consoles. If you haven't already bought a PS3, I would wait it out.

 


 

That's my opinion - as it'll be the latest and greatest console from Sony. But the PS3 has hundreds and hundreds of great games already, and would be a fair bit cheaper than a brand-new next-gen console off the shelf.

Since Intel's Core i7 is cherry-picked during the binning process, does that make them better than the Core i3 or i5 for general performance and gaming?

Question asked by Reece from United States | Answered by Anthony Garreffa | CPUs, Chipsets & SoCs Content | Posted: Dec 31, 2012 2:16 am

This is sort of a long question (well, short question with a lot of background info), so get ready for a read.

 

Most people say an i5 is exactly the same for gaming performance as an i7, because very few games can even utilize more than two cores, and no games currently need more than two cores to run, so the extra 4 threads are useless. That is true.

 

However, Intel's binning process involves selecting the badly deformed chips, and putting an i3 or lesser name on them. Cache is usually the deformed part. Slightly deformed chips are put into the i5 category, and near-perfect chips are given the i7 name. Since the i7's are binned higher, this means they've formed more perfectly. Doesn't logic follow that if the i7's have formed more perfectly, the cores can achieve higher clock speeds? I know that usually the cache is deformed, so all an i7 guarantees is better cache, but there's also a strong chance, though no guarantee, of higher clock speed tolerance.

 

For example, my i5 can only go up to 4.6GHz, no matter how much voltage I apply. However, most people's i7's can go up to at least 4.8GHz, if not 5GHz. I admit, I have seen i5's go up to 5GHz, and i7's stop at 4.5GHz, but I see way more 5GHz i7's than i5's. Since clocks can be pushed further, doesn't that mean single-core performance, and thus gaming performance, increases with an i7?


Reece, first of all, thanks for a brilliant question.

 

Now let's get into it - there's a two-part answer here. Intel's binning process is mostly about - as you said - taking the chips that can handle the full cache and higher clock speeds and turning them into the higher-end Core i7. The ones that can't handle the full cache, or are partly defective, are pushed down the line and become a Core i5. The worst of them are turned into Core i3's.

 


 

Most Core i7's can handle huge overclocks - personally, I have my Core i7 3770K sitting at 5GHz stable. But then you've got to bring Hyper-Threading into the conversation, which I disable to get slightly higher clock speeds with less voltage, more stability, and cooler running. Hyper-Threading is great for those pushing core-intensive applications, but games? Nowhere near as much.

 

I think a 4.5GHz quad-core processor is all anyone would need for gaming, as CPU speed offers diminishing returns past the mid-4GHz mark. If you were running a 3- or 4-way GPU setup with multi-monitor tech like Eyefinity or Surround Vision, then the CPU speed would help a bit more - but not that much. It would be better to keep voltages lower at 4.5GHz than crank them higher to reach 5GHz.
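The reasoning about voltage comes down to rough CMOS power scaling, where dynamic power grows with frequency times voltage squared. Here's a small sketch - the voltages are illustrative guesses, not measured values:

```python
# Rough CMOS dynamic power scaling: power grows roughly with f * V^2,
# so chasing 5GHz at a higher voltage costs disproportionate power and
# heat compared to 4.5GHz at a modest voltage. Voltages are illustrative.
def relative_power(freq_ghz, volts, base_freq=4.5, base_volts=1.25):
    """Dynamic power relative to a 4.5GHz / 1.25V baseline."""
    return (freq_ghz * volts ** 2) / (base_freq * base_volts ** 2)

ratio = relative_power(5.0, 1.40)
print(f"5GHz @ 1.40V draws ~{ratio:.2f}x the power of 4.5GHz @ 1.25V")
```

Roughly 40% more power and heat for an 11% clock bump is why the last few hundred MHz are rarely worth it.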

 

Now, the second part - how does this help in games? Well, my Core i7 at 4GHz or 5GHz offers no visible change in frame rates or load times. These days, I have my CPU cranked up to 5GHz "because I can", not because it offers visible performance improvements. I just like to remove performance bottlenecks - so if it's capable of 5GHz, then it sits there. My old Core i7 860 used to sit at 4GHz and I see no difference between that first-gen chip and this third-gen chip with the extra 1000MHz.

 

But, the single-core performance (without HT) is better with a Core i7 overclocked than it would be with a Core i3. This all depends on how CPU-intensive the task is, as games only use up to four cores anyway. This will change next year when next-gen consoles arrive, which should hopefully sport more than four cores, which I'm predicting will have 6 cores with some HT-like technology.

 

The other benefit now is that Core i7's are not that much more than decent Core i5's... which makes the decision that much easier.

I just upgraded from my Dell 27-inch monitor to an LG 42-inch TV and experience lag - why is that?

Question asked by Jeff from United States | Answered by Anthony Garreffa | Displays & Projectors Content | Posted: Dec 28, 2012 5:46 am

I recently replaced my Dell 27-inch monitor with an LG 42-inch 1080p TV. This did not make my wife very happy, but then again she doesn't play Borderlands 2. It seems to work fine with all the extra stuff turned off, but there is some game lag.


Hi Jeff,

 

What you're experiencing is input lag, which is usually caused by the image processing technology in your TV. This can include higher refresh-rate interpolation, motion or edge smoothing, and other features. All of this processing takes tens of milliseconds, and adds "lag" to the game.

 


 

So you might press W, A, S, or D and it will feel like it's taking longer for it to respond. When moving the mouse, it will feel like there's lag on the screen - but it's just lag on your input method. This is why controllers on consoles feel so much slower to operate in a first-person shooter than a 60/120Hz monitor on PC using a mouse.
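To put rough numbers on it: at 60Hz each frame lasts about 16.7ms, so every frame the TV buffers for processing adds another frame-time of delay. A quick sketch (the frame counts are illustrative, not measured for any particular TV):

```python
# Latency sketch: at 60Hz each frame lasts 1000/60 ~= 16.7ms, so every
# frame the TV buffers for image processing adds one more frame-time of
# input lag. The frame counts below are illustrative, not measured.
def added_lag_ms(processing_frames, refresh_hz=60):
    """Extra input lag from buffering `processing_frames` frames."""
    return processing_frames * 1000 / refresh_hz

print(f"3 buffered frames: ~{added_lag_ms(3):.0f}ms of extra lag")
print(f"1 buffered frame: ~{added_lag_ms(1):.0f}ms")
```

A few buffered frames is enough to make mouse input feel noticeably sluggish, even though the picture itself looks fine.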

 

The response time on an LCD monitor is much better than most TVs because of this - so going from your Dell 27-inch monitor to the LG 42-inch HDTV is actually a step backward in responsiveness, but an upgrade in physical screen size.

 

This is why I personally use 120Hz-based monitors for my first-person shooter gaming. I have the latest Samsung 55-inch Smart TV, and while it's a lot better than most TVs out there in terms of input lag, it doesn't even begin to compare to a proper LCD monitor.

 

Update:

 

One of our great readers, Justin, made a very good point about enabling the "Game" mode on the TV. Jeff replied saying that this, along with some other tweaks he made, brought a huge improvement in his experience.
