Helping with tech questions - TweakTown's Ask the Experts - Page 15
Can a 500w power supply run a single card GTX680?
This is a tricky question, as it depends on the power supply unit (PSU) in question. If it's a generic, non-branded PSU, I would suggest not hooking the GeForce GTX 680 up to it. If it's a branded PSU from a company such as Corsair, for example, then you should be fine.
If we look at our review of the reference GTX 680, the test system had a 3DMark 11 load consumption of 456W. But a generic 500W unit might not actually provide 500W of power; it might only provide 400, 420, or 440W. My advice would be to check the brand, and if it's a relatively unknown brand, use caution.
If the 500W PSU you have is an unknown brand, or a cheaper model, I'm not saying it won't work, as it might. But I would suggest either going for a lower-end card with lower power requirements (such as a GTX 660 Ti), or, better still, picking up a Corsair-branded PSU, or similar, when you purchase the GTX 680.
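As a rough sanity check, the headroom reasoning above can be sketched in a few lines of Python. The 456W load figure comes from our review; the derating factors are illustrative assumptions, not measured values for any particular unit:

```python
# Rough PSU headroom check. A cheap "500W" unit may only deliver
# a fraction of its label under real load (derating assumed here).

def effective_watts(rated_watts, derating=0.85):
    """Estimate real deliverable power for a PSU (derating is an assumption)."""
    return rated_watts * derating

def has_headroom(rated_watts, system_load_watts, derating=0.85):
    """True if the PSU's estimated real output covers the measured load."""
    return effective_watts(rated_watts, derating) >= system_load_watts

# Reference GTX 680 test system drew 456W in 3DMark 11:
print(has_headroom(500, 456, derating=0.85))  # generic unit (~425W real): False
print(has_headroom(500, 456, derating=1.0))   # quality unit delivering full 500W: True
```

Even the quality unit only just clears the bar here, which is exactly why a 650W brand-name PSU is the more comfortable pairing.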
I am currently running GTX 560 Ti OC SLI with an i5-750 at 3.5GHz. Is it time to update my processor, or should I hang on? I only game at 1920x1080 and it's all pretty smooth for now.
Your processor, whilst not from the current generation, is still more than powerful enough for your setup. You're playing at 1920x1080, which isn't really pushing the CPU to its limits in the current generation of games. Your GPU setup would most likely benefit from a faster CPU, but do you really need to purchase one? I wouldn't recommend it.
I would wait it out and put the saved money (which wouldn't just cover the CPU, but a motherboard and RAM, too) into some new video cards in the future. Next-gen games should launch sometime next year alongside the next-gen consoles, and will definitely require more grunt. But even then, your CPU should still fare well.
I would suggest overclocking your CPU a little more (if you've got the know-how and cooling) and saving that money for some gruntier graphics cards next year. Everyone has a different opinion, but these days CPUs aren't the driving force in gaming performance that they used to be. You've got a great CPU; I'd keep it for now. I hope that helps you!
So I am building my first PC, and I was planning on going for the i7 2600K, but with the new Ivy Bridge, a lot of my friends who have already built their PCs have advised me to go for the i7 3770K. I understand Ivy Bridge is updated and supposed to be better, yet I am also aware of the heat problems, especially when OCing (I don't plan on OCing far, probably no higher than 4.0GHz). So, if I have an aftermarket HSF, the Noctua NH-D14 specifically, will I still be experiencing heat problems? Should I just go for the Sandy Bridge?
Intel's Core i7 2600K is a great processor, and if the 3770K wasn't out, I would have no problems recommending it to you. But Intel have released the 3770K, and it is definitely the best bang for your buck.
It does get a bit hot when overclocked and under strain, but mostly when all 8 threads are being utilized at 100%, which for gaming and general use is only a tiny portion of the time. If you were using it for video editing, or something equally straining, you might think twice. But you've said you're only overclocking to 4GHz, and at 4GHz those overheating issues don't really pop up.
If you were pushing it past 4.5GHz (up toward 5GHz), then yes, heat would be a big reason to think twice about the 3770K. But these days, getting an Intel-based CPU hot enough to run into problems is hard unless you push it past some serious boundaries (not only speed, but voltage).
I definitely recommend you go for the Ivy Bridge-based 3770K.
Hello, I am looking to buy a new gaming PC. I am keen on buying a GTX 670 or a Radeon HD 7970, so my question is which processor is capable of extracting the GPU's full potential. My CPU options are the Intel Core i5 3550, i5 3570K, and i7 3770K. Which of the three should I go for? Please help!
Out of the three CPUs you've got to choose from, the Intel Core i7 3770K is definitely the best one, and will give you the most performance out of your GPU, whether it's NVIDIA's GeForce GTX 670 or AMD's Radeon HD 7970. It's the highest-end Core i7 in the LGA 1155 socket, and would definitely pull some great performance from your GPU.
I have a MSI Radeon HD 6970 Lightning with a 3x1080p monitor setup, should I add another HD 6970 or grab a GTX 680 2GB or 4GB card?
First off, this all depends on the games you play and the frame rate and detail you like to maintain. I would suggest simply adding in another HD 6970 for some CrossFire action, as you'd have the power of two GPUs versus the single GeForce GTX 680. But upgrading to a GTX 680 would give you a better upgrade path for the future.
It's a hard call to steer you toward your best choice. If you were planning to go toward a multi-GPU solution in the end, then I would suggest the GTX 680. If you buy a 4GB card now, you can always pick up another 4GB card in the future, and that would set you pretty straight for years to come on 3x1080p screens.
Either option is going to give you a great-performing rig. One thing I would recommend is to check your power supply unit (PSU) and make sure, if you do go down the route of another HD 6970, that you have at least a 650W brand-name PSU (such as a Corsair). If you have a 350-500W unit, stick with a single card, and maybe upgrade to the single GTX 680.
Hello-- I have two computers, HP sr1850nx and a1310. First uses PC3200 SDRAM (supports Dual channel DDR). The other uses DDR PC3200, DDR non-ECC. Are these memory sticks interchangeable? Thanks in advance.
Definitely. The memory between your two HP-branded PCs can be swapped around. If you're looking at upgrading the systems, you can just buy some DDR400/PC3200 memory. The confusion comes from one being listed as "non-ECC" and the other as "SDRAM", but both systems take the same DDR400/PC3200 standard.
As always, just be careful when installing memory, and make sure the notch lines up when seating the module. If you run into any problems, feel free to shoot me an e-mail or submit another question.
I have an older ASUS P4C800-E Deluxe motherboard, and want to know what the yellow RCA port on the back does
The yellow RCA port on the back of your ASUS P4C800-E Deluxe motherboard is an S/PDIF port, used solely for digital audio. These ports are commonly found on motherboards and act as a digital output for audio. This means you can pass digital audio, such as Dolby Digital 5.1 or DTS tracks from a DVD, through the S/PDIF port.
You can plug an appropriate coaxial cable into the port and run it to an audio/video (AV) receiver. The AV receiver then decodes what is passed through the port (the audio track) and outputs it as intended from your source (DTS, DD 5.1).
If you'd like to read up more on this, check out the Wikipedia page on S/PDIF.
Hi! I'm building my first computer and I was wondering if an MSI Radeon HD 7770 video card is compatible with an ASUS M5A97 motherboard.
Yes, an MSI Radeon HD 7770 will definitely work on your ASUS M5A97 motherboard. The ASUS M5A97 motherboard is actually CrossFire-compatible. This means you could put another HD 7770 into the motherboard in the future, and enjoy close to double the performance of the single HD 7770.
The ASUS M5A97 motherboard has PCI-Express x16 slots, which means you can put virtually any video card on the market into it. And being CrossFire-compatible gives you a great upgrade path going into the future.
I am building a new desktop PC which will have a 2-way SLI setup on it, and was wondering whether it is worth getting a mobo that can run these two cards at x16 each, or just x8 each. I am a casual gamer and wanted to know if it will make much difference in gaming if the cards run at x16 or x8. The cards at the moment are MSI's GTX 560 Ti 2GB.
Thank you for your time, cheers!
This is another great question, Severus! I wouldn't really recommend specifically going out and buying a board capable of running multiple GPUs in x16 mode, as the performance benefits really aren't that great.
If we were talking PCIe 3.0 hardware and some multi-monitor, multi-GPU gaming, maybe... but even then, we're talking less than 10-percent improvements across the board, and more like 4- to 5-percent. If you haven't already bought the GPUs (you say you're buying a new desktop PC), I would save the money, get a board that doesn't cost that much (all boards have at least a single x16 port), and spend the money from the two GPUs on one single, faster GPU.
This way, you could get something like a ZOTAC GeForce GTX 670 for just $399.99 from Newegg. GeForce GTX 560 Ti's are still around $170 each, meaning the single GPU is only a bit more, but you gain more performance from a single GPU and fewer troubles with scaling and game compatibility, as well as not needing to worry about SLI for now.
Then, when you're ready for SLI, your motherboard will handle dual x8 ports, and you can throw another GTX 670 in it for some serious horsepower.
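For the curious, the raw bandwidth gap between x8 and x16 can be worked out from the published PCIe spec figures. A small Python sketch (the per-lane rates are the standard per-direction numbers; the small real-world percentage differences quoted above are empirical, not something this arithmetic predicts):

```python
# Per-lane, per-direction usable bandwidth in MB/s:
# PCIe 2.0 runs at 5 GT/s with 8b/10b encoding   -> 500 MB/s per lane
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
LANE_MBPS = {"2.0": 500.0, "3.0": 984.6}

def slot_bandwidth_gbps(gen, lanes):
    """Approximate one-direction slot bandwidth in GB/s."""
    return LANE_MBPS[gen] * lanes / 1000.0

for gen in ("2.0", "3.0"):
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: ~{slot_bandwidth_gbps(gen, lanes):.1f} GB/s")
```

Even at x8, a PCIe 2.0 slot offers around 4 GB/s each way, which a single GTX 560 Ti rarely saturates in games; that's why halving the lanes barely moves the frame rate.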
The image is courtesy of Tech Support Guy; I thought it explained the bandwidth differences well.
Do you have to have a 120Hz monitor (as opposed to a 60Hz) for 3D gaming?
Great question, Chris! This is something that NVIDIA and AMD don't really specify on their boxes, and it can be a bit confusing when choosing not only a new GPU, but a new TV or monitor. First up, the two companies use different forms of 3D technology: on NVIDIA's side we have 3D Vision, while AMD uses its HD3D technology.
In order to use NVIDIA's 3D Vision technology, yes, you will require a 120Hz monitor. This is because 3D Vision uses an active-shutter system: the display alternates left- and right-eye frames, so each lens of the 3D Vision glasses sees 60Hz, giving a better 3D experience, not just for the game itself, but for your eyes. NVIDIA's 3D Vision also requires 3D Vision-compatible displays, rated at 120Hz.
AMD's HD3D, on the other hand, is a much more open environment. AMD doesn't restrict 3D to specific "AMD HD3D-compatible" screens, which makes it a viable alternative to NVIDIA's 3D Vision. AMD still needs a 120Hz display, but the glasses can come from the monitor manufacturer rather than being specific AMD HD3D models.
There are some monitors that will reportedly do 3D at 1080p but only 30Hz per eye (60Hz total, the screen's limit), but this would result in a sub-par experience, not just in frame rate; the 3D would most likely not look good at all. Our recommendation is to get a proper 120Hz screen. Even if you get a 3D Vision-compatible monitor, you're not locked into NVIDIA's 3D; you can always use AMD's. It just gives you a monitor with more life in it, so you can upgrade in 1-2 years' time and not have to worry.
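The per-eye arithmetic behind those refresh-rate requirements can be sketched like this (a simple illustration of active-shutter timing, not vendor-specific code):

```python
# Active-shutter 3D alternates left- and right-eye frames, so each
# eye only sees half the panel's refresh rate.

def per_eye_hz(panel_hz):
    """Refresh rate each eye actually sees with active-shutter glasses."""
    return panel_hz / 2

print(per_eye_hz(120))  # 60.0 per eye -- smooth, what 3D Vision requires
print(per_eye_hz(60))   # 30.0 per eye -- flickery, the sub-par case above
```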
I hope that's helped you!