
Helping with tech questions - TweakTown's Ask the Experts - Page 1


What is the best video card upgrade for my compact ThinkCentre PC?

Question by Tam from Philippines | Answered by Tyler Bernath | Video Cards | Posted: 12 hours, 10 mins ago

What is the best graphics card, and I mean the best "souped-up," "melt-my-face" graphics card I can slap into my compact case, the Lenovo ThinkCentre M90P?

Right now, it has an AFOX 2GB GT730 DDR3 128-bit low-profile graphics card, and it appears to be worn out, causing weird hieroglyphic-like glitches to appear on my screen from time to time. So, a replacement is in order. I just want to know: is my current graphics card the best my PC can handle, or can it take something better, or even the best? Please help, GPU experts. I can't even play PUBG without Lego-like glitches falling all over my screen.

Hi Tam,

I would start by checking your HDMI or DisplayPort cable to rule it out as the source of the graphics corruption you are seeing. If you are set on upgrading your GPU, note that the ThinkCentre machine you mentioned requires half-height (low-profile) graphics cards.


Doing some looking online, you can get some decent half-height GPUs with no external power requirement, the best being the GeForce GTX 1050 Ti. ASUS, GIGABYTE, and MSI all have low-profile options there.

A second, slightly less powerful option would be the Radeon RX 560, with MSI being the only vendor I have seen offering that option.

Bent CPU pins on my AMD Ryzen 2600 after removing the cooler - what did I do wrong?

Question by Sammy from United States | Answered by Tyler Bernath | Cases, Cooling & PSU | Posted: 4 days, 11 hours ago

I just finished replacing my CPU, going from an AMD Ryzen 2600 to a Ryzen 3800X. I've always known AMD platforms to be a pain when removing the cooler, normally resulting in bent CPU pins.

So, I decided to try the twist method. To my luck, I was twisting and pulling, and a few seconds later the CPU popped out of the socket with two rows of bent pins! What did I do wrong? I was using AS5.

Hi Sammy,

You are correct - AMD systems have always had issues with bent pins when removing heat sinks, but it's really a case of user error combined with thick thermal paste gluing the CPU to the cooler.


The best way I have found to remove a heat sink on any AMD platform, including AM4, is to first run the system for a few minutes so the paste warms up and softens, then remove all mounting hardware and fans and twist the cooler while applying slight pressure towards the socket.

This should help break the adhesive stiction of the thermal paste. In the future, I would recommend a paste that's a little thinner, like Noctua NT-H1 or Arctic MX-4, and not so peanut-buttery like AS5.

Budget gaming monitor selection, BenQ GW2283 or LG 22MK430H?

Question by Lee from Malaysia | Answered by Tyler Bernath | Displays & Projectors | Posted: 1 week, 1 day ago


I'm looking for a gaming monitor under $85 and have found two which I think are good, but I'm unsure how to compare them.

The first model is the BenQ GW2283 and the second is the LG 22MK430H. Please do help me with this. I usually play competitive games like Rainbow Six Siege and have been wanting to find a monitor that's good for them. Note - I'm using a gaming laptop right now to play games (ROG GL503GE).

Hi Lee,

No problem!

I'm guessing you are wanting to add this as an external main screen paired to the ASUS notebook you mentioned above. If so, it would be beneficial to know what configuration the notebook has; the defaults for your model appear to be a Core i7-8750H CPU and a GeForce GTX 1050 Ti 4GB GPU.


Both monitors are 1080p native with a 5ms GTG response time and share many of the same specs all the way down the list. The BenQ does have two HDMI inputs, whereas the LG has one HDMI and a VGA port.

For gaming, you are going to want to look at refresh rate, response time, and resolution. The LG, for me, looks like the better solution simply because it supports FreeSync at 75Hz. You don't need an AMD GPU to take advantage of the higher refresh rate either; even without FreeSync, you can manually select 75Hz in Windows' display settings.
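To put the 75Hz advantage in perspective, here is a quick back-of-the-envelope calculation of frame intervals (an illustration, not a benchmark):

```python
# Time between frames (in milliseconds) at each refresh rate:
# a higher refresh rate means each new frame reaches your eyes sooner.
for hz in (60, 75):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms per frame")
# 60Hz -> 16.7 ms per frame
# 75Hz -> 13.3 ms per frame
```

Roughly 3.3ms shaved off every frame is a small but real edge in fast competitive titles like Siege.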

Having trouble getting four DDR4 DIMMs running in XMP on X570

Question by Etutam from Spain | Answered by Tyler Bernath | RAM | Posted: 1 week, 5 days ago

I just bought the new X570 Aorus Elite motherboard with a Ryzen 2700. I have two kits of RAM, each 2x4GB 3200MHz; both are Corsair Vengeance LPX, but I bought them separately. If I enable the XMP profile in the BIOS, I get random blue screens. I set the timings manually (16-18-18-18-54) with the voltage at 1.35V, but I'm still having the same problems, and games randomly crash too. I would like any tips to get them more stable - or should I buy new kits?

Hi Etutam,

I think we can help with this. RAM stability with four sticks has always been a little harder to achieve, even going back to older AMD and Intel platforms. First things first: are the kits exact matches? You mention they are both Vengeance LPX at the same speed, but Corsair releases different versions with each component change they make. I would check that all four sticks are the same model and version before pushing further with XMP.


With your current configuration, you can try increasing the memory and SoC voltages to see if that helps. Don't go too far: I would take DRAM to 1.45V max and SoC to 1.1V at the highest. You can also try relaxing the timings even further to get the two kits running at 3200MHz together, then gradually tighten them one at a time, checking for stability.

If you don't want to deal with, or don't have the time to play with, memory settings, the easy answer is to sell what you currently have and get a Ryzen-tested kit of memory.

Corsair SP120 fan RGB is not working in my Obsidian 500D SE chassis

Question by James from Australia | Answered by Tyler Bernath | Cases, Cooling & PSU | Posted: Oct 3, 2019 @ 23:22 CDT


I currently have a Corsair Obsidian 500D SE chassis outfitted with the included 3x LL120 RGB fans in the front and want to add a rear exhaust fan. I purchased an SP120 to do this, but after installing it, the fan spins while the RGB doesn't work. What gives?

Hi James,

You have come to the right place! Doing some research on this, it appears the SP120 and LL120 use different RGB controllers: the SP120 uses the "Core" controller, while the LL120, which has smoother RGB effects, uses the "Pro" controller. You cannot mix and match the two fan types on the same hub in the Obsidian 500D SE, and even if you could, the lighting effects would not match, as the SP120 is not as capable in terms of RGB effects as the LL120.


To remedy this, you have two options. The first is to purchase a new LL120 to replace the SP120 so you can use the same RGB hub (and lighting effects).

The second is to purchase a secondary hub along with more SP120 fans. This leaves room for expansion later, but the LED effects of the SP120 don't match those of the LL120, so it may give your build an inconsistent look.

Can I upgrade my ASUS PC with more RAM and GeForce RTX 2080 SUPER?

Question by Nathan from United States | Answered by Tyler Bernath | Video Cards | Posted: Sep 28, 2019 @ 14:30 CDT


I have a G20CB prebuilt from ASUS, and I've been wanting to see what I can do to upgrade it. I'm looking at changing out the GPU and RAM, as I currently have a single 16GB stick of memory and a GTX 1080. I would like to upgrade to at least 16GB of dual-channel memory and possibly the GeForce RTX 2080 SUPER so I can hit 60FPS consistently at 1440p. Is my PC compatible with a 2080 SUPER, and if so, do I have to buy an ASUS GPU? Also, what kind of RAM is compatible with my system?


Hi Nathan,

I think we can help here! Starting with the memory situation, your best bet is to open up the side of the chassis and check what is currently installed. Make sure you ground yourself to the chassis first to discharge any static, then pull the stick and write down its model number.


Then we can go searching for a matching stick that will get you 32GB of dual-channel memory for the cost of a 16GB kit.

For the GPU: first, you do not have to match brands (i.e., ASUS), but I really can't recommend the NVIDIA GeForce RTX 2080 SUPER until I know your power supply can handle it. The fact is, the RTX 2080 SUPER on average uses 80W more at load than the GeForce GTX 1080.

So, while you have your system open checking the model number of your memory, see if you can find the power rating of your PSU; there should be a label on the side of the unit. If it's at least 650-700W, you should be fine upgrading your GPU, but it's always worth considering a new PSU for the extra headroom.
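As a rough sanity check, you can tally the big power draws yourself. A minimal sketch with assumed figures - the 250W is the RTX 2080 SUPER's rated board power, while the CPU and "everything else" numbers are ballpark assumptions, not measurements of your machine:

```python
# Back-of-the-envelope system power estimate (all figures approximate).
loads_watts = {
    "RTX 2080 SUPER": 250,           # NVIDIA's rated board power
    "CPU under load": 95,            # assumed typical desktop CPU draw
    "board, RAM, drives, fans": 75,  # rough allowance for everything else
}

total = sum(loads_watts.values())  # estimated peak system draw
recommended = total * 1.3          # ~30% headroom for transient spikes
print(f"estimated draw: {total}W, suggested PSU: {recommended:.0f}W+")
# estimated draw: 420W, suggested PSU: 546W+
```

Under these assumptions the build lands comfortably within a 650W unit, which lines up with the recommendation above.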

Should I go with the AMD Ryzen 3700X or 2700X? Is it worth the price?

Question by Dean from United States | Answered by Tyler Bernath | CPUs, Chipsets & SoCs | Posted: Sep 25, 2019 @ 0:54 CDT


I'm building a new machine and decided on the AMD Ryzen 3700X. However, I just looked and the 2700X is now $240 vs. the 3700X at $330.

I know the 2700X is enough for my computing needs; however, I wonder if getting the 3700X will set me up better for the future, since I do some video editing work. Is it worth the almost $100 price difference?

Hi Dean,

Thanks for reaching out! I have a few additional questions: will you be budgeting for an X570 motherboard to take advantage of the benefits the 3700X can offer? Is this a budget build, and if so, where will you cut costs?


If you are going all out for the highest performance possible, then sure, the 3700X is what you want, but keep in mind many are already saying PCIe 4.0 will be short-lived, so it may be wise to hold off and wait for the next generation.

If you are dead set on this new system, then let's compare the two chips. The 2700X is an 8-core/16-thread CPU with a 3.7GHz base clock that boosts to around 4.3GHz on average. The 3700X is also 8-core/16-thread, with a 3.6GHz base clock but boosting to around 4.4GHz on average, and its Zen 2 architecture brings roughly 15% higher IPC, so the per-core speed advantage is larger than the clocks alone suggest. The 3700X also has the advantage in TDP at 65W, whereas the 2700X is rated at 105W.
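To frame the "is it worth it" question in numbers, here is a quick comparison of the price premium against the performance gain. The prices come from the question; the ~15% per-core uplift is an assumed round figure, not a benchmark result:

```python
# Rough price-vs-performance comparison (approximate figures).
price_2700x, price_3700x = 240, 330  # street prices from the question
perf_gain = 0.15                     # assumed Zen 2 per-core uplift

premium = (price_3700x - price_2700x) / price_2700x
print(f"price premium: {premium:.0%}, per-core gain: ~{perf_gain:.0%}")
```

Paying roughly 38% more for roughly 15% more per-core speed only pencils out if the lower 65W TDP, the X570 platform, or future-proofing for video editing matter to you.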

Can the MSI B450M Mortar motherboard support 32GB RAM at 2666MHz?

Question by David from Malaysia | Answered by Tyler Bernath | RAM | Posted: Sep 20, 2019 @ 13:31 CDT

Hello ATE,

Can the MSI B450M Mortar motherboard support four sticks of 8GB of DDR4-2666 memory (for a total 32GB) when paired with a Ryzen 5 3600 CPU?

Hi David,

Thanks for the question. If we head over to MSI's website, look up the motherboard mentioned above, and go through the specifications, you will find it supports DDR4-1866 through DDR4-2666 per JEDEC and up to DDR4-3466 via overclocking, so you should have no issue running your 2666MHz memory.


As far as capacity goes, things can get a bit tricky, but according to the motherboard specifications, it has four slots and supports a maximum of 64GB. That means it can handle the 8GB per slot you want to run, and even up to 16GB per slot if you ever want to upgrade.

That being said, for the greatest compatibility with four sticks, I would suggest referring to the supported memory list, if available, as that will give you the specific memory kits MSI has tested with the board. In addition, you will need to run a BIOS later than 7B89v18 for the Ryzen 5 3600 to be supported, and it does look like MSI has been improving memory compatibility as of late.

Why isn't my new secondary drive SSD being detected in Windows 10?

Question by John from United States | Answered by Tyler Bernath | Storage | Posted: Sep 16, 2019 @ 13:22 CDT

Last week I built my first PC with a GIGABYTE X570 Aorus motherboard. I was able to get the machine to boot and install Windows 10 on my Corsair MP600 along with all the available drivers, but I still have an issue: my Crucial MX500 SSD isn't being detected in Windows, though it does show up in the BIOS.

What am I missing?

Hi John,

Thanks for reaching out! We can certainly help with this issue. First, I'm only guessing, but I will assume this drive is new, fresh out of the box. If so, the likely culprit is that the drive hasn't been initialized.

To initialize the drive, boot into Windows 10, right-click on the Start Menu, and go to Disk Management. From here, you will likely see a box pop up asking you to initialize your drive with an MBR or GPT partition table.


Now, for secondary storage, it doesn't really matter which you use, but there are caveats. MBR can't handle drives over 2TB and is limited to four primary partitions; GPT has neither restriction, so I default to it myself.

Once you choose a partition style, create a volume by right-clicking the drive's unallocated space and selecting New Simple Volume; a wizard will open, and you can hit Next several times until you see Finish. Job complete - your drive should now be usable.

Do I need to buy matching M.2 SSD drives for use on the same board?

Question by Gavin from United Kingdom (Great Britain) | Answered by Tyler Bernath | Storage | Posted: Sep 12, 2019 @ 11:05 CDT

I'm currently running an ASUS Prime Z390 with a 1TB Samsung Evo Plus in the top M.2 slot.

My question is if I would like to add an additional M.2 drive, should it be the same make, model, and size?

Hi Gavin,

Thanks for the question! As you may or may not know, DRAM has additional settings that affect its performance, like timings and voltage.

When you mix different speeds of DRAM, JEDEC defaults will allow the sticks to work together in most cases at a fallback timing and voltage, with the downside that you won't be able to run XMP or DOCP profiles. This is why many will recommend you buy your memory in "kit" form.


With M.2 SSDs, you don't need to worry much about matching drives unless you are planning to set up a glorious RAID array with your EVO Plus. I would, however, stick to known brands with a solid warranty, like the Samsung, Western Digital, and SanDisk solutions.

One thing you do need to pay attention to is your motherboard's M.2 configuration. In your case, the bottom M.2 slot comes off the Z390 chipset and supports both SATA and NVMe M.2 drives.