GeForce 6200 - Introduction
nVidia and ATI have been locked in a heated battle for the graphics crown for the past two years. ATI had it wrapped up for nearly a year with the Radeon 9xxx series VPU, as nVidia made what we like to think of as too big a jump when it released the FX series GPU. That started nVidia off on the wrong track with the GeForce FX 5800 Ultra. Its sub-standard performance and vacuum-cleaner sounding heatsink certainly turned users away from picking one up for their systems. The NV30 core just didn't do it for nVidia; some improvements were made to reduce noise and heat and increase performance, but the simple fact was that the NV3x cores had a problem keeping up with the ATI Radeon based graphics cards.

nVidia has now moved on from the FX series of GPU, removing the FX name completely - a new name for a new line. There are currently three different products in the GeForce 6 series, each having one or two different core variations. The high-end is comprised of the GeForce 6800 GPU with the 6800 Ultra at the top, followed by the 6800GT, 6800 and 6800LE. The mid-range consists of the 6600 line, with the 6600GT at the top of the list followed by the 6600. The latest GPU to arrive in the entry level is the 6200, with only the 6200 core itself in this class and a possible LE or GT version to come in the future.

Today we are taking a look at the GeForce 6200 from nVidia to see what it brings to the table to push the PCX5900 from the top of the entry level PC platform, and also how it compares against the ATI Radeon X300.
GeForce 6200 - 6200 in Detail

nVidia GeForce 6200
The nVidia GeForce 6 series started out with the original 6800 line, designed to take on the enthusiast segment. Shortly after the NV40 GPU made it to market, nVidia announced that the GeForce 6 series would also be available in the value and mid-range segments. The GeForce 6200 uses the core labelled NV43, a new generation chip designed around the NV4x architecture.
In the table below you can see the differences between each of the different cards:
nVidia has chosen TSMC's 0.11 micron process to produce the NV43 core, and it looks set to be used for quite some time, as its reliability is now among the best for the production of GPU dies, as is its heat dissipation capacity. As for die dimensions, it measures roughly 12mm by 13mm, making it the same size as the NV43 used on the 6600 cards. nVidia hasn't shrunk the die at all, which is surprising, as the reduced pipelines and vertex engines would allow a more compact die.

When it comes to the two versions of the 6200 series, there will be only one small change. The 6200 will be the top performer and the 6200LE will come out sometime later on. The LE will offer the same memory size, core and memory clocks; however, the memory bus will be reduced from 128-bit to 64-bit. This will seriously cripple the card for gaming, so if you want a high quality budget card, the 6200 core will need to be your pick. On the high performance gaming note, you will notice there are no SLI ports on the 6200 series. nVidia has elected to skip SLI support here, leaving the more expensive 6600GT and above to fill that void.

When it comes down to it, the 6200 is designed to take on ATI's X300 series VPU in its role as the PCI Express budget card, which brings us to the bus interface. nVidia has no plans at this point to introduce the 6200 as an AGP component; however, that isn't stopping vendors like Gigabyte, ASUS, ABIT or the rest of the nVidia partners from adding the HSI bridge to a card and producing an AGP compatible 6200. nVidia's plan for the AGP bus in the budget range is to leave it to the GeForce FX range of cards. With AGP set to die off next year and PCI Express now gaining popularity on both AMD and Intel systems, it won't be long before we see the death of the old and faithful AGP bus.
GeForce 6200 - The Card and Overclocking

nVidia GeForce 6200 Reference Card
The nVidia reference card we were sent for evaluation represents what we can expect to see from vendors when the retail units arrive on the market, which should be soon if not already depending on your location.
The PCB design is as compact as nVidia was able to make it. The 6200 core is remarkably small and draws less power than most of the GeForce 6 series cards - in fact, power requirements are low enough that even an AGP version would be able to get away without a Molex plug on the card. This is no concern here in any case, since PCI Express can deliver more power through the slot than the AGP spec allows. The 128-bit 6200 comes with an active heatsink and fan on the chip itself, without any cooling on the memory modules, and the fan runs almost silent. The 6200LE should be designed to run passively at lower clock speeds for silent PC operation; even on the 128-bit 6200, a large passive cooler could replace the active unit, as the GPU doesn't run that hot. As you can also see, this card is PCI Express based. nVidia has no plans at this point to release an AGP version. This doesn't mean we won't see them, however, as it simply requires adding the HSI PCI Express bridge to the card to give AGP compatibility.
Under the heatsink we have the core itself. The core is manufactured on the 0.11um process in a Flip Chip Ball Grid Array, or FC-BGA for short. Intel used the Flip Chip process on the Pentium III Socket 370 CPU to reduce thermal loss and give a much better cooling profile, and nVidia has used it for quite some time now with its GeForce 6 series GPUs.

The memory is Samsung DDR rated at 3.6ns in TSOP-II packaging. It doesn't clock as high as the memory on the 6600 cards, so BGA memory isn't needed, nor is any kind of cooling. The reference cards come with 128MB onboard in eight modules covering the front of the card, with none on the back, minimizing the amount of PCB required. Clocked at 550MHz DDR, this gives up to 8.8GB/s of memory bandwidth.
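That bandwidth figure is simple to sanity-check: peak bandwidth is the effective (DDR) clock multiplied by the bus width in bytes. A quick sketch below; the 64-bit number is our own extrapolation for the 6200LE's halved bus, not an official figure:

```python
def memory_bandwidth_gbs(effective_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_mhz * 1_000_000 * (bus_bits / 8) / 1e9

# GeForce 6200: 275MHz memory clock, 550MHz effective (DDR), 128-bit bus
print(memory_bandwidth_gbs(550, 128))  # 8.8 GB/s - matches the quoted figure

# 6200LE: same clocks on a 64-bit bus
print(memory_bandwidth_gbs(550, 64))   # 4.4 GB/s
```

It also makes plain why the 6200LE's 64-bit bus is such a handicap: the same memory chips deliver exactly half the bandwidth.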
Finally we have a look at the external interface. nVidia has placed the two monitor connectors together, with the S-Video port at the top. This has been the standard layout for nVidia cards for some time and looks to remain so for some time to come.

Overclocking
When it came to overclocking, the 6200 reference card was not the best overclocking unit in the world. We managed to get from the default 300MHz core to 332MHz using the stock cooler. Memory was somewhat of a disappointment: from 275MHz (550MHz DDR) we only managed to push it to 300MHz (600MHz DDR), which is not the best in the world, let me assure you. Hopefully retail cards will have more refined cores and better memory modules to give that extra boost.
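Those results work out to fairly modest percentage gains; a quick check using the clocks quoted above:

```python
def oc_gain_pct(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage clock increase from stock to overclocked speed."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

core_gain = oc_gain_pct(300, 332)   # core: 300MHz -> 332MHz
mem_gain = oc_gain_pct(275, 300)    # memory: 275MHz -> 300MHz (550 -> 600MHz DDR)
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")  # core +10.7%, memory +9.1%
```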
GeForce 6200 - Benchmarks - Test System Setup and FutureMark

Test System Setup

Processor: Intel Pentium 560 (800MHz FSB) (Supplied by Spectrum Communications)
Memory: 2x 512MB DDR2-533 Micron
Hard Disk: 2x Maxtor Maxline III 250GB RAID (RAID on ICH5R and ICH6R)
Motherboard: MSI 925X Neo (Supplied by MSI Australia)
Operating System: Microsoft Windows XP Professional SP2
Drivers: Microsoft DX9c, ATI Catalyst 4.10 and nVidia 66.93

3DMark2001 SE

Version and / or Patch Used: 330
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark2001/
Buy It Here
3DMark2001 SE is a part of the popular 3DMark series. By combining DirectX 8.1 support with completely new graphics (including the GeForce4), it continues to provide benchmark results that empower you to make informed hardware assessments.
Here we see the older PCX5900 and the 6200 against each other. This is what the 6200 is aimed at replacing in the PCI Express low-end market, as well as taking on the X300 as the low-end king.

3DMark03

Version and / or Patch Used: 340
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark03/
Buy It Here
3DMark03 is the latest version of the highly favored 3DMark series. By combining full DirectX 9.0 support with completely new tests and graphics, 3DMark03 continues the legacy of being the industry standard benchmark.

Please Note: Due to recent events with the 3DMark03 series, we are adding results purely for those who are still in favor of 3DMark03. These results should not be taken too seriously and are only added for interest's sake.
As we move into the DX9 tests we see the 6200 increase its gap over the PCX5900 as well as ATI's Radeon X300.
GeForce 6200 - Benchmarks - Doom 3

Doom 3

Version and / or Patch Used: 1.0
Timedemo or Level Used: Custom Timedemo
Developer Homepage: http://www.idsoftware.com
Product Homepage: http://www.doom3.com
Buy It Here
Doom 3 is the latest game to hit our test lab and is one of the most intensive games to date. With our own custom timedemo we are able to give a realistic rating of what kind of FPS you will be achieving. For more information on benchmarking Doom 3 we recommend you check out our extensive article regarding it here.
Doom 3 puts the 6200 well in front of the PCX5900 and the X300. Though not really playable at 1600x1200, it is still more playable than either of the other two.
GeForce 6200 - Benchmarks - Far Cry

Far Cry

Version and / or Patch Used: 1.0
Timedemo or Level Used: Default
Developer Homepage: http://www.crytek.com
Product Homepage: http://www.farcrygame.com
Buy It Here
There is no denying that Far Cry is currently one of the most graphically intensive games on the market. Utilizing PS2.0 technology (the latest versions support Shader Model 3.0 with DX9c) and offering an exceptional visual experience, it makes even some of the faster graphics cards struggle.
Here again the 6200 manages to take the lead.
GeForce 6200 - Benchmarks - Halo PC

Halo PC

Version and / or Patch Used: 1.0
Timedemo or Level Used: Default
Developer Homepage: http://www.bungie.net
Product Homepage: http://www.bungie.net/Games/HaloPC/
Buy It Here
Though we have used Halo in a couple of benchmarks in the past, it has now found a permanent place in our benchmark suite. This is simply due to its support for the latest DirectX 9 APIs, which puts more stress on the system to determine the best of the best.
Halo, being somewhat older, doesn't show as much of a gap; however, the 6200 manages to keep the lead.
GeForce 6200 - Benchmarks - High Quality

High Quality
All tests were run with 4x FSAA and 8x Anisotropic Filtering enabled and the screen resolution set to 1024x768. When running these aggressive detail settings at 1600x1200, games really do become unplayable, and people who are after image quality will have to drop the resolution back down to reach playable levels. 1024x768 was chosen as it is still the preferred playable resolution with these settings enabled.
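The resolution jump alone explains much of that: 1600x1200 has nearly two and a half times the pixels of 1024x768, and FSAA multiplies the shaded samples on top of that. A rough sketch (a deliberately simplified model that ignores texture filtering and other per-frame costs):

```python
def samples_per_frame(width: int, height: int, fsaaa_level: int = 1) -> int:
    """Rough count of samples the card must shade per frame with supersampled FSAA."""
    return width * height * fsaaa_level

low = samples_per_frame(1024, 768, 4)    # our chosen test resolution with 4x FSAA
high = samples_per_frame(1600, 1200, 4)  # the unplayable setting
print(f"{high / low:.2f}x the per-frame work at 1600x1200")  # 2.44x
```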
When running with high quality imagery we see that the 6200 has the edge, with better memory bandwidth as well as a much more modern GPU architecture than the FX or the X300.
Doom 3 especially shows the power of the 6200 over the competitors when running higher quality settings.
GeForce 6200 - Final Thoughts

Final Thoughts
nVidia really hasn't had a very good low-end card since the GeForce 4 MX series, several years ago now. That was when nVidia showed that a low budget card could still give reasonable gaming performance, and we are pleased to report that it has reasserted that belief with the GeForce 6200.

PCI Express is set to be the graphics interface for some time to come. With expansion possibilities beyond the x16 slot, we aren't going to see this technology disappear anytime soon. The lack of budget products for the architecture has kept it out of the value segment - until now.

nVidia has used its existing GeForce 6 technology, which brought it back from the brink the FX series cards had put it on, and shown us that with the right optimisations and settings, good performance isn't beyond the budget user who doesn't have all the money in the world to buy the latest and greatest.

Overall, the GeForce 6200 is something nVidia really needed to stand out with, as its budget lineup had simply died off along with the GeForce 4 MX. The 6200 delivers good performance, wiping out the ATI Radeon X300 in our benchmarks, along with compatibility with the PCI Express architecture, a 128-bit memory bus and DDR SDRAM support. You will find these cards coming to a store near you very shortly.

We have awarded nVidia our TweakTown "MUST HAVE" Best Value Award for the GeForce 6200.