TweakTown

nVidia GeForce 6600GT goes AGP

A little while ago nVidia launched their GeForce 6600GT graphics card using the PCI Express interface. The GPU received great reviews around the web for impressive performance for the price but lacked one major thing - AGP support for those who had not yet made the jump to the new platform. nVidia has now released the AGP version of their 6600GT graphics card and today we take a close look at the product and see how it differs from the PCI Express version.
@TweakTown
Published Thu, Dec 16 2004 11:00 PM CST   |   Updated Tue, Apr 7 2020 12:26 PM CDT
Manufacturer: none

GeForce 6600GT AGP - Introduction

Introduction

It is without doubt that the GeForce 6 family of products from nVidia now covers a wide range of product segments, from entry level all the way up to king of the hill. Currently there are three major families, with sub-products under each catering to a particular segment. In the direct line we have the newly released 6200, the 6600 and the 6800 GPUs, all with different specifications but sharing the same core technology - DirectX 9.0c compliance, a Pixel Shader 3.0 engine and support for the latest PCI-E interface.

Before we go directly into the topic of today's article, let us explore the GeForce 6 family. At the budget end of the line sits the 6200. At the moment the 6200 ships with a 128-bit memory interface and support for a 128MB frame buffer, a 4 Pixel Shader engine and 3 Vertex engines. Below it nVidia is expected to slot the 6200LE. This new version drops to a slower 64-bit memory interface, reducing price and performance to a very budget-oriented card capable of taking on the X300SE from ATI. This series is set to come out only on the PCI-Express x16 bus, with no plans to migrate into the AGP realm, as in that section the FX5200 is taking care of business for nVidia right now.

In the middle we have the 6600 series with two variants. First is the 6600 core, which supports a 128MB 128-bit frame buffer; it is the lowest clocked unit and available in PCI-Express only. The 6600GT is the top of the 6600 line, adding support for SLI (dual graphics cards, PCI-Express only) and increasing the core and memory clocks. PCI Express and now AGP variants are available, the latter being what we will be talking about today.

The final core is the 6800, which includes three variants. First off is the standard 6800, which uses 12 Pixel and 6 Vertex engines and supports up to 256MB of DDR SDRAM on a 256-bit bus, but SLI is not supported. The 6800GT is the middle of the line. This card increases core and memory clocks, adds support for SLI on PCI Express models and increases the Pixel count to 16. The 6800 Ultra is the top dog for nVidia at the moment. It shares the same features as the 6800GT but increases the maximum amount of supported memory to 512MB (though no card has been released with this much memory yet) and pushes core and memory clocks higher still.

Today we are taking a close look at the nVidia GeForce 6600GT in AGP form and comparing it to its PCI Express x16 brother (which we took a closer look at here earlier this month) to see just what it has to offer and what has been changed.

GeForce 6600GT AGP - A Closer Look

6600GT AGP

The nVidia 6600GT GPU is one of the most popular GPUs currently available. Its price point is something of a sweet spot for its performance, it supports SLI when used on the PCI Express bus (in theory almost doubling performance), and it is now available on both buses - it certainly has its advantages.

We have already taken a look at the 6600GT in some detail in our previous review comparing it to the X700 graphics card, so we won't go over that ground again. Today we will be looking at the differences (if any) that exist between the PCI Express and the AGP versions.


- The Cards and their Differences

For our purposes we will be comparing the reference nVidia 6600GT AGP against the Gigabyte 6600GT PCI Express x16 to highlight some of the differences.



First off the layout of the cards differs somewhat due to the use of the HSI bridge that nVidia has used to implement AGP. The reference card is equipped with two heatsink units. The unit covering the GPU is a standard retail design heatsink with some Doom 3 imagery and a smaller unit is used to cover the HSI bridge.

The HSI bridge made its first (somewhat questionable, to some) appearance on nVidia's PCX graphics cards many months ago. There its job was the reverse of what we are seeing today, but the theory is exactly the same. The PCX cards were simply AGP-designed GPUs bridged to work on the PCI Express interface, which bought nVidia more time to develop native PCI Express chips; on those cards the HSI converted PCI Express x16 signals coming from the motherboard chipset into AGP 8x signals the GPU could understand. This time the process is reversed. The 6600GT GPU is designed with only a native PCI Express x16 bus, so the HSI bridge converts the AGP 8x signals from the motherboard chipset into PCI Express x16 signals for the GPU.
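To put the two buses the HSI translates between into perspective, here is a rough back-of-the-envelope bandwidth comparison. The figures are the published theoretical maxima for AGP 8x and first-generation PCI Express, not measured numbers:

```python
# Peak theoretical bandwidth of the two buses the HSI bridge translates
# between. These are spec-sheet maxima, not real-world throughput.

def agp_8x_bandwidth_mb_s():
    # AGP: 66 MHz base clock, 8 transfers per clock in 8x mode, 32-bit bus
    return 66 * 8 * 32 // 8  # bits -> bytes, result in MB/s

def pcie_x16_bandwidth_mb_s():
    # PCI Express 1.0: ~250 MB/s of usable bandwidth per lane, per direction
    return 16 * 250

print(agp_8x_bandwidth_mb_s())    # 2112 MB/s (commonly quoted as ~2.1 GB/s)
print(pcie_x16_bandwidth_mb_s())  # 4000 MB/s, each direction
```

In practice the 6600GT rarely saturates either bus, which is part of why the bridged design works as well as it does.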

Overall the bridge design is simply a way for nVidia to cut the cost of putting one GPU onto two different platforms. The bridge does add some expense - extra PCB area, the chip itself, and re-designed trace routes to accommodate it - but it is far cheaper than designing a second, AGP-native GPU. nVidia hopes to keep the price of the 6600GT AGP the same as the PCI Express models.


- Clock Speeds

Now let's have a look at the actual cores. The first photo below shows the 6600GT AGP core with the HSI bridge beside it; the second is from the Gigabyte PCI Express card. The cores are physically identical, and both versions of the 6600GT run a GPU core speed of 500MHz.





Now we come to memory. Both use GDDR3 memory, but nVidia has reduced the memory clock of the AGP version by a small amount: the PCI Express version comes with a 1GHz memory clock, while the AGP variant drops to 900MHz. The clock speed may be lower, but the modules are the same 2ns BGA parts, which means in theory you should be able to overclock back to the same speed as the PCI Express version anyway.
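The overclocking headroom claim is easy to sanity check with a little arithmetic: 2ns GDDR3 is rated for a 500MHz command clock, i.e. 1GHz effective once DDR's two transfers per clock are counted, which is exactly the PCI Express card's stock memory speed. The same numbers also show what the 100MHz downclock costs in raw bandwidth:

```python
# Rated speed of 2 ns GDDR3 and the resulting memory bandwidth on the
# 6600GT's 128-bit bus, at AGP-stock and PCI-E-stock memory clocks.

def rated_effective_clock_mhz(cycle_time_ns):
    real_clock_mhz = 1000 / cycle_time_ns  # 2 ns -> 500 MHz command clock
    return real_clock_mhz * 2              # DDR transfers twice per clock

def memory_bandwidth_gb_s(effective_clock_mhz, bus_width_bits=128):
    # effective clock (MT/s) x bus width in bytes, scaled to GB/s
    return effective_clock_mhz * (bus_width_bits / 8) / 1000

print(rated_effective_clock_mhz(2))   # 1000.0 MHz - matches the PCI-E card
print(memory_bandwidth_gb_s(900))     # 14.4 GB/s at AGP stock clocks
print(memory_bandwidth_gb_s(1000))    # 16.0 GB/s at PCI-E stock clocks
```

That 1.6 GB/s deficit is the whole of the on-paper difference between the two cards, which is why our overclocked results later close the gap so completely.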


- SLI and AGP don't mix



Here is where a few extra parts begin to distinguish themselves. One of the features missing from the AGP version is SLI. SLI allows certain nVidia PCI Express cards to be linked, in theory doubling the frame rate since each card renders half of the screen. AGP models lack this, as there are simply no dual-AGP solutions available at the chipset level - and even if there were, we believe this feature would still be left off in order to keep the card's price down. The AGP version also gains a power connector that the PCI Express card lacks. An AGP slot can only deliver 50 watts on the 12v rail whereas a PCI Express x16 slot can deliver 75 watts, so the AGP card comes with an additional power connector that you will need to plug in to get it up and running.
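The need for the extra plug falls straight out of those slot limits. As a sketch (the 60 watt card draw below is purely illustrative, not a measured figure for the 6600GT):

```python
# Why the AGP card gains an auxiliary power connector, using the slot
# power limits quoted above. The 60 W draw is an illustrative example,
# not a measured number for this card.

AGP_SLOT_LIMIT_W = 50    # 12 V AGP rail limit
PCIE_SLOT_LIMIT_W = 75   # PCI Express x16 slot limit

def needs_aux_power(card_draw_w, slot_limit_w):
    # True when the slot alone cannot feed the card
    return card_draw_w > slot_limit_w

print(needs_aux_power(60, AGP_SLOT_LIMIT_W))   # True: extra plug required
print(needs_aux_power(60, PCIE_SLOT_LIMIT_W))  # False: slot power suffices
```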


- The Back Panel



Finally we come to the back panel of the card. Normally you would expect a D-SUB, DVI-I and S-Video port on the back; the 6600GT AGP instead offers dual DVI-I. LCDs now come fully equipped with DVI ports, and their slim profile means you can fit two of them into a smaller space far more easily than bulky CRTs. If you are planning to use this card with a CRT, you can use a DVI to VGA adapter to convert one of the ports to D-SUB, which is a handy setup option.

GeForce 6600GT AGP - Benchmarks - Test System Setup and 3DMark03

Test System Setups

AGP Test System
Processor: Intel Pentium 4 560 (800MHz FSB) (Supplied by Spectrum Communications)
Memory: 2x 512MB OCZ DDR-533 (Supplied by OCZ)
Hard Disk: 2x Maxtor Maxline III 250GB (RAID 0)
Motherboard: ABIT AS8 (Supplied by ABIT)
Operating System: Microsoft Windows XP SP2
Drivers: nVidia Forceware 66.93

PCI Express Test System
Processor: Intel Pentium 4 560 (800MHz FSB) (Supplied by Spectrum Communications)
Memory: 2x 512MB OCZ DDR-533 (Supplied by OCZ)
Hard Disk: 2x Maxtor Maxline III 250GB (RAID 0)
Motherboard: MSI 915P Combo (Supplied by MSI Australia)
Operating System: Microsoft Windows XP SP2
Drivers: nVidia Forceware 66.93

In order to create as even a playing field as possible, we used the same LGA775 CPU on both platforms as well as the same DDR memory, so neither side gained a CPU or memory advantage. As for the overclocked results for the 6600GT AGP, we simply overclocked to the same level as the PCI-E version.

3DMark03
Version and / or Patch Used: 350
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark03/
Buy It Here
3DMark03 is the latest version of the highly favored 3DMark series. By combining full DirectX 9.0 support with completely new tests and graphics, 3DMark03 continues the legacy of being the industry standard benchmark.

Please Note: Due to recent events with the 3DMark03 series, we are adding results purely for those who are still in favor of 3DMark03. These results should not be taken too seriously and are only included for interest's sake.
Due to the lower memory clock speed, the AGP card at stock clocks falls behind the PCI-E card even at the lower resolutions, and the gap only grows as the settings are cranked up. When overclocked to PCI-E speeds, however, there is very little difference.

GeForce 6600GT AGP - Benchmarks - 3DMark05

3DMark05Version and / or Patch Used: 1.1.0Developer Homepage: http://www.futuremark.comProduct Homepage: http://www.futuremark.com/products/3dmark05/Buy It Here
3DMark05 is the latest version in the popular 3DMark "Gamers Benchmark" series. It includes a complete set of DX9 benchmarks which tests Shader Model 2.0 and higher.For more information on the 3DMark05 benchmark, we recommend you read our preview here.
Same story with 3DMark05 as with 03: at stock clocks we see a serious performance hit, while at PCI-E speeds it performs almost as well.

GeForce 6600GT AGP - Benchmarks - Doom 3

Doom 3Version and / or Patch Used: 1.0 Timedemo or Level Used: Custom TimedemoDeveloper Homepage: http://www.idsoftware.com Product Homepage: http://www.doom3.comBuy It Here
Doom 3 is the latest game to hit our test lab and is one of the most intensive games to date. With our own custom timedemo we are able to give a realistic rating of what kind of FPS you will be achieving.

For more information on benchmarking Doom 3 we recommend you check out our extensive article regarding it here.
Doom 3 shows the gap even more due to its extreme reliance on the video processor.

GeForce 6600GT AGP - Benchmarks - Far Cry

Far CryVersion and / or Patch Used: 1.3Timedemo or Level Used: DefaultDeveloper Homepage: http://www.crytek.comProduct Homepage: http://www.farcrygame.comBuy It Here
There is no denying that Far Cry is currently one of the most graphically intensive games on the market. Utilizing PS2.0 technology (the latest versions support Shader Model 3.0 with DX9c) and offering an exceptional visual experience, it makes even some of the faster graphics cards struggle.
Far Cry doesn't stress the GPU as much now as it did when first released, so the scores level out; even with the 6600GT AGP at stock speeds the gap isn't as big as in the other results.

GeForce 6600GT AGP - Benchmarks - Unreal Tournament 2004

Unreal Tournament 2004Version and / or Patch Used: 1.0Timedemo or Level Used: as_convoyDeveloper Homepage: http://www.atari.comProduct Homepage: http://www.unrealtournament.com/ut2004/Buy It Here
Unreal Tournament 2004, or UT2004 for short, is the latest instalment in the Unreal Tournament series. The full version of the game is based on DX9 (the demo only uses DX8.1, like UT2003) and has received quite a big makeover, making it a lot more intensive than its predecessor.
Unreal Tournament 2004 doesn't show much difference at all even at 1600x1200.

GeForce 6600GT AGP - Benchmarks - High Quality

High Quality

All tests were run with 4x FSAA and 4x Anisotropic Filtering enabled, with the screen resolution set to 1024x768. At these aggressive detail settings, 1600x1200 really does become unplayable; anyone after image quality will have to drop the resolution back down to reach playable levels. 1024x768 was chosen as it is still the preferred playable resolution with these settings enabled.
When all the high quality options are selected we see that the stock clocks really let the AGP version down.

GeForce 6600GT AGP - Final Thoughts

Final Thoughts

With the current market split between the AGP and PCI-E interfaces, it's certainly hard for companies like nVidia to put a hard push behind just one standard. Supporting PCI-E only would earn them future-proofing credit, but the AGP market is much stronger, and doing so would alienate a lot of AMD Athlon 64 users as well as Athlon XP users, who are mostly limited to AGP. Supporting AGP only simply won't do either: it would remove you from the new Pentium 4 market as well as the Athlon 64 PCI-E segment, and with AGP set for death, the results would be devastating. Both ATI and nVidia have their own ways of going about this, and nVidia's approach of native PCI-E with bridged AGP is more the option we would like to see.

Native PCI-Express support allows for a cheaper overall package. With prices on DDR-2, PCI-E motherboards and the CPUs for these new technologies still up there, the cards need to be at a reasonable level for affordability. With AGP set for death, bridging the PCI-E card to run on AGP is the better idea, as it allows for compatibility, good performance and more incentive for users to eventually go PCI-E.

The 6600GT AGP is nVidia's first PCI-E to AGP solution, and in all honesty, with the price kept in the PCI-E card's range, a small clock-speed performance hit is a small price to pay to have the latest technology in your AGP system. With no memory bus cuts, no cannibalising of the GPU die and no major modifications other than memory speeds, it is definitely something nVidia has to be given credit for. To regain that PCI-E performance, overclocking is an option - the components are identical and easy to clock back to the PCI-E level.

If you want a mid-range AGP card you can't go past the nVidia GeForce 6600GT AGP, since there are no mid-range (X600 and X700) options from ATI - they are concentrating more on PCI-E, which is probably fair enough considering the state of AGP at the moment. The last question: will ATI respond with a bridged AGP version of the X700?
