GeForce 6600GT AGP - Introduction
GeForce 6600GT AGP - A Closer Look
The nVidia 6600GT GPU is one of the most popular GPUs currently available, and its price point sits at something of a sweet spot for its performance. On the PCI Express bus it supports SLI, allowing you to almost double its performance, and being available on both buses gives it a further advantage.
We have already taken a look at the 6600GT in some detail in our previous review comparing it to the X700 graphics card, so we won't go over that ground again. Today we will be looking at the differences (if any) between the PCI Express and AGP versions.
- The Cards and their Differences
For our purposes we will be showing the reference nVidia 6600GT AGP compared to the Gigabyte 6600GT PCI Express x16 to show you some of the differences.
First off, the layout of the cards differs somewhat due to the HSI bridge that nVidia uses to implement AGP. The reference card is equipped with two heatsink units: the unit covering the GPU is a standard retail-design heatsink with some Doom 3 imagery, while a smaller unit covers the HSI bridge.
The HSI bridge made its first (somewhat questionable, to some) appearance on nVidia's PCX graphics cards many months ago now. Its job there was slightly different from what we are seeing today, but the theory is exactly the same, only applied in reverse. The PCX cards were simply AGP-designed GPUs bridged to work on the PCI Express interface, which bought nVidia more time to develop native PCI Express chips; on those cards the HSI converted PCI Express x16 signals coming from the motherboard chipset into AGP 8x signals the GPU could read. This time the process is reversed. The 6600GT GPU is designed with a native PCI Express x16 bus only, so the HSI bridge now converts the AGP 8x signals from the motherboard chipset into PCI Express x16 signals the GPU can understand.
Overall, the bridge design is simply a way for nVidia to cut the cost of putting one GPU onto two different platforms. There is still some added expense: extra PCB, the bridge chip itself, and re-designed trace routes to accommodate it. Even so, it is far cheaper than designing a separate AGP-native GPU, and nVidia hopes to keep the price of the 6600GT AGP the same as the PCI Express models.
- Clock Speeds
Now let's have a look at the actual cores. The first photo below shows the 6600GT AGP core with the HSI bridge beside it; the second is from the Gigabyte PCI Express card. The cores are physically identical, and both versions of the 6600GT are clocked with a GPU core speed of 500MHz.
Now we come to memory. Both use GDDR3, but nVidia has reduced the memory clock of the AGP version by a small amount: the PCI Express version ships with a 1GHz memory clock, while the AGP variant lowers it to 900MHz. Although the clock speed is slower, the modules are the same 2ns BGA parts, which means in theory you could probably overclock the memory back to the speed of the PCI Express version anyway.
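To put that clock difference in perspective, here is a rough calculation of the peak theoretical memory bandwidth of each version. This is a sketch that assumes the 6600GT's 128-bit memory bus and treats the quoted clocks as effective (DDR) rates; real-world throughput will of course be lower.

```python
# Rough memory-bandwidth comparison for the two 6600GT variants.
# Assumes the 128-bit memory bus of the 6600GT; the clock figures
# are the effective (DDR) rates quoted above.

def memory_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak theoretical bandwidth in GB/s (decimal gigabytes)."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

pcie = memory_bandwidth_gb_s(128, 1000)  # PCI Express version
agp = memory_bandwidth_gb_s(128, 900)    # AGP version
print(f"PCI-E: {pcie:.1f} GB/s, AGP: {agp:.1f} GB/s "
      f"({(1 - agp / pcie) * 100:.0f}% lower)")
```

In other words, the 100MHz memory clock cut costs the AGP card roughly 10% of its peak memory bandwidth, which is what overclocking back to 1GHz would recover.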
- SLI and AGP don't mix
Here is where a few extra differences begin to show. One of the features missing from the AGP version is SLI, which allows certain nVidia PCI Express cards to be linked, in theory doubling the frame rate since each card renders half of the screen. AGP models lack this, as there are simply no dual-AGP solutions available at the chipset level; even if there were, we believe this feature would still be left off to keep the card's price down. The AGP version also has an additional power connector that the PCI Express model lacks: an AGP slot can only deliver 50 watts on its 12v rail, whereas a PCI Express slot can deliver 75 watts, so you will need to plug in the additional power connector to get the card up and running.
- The Back Panel
Finally we come to the back panel of the card. Normally you would expect a D-SUB, DVI-I and S-Video port on the back; the 6600GT AGP instead offers dual DVI-I. LCDs now come fully equipped with DVI ports, and their space savings mean you can fit two of them into a smaller space far more easily than bulky CRTs. If you are planning to use this card with a CRT, you can use the DVI to VGA adapter to convert one port to D-SUB, which is a flexible setup option.
GeForce 6600GT AGP - Benchmarks - Test System Setup and 3DMark03

Test System Setups

AGP Test System
Processor: Intel Pentium 4 560 (800MHz FSB) (Supplied by Spectrum Communications)
Memory: 2x 512MB OCZ DDR-533 (Supplied by OCZ)
Hard Disk: 2x Maxtor Maxline III 250GB (RAID 0)
Motherboard: ABIT AS8 (Supplied by ABIT)
Operating System: Microsoft Windows XP SP2
Drivers: nVidia Forceware 66.93

PCI Express Test System
Processor: Intel Pentium 4 560 (800MHz FSB) (Supplied by Spectrum Communications)
Memory: 2x 512MB OCZ DDR-533 (Supplied by OCZ)
Hard Disk: 2x Maxtor Maxline III 250GB (RAID 0)
Motherboard: MSI 915P Combo (Supplied by MSI Australia)
Operating System: Microsoft Windows XP SP2
Drivers: nVidia Forceware 66.93

In order to create as even a playing field as possible, we used the same LGA775 CPU on both platforms as well as DDR memory, so neither setup had a CPU or memory advantage. As for the overclocked results for the 6600GT AGP, we simply overclocked to the same level as the PCI-E version.

3DMark03
Version and / or Patch Used: 350
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark03/
Buy It Here
GeForce 6600GT AGP - Benchmarks - 3DMark05

3DMark05
Version and / or Patch Used: 1.1.0
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark05/
Buy It Here
GeForce 6600GT AGP - Benchmarks - Doom 3

Doom 3
Version and / or Patch Used: 1.0
Timedemo or Level Used: Custom Timedemo
Developer Homepage: http://www.idsoftware.com
Product Homepage: http://www.doom3.com
Buy It Here
GeForce 6600GT AGP - Benchmarks - Far Cry

Far Cry
Version and / or Patch Used: 1.3
Timedemo or Level Used: Default
Developer Homepage: http://www.crytek.com
Product Homepage: http://www.farcrygame.com
Buy It Here
GeForce 6600GT AGP - Benchmarks - Unreal Tournament 2004

Unreal Tournament 2004
Version and / or Patch Used: 1.0
Timedemo or Level Used: as_convoy
Developer Homepage: http://www.atari.com
Product Homepage: http://www.unrealtournament.com/ut2004/
Buy It Here
GeForce 6600GT AGP - Benchmarks - High Quality

High Quality
All tests were run with 4x FSAA and 4x Anisotropic Filtering enabled, with the screen resolution set to 1024 x 768. When running these aggressive detail settings at 1600 x 1200, games really do become unplayable, so anyone after image quality will have to drop the resolution back down to reach playable levels. 1024 x 768 was chosen as it is still the preferred playable resolution with these settings enabled.
GeForce 6600GT AGP - Final Thoughts

Final Thoughts

With the current market split between AGP and PCI-E interfaces, it's certainly hard for companies like nVidia to put a hard push behind just one standard. While supporting PCI-E only would give them plenty of future-proofing credit, the AGP market is much stronger, and going PCI-E only would alienate a lot of AMD Athlon 64 users as well as Athlon XP users, who are mostly limited to AGP. Supporting AGP only simply won't do either: it would cut you off from the new Pentium 4 market as well as the Athlon 64 PCI-E segment, and with AGP set for death, the results would be devastating. Both ATI and nVidia have their own ways of going about this, and nVidia's approach of native PCI-E with bridged AGP is the option we would rather see.

Native PCI-Express support allows for a cheaper overall package. With the prices of DDR2, PCI-E motherboards and the CPUs for these new technologies still high, the cards need to be at a reasonable level for affordability. With AGP set for death, bridging the PCI-E card to run on AGP is the better idea, as it allows for compatibility, good performance and more incentive for users to move to PCI-E.

The 6600GT AGP is nVidia's first PCI-E to AGP solution, and in all honesty, given that it stays in the PCI-E card's price range, a small clock-speed hit is a small price to pay to have the latest technology in your AGP system. With no memory bus cuts, no cannibalising of the GPU die and no major modifications other than memory speed, nVidia has to be given credit. To regain that PCI-E level of performance, overclocking is an option, as the components are identical and easy to clock to the PCI-E level.
If you want a mid-range AGP card, you can't go past the nVidia GeForce 6600GT AGP, since ATI offers no mid-range (X600 and X700) options on AGP; they are concentrating more on PCI-E, which is probably fair enough considering the state of AGP at the moment. The last question: will ATI respond with an X700 AGP bridge chip version?