The nVidia 6600GT is one of the most popular GPUs currently available; its price point is a sweet spot for the performance it delivers. On the PCI Express bus it supports SLI, allowing you to almost double its performance, and being available on both buses gives it a further advantage.
We have already had a look at the 6600GT in some detail in our previous review comparing it to the X700 graphics card, so we won't go over that ground again. Today we will be looking at the differences (if any) between the PCI Express and AGP versions.
- The Cards and their Differences
For our purposes we will be comparing the reference nVidia 6600GT AGP against the Gigabyte 6600GT PCI Express x16 to show you some of the differences.
First off, the layout of the cards differs somewhat due to the HSI bridge nVidia uses to implement AGP. The reference card is equipped with two heatsink units: the one covering the GPU is a standard retail-design heatsink with some Doom 3 imagery, while a smaller unit covers the HSI bridge.
The HSI bridge made its first (somewhat questionable, to some) appearance on nVidia's PCX graphics cards many months ago now. Its job there was the reverse of what we are seeing today, but the theory is exactly the same. The PCX cards were simply AGP-designed GPUs bridged to work on the PCI Express interface, which bought nVidia extra time to develop native PCI Express chips; on those cards the HSI converted the PCI Express x16 signals coming from the motherboard chipset into AGP 8x signals the GPU could read. This time the process is reversed: the 6600GT GPU is designed with a native PCI Express x16 bus only, so the HSI bridge converts the AGP 8x signals from the motherboard chipset into PCI Express x16 signals the GPU can understand.
Overall the bridge design is simply a way for nVidia to cut the cost of putting one GPU onto two different platforms: designing a single native PCI Express core and bridging it is cheaper than engineering separate AGP and PCI Express versions of the chip. There is still an added cost from the extra PCB area, the bridge chip itself, and the re-designed trace routing, but nVidia hopes to keep the price of the 6600GT AGP the same as the PCI Express models.
- Clock Speeds
Now let's have a look at the actual cores. The first photo below shows the 6600GT AGP core with the HSI bridge beside it; the one below that is from the Gigabyte PCI Express card. The cores are physically identical, and both versions of the 6600GT run a GPU core clock of 500MHz.
Now we come to memory. Both versions use GDDR3, but nVidia has reduced the memory clock of the AGP version by a small amount: the PCI Express version ships with a 1GHz memory clock, while the AGP variant drops to 900MHz. The modules themselves are the same 2ns BGA parts, however, which means in theory you could probably overclock the AGP card's memory back to PCI Express speeds anyway.
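As a quick sanity check on those numbers, the 2ns module rating works out to exactly the PCI Express version's 1GHz effective clock, and the bandwidth gap between the two cards is easy to compute. The sketch below assumes the 6600GT's 128-bit memory bus, which is not stated in the article itself:

```python
# Rough sanity check for the memory figures above.
# Assumption (not stated in the article): the 6600GT uses a 128-bit memory bus.

CYCLE_NS = 2.0                              # 2ns GDDR3 modules
rated_clock_mhz = 1000.0 / CYCLE_NS         # 500MHz actual clock
rated_effective_mhz = rated_clock_mhz * 2   # GDDR3 is double data rate -> 1000MHz (1GHz)

BUS_BITS = 128

def bandwidth_gbs(effective_mhz):
    """Peak memory bandwidth in GB/s for a given effective memory clock."""
    return effective_mhz * 1e6 * (BUS_BITS / 8) / 1e9

pcie_bw = bandwidth_gbs(1000)  # PCI Express version at 1GHz
agp_bw = bandwidth_gbs(900)    # AGP version at 900MHz

print(rated_effective_mhz)  # 1000.0 -> the modules are rated for the PCIe clock
print(pcie_bw, agp_bw)      # 16.0 GB/s vs 14.4 GB/s
```

So the AGP card gives up roughly 10% of its peak memory bandwidth out of the box, which is exactly the headroom the 2ns rating suggests you could claw back by overclocking.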
- SLI and AGP don't mix
Here is where a few extra parts begin to distinguish the two cards. One feature missing from the AGP version is SLI, which allows certain nVidia PCI Express cards to be linked, in theory doubling the frame rate since each card renders half of the screen. The AGP model lacks this because there are simply no dual-AGP solutions available at the chipset level; even if there were, we believe the feature would still be left off to keep the card's price down. The AGP version also has an additional power connector that the PCI Express card lacks: an AGP slot can only deliver 50 watts on the 12V rail, whereas a PCI Express x16 slot can deliver 75 watts, so you will need to plug in the extra power connector to get the AGP card up and running.
- The Back Panel
Finally we come to the back panel of the card. Normally you would expect a D-SUB, a DVI-I and an S-Video port on the back, but the 6600GT AGP comes with dual DVI-I instead. LCDs now ship fully equipped with DVI ports, and their slim profile makes it much easier to fit two of them in a small space than bulky CRTs. If you are planning to use the card with a CRT, you can use a DVI-to-VGA adapter to convert one port over to D-SUB, which is actually a great setup option.