
nVidia GeForce FX 5900 Ultra - Fourth Time Lucky

For many months enthusiasts awaited the arrival of the nVidia GeForce FX 5800, only to be let down by poor performance, jet engine level volume from its cooler and supply problems due to the use of DDR-II. nVidia acknowledged the FX 5800 was a failure and has recently released the GeForce FX 5900 Ultra with many improvements over its predecessor. Read on as Cameron "Sov" Johnson tells us whether or not nVidia learned from its mistakes.

By: Cameron Johnson | NVIDIA GeForce GPU in Video Cards | Posted: Jun 11, 2003 4:00 am
TweakTown Rating: 9.0 | Manufacturer: nVidia



It certainly has been a long and hard road for the 3D giant nVidia - three product releases and not a winner amongst them. Given what we have seen in the past from nVidia, with its high speed cards leading the market, it is certainly a first to see the company's newest products falling to the bottom of the order. Design-wise, nVidia set out to bring a new wave of 3D entertainment, but by banking on Intel's philosophy that raw clock speed is everything, nVidia seems to have forgotten that the 3D world is fickle - remember Voodoo 2 and TNT2?


Just as graphics cards based on the GeForceFX 5800 chipset, known to us as the NV30, are finally making their way into the retail channel, nVidia is introducing its follow-up product. Following the company's nomenclature, the new high-end chip will be called nVidia GeForceFX 5900.


nVidia made a few mistakes in its first attempts on the FX line. First we will cover the memory. The card was originally intended to use a 256bit interface, similar to the ATI Radeon 9700 and 9800, but due to the unavailability of the Flip Chip package, the 256bit interface was deemed unusable. This led nVidia to outfit the card with a 128bit DDR-II interface - this was mistake number one.
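To see why the narrower bus hurt, it helps to work the bandwidth numbers. Memory bandwidth is simply bus width (in bytes) multiplied by the effective data rate. The sketch below uses the clock speeds as they were commonly published at the time (FX 5800 Ultra: 128bit DDR-II at an effective 1000MHz; Radeon 9700 Pro: 256bit DDR at an effective 620MHz) - treat the exact figures as approximate, but the comparison holds: even with much faster DDR-II, the 128bit bus left nVidia behind.

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# GeForce FX 5800 Ultra: 128bit bus, DDR-II at ~500MHz (1000MHz effective)
fx5800 = bandwidth_gb_s(128, 1000)   # 16.0 GB/s

# Radeon 9700 Pro: 256bit bus, DDR at ~310MHz (620MHz effective)
r9700 = bandwidth_gb_s(256, 620)     # 19.84 GB/s

print(f"FX 5800 Ultra: {fx5800:.2f} GB/s, Radeon 9700 Pro: {r9700:.2f} GB/s")
```

Despite running its memory at a far higher clock, the FX 5800 Ultra's peak bandwidth falls roughly 20% short of ATI's year-older part - which is why restoring the 256bit interface became a priority for the FX 5900.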


Mistake number two was the use of the 0.13 micron technology. nVidia banked on the 0.13 micron process being fully mature and free of bugs. This was not the case; it forced the NV30 release back many months and eventually led to the recent supply problems. On top of that, nVidia's reference cooling solution proved to be unacceptably loud, earning the reference card a host of degrading names amongst enthusiasts. Lastly, the chip's image quality when using anisotropic filtering also didn't live up to expectations and was criticized in many reviews as a result.


When ATI launched its Radeon 9800 Pro, it was able to beat nVidia's flagship in practically every discipline. In short, the FX 5800/NV30 is too loud, too expensive, offers sub-par image quality and is slower than its direct competitor, the Radeon 9800. Now nVidia has introduced the GeForceFX 5900 to take on the ATI giant - can it? Read on and find out.

