Sapphire Radeon X1900XT Graphics Card

Mike checks out the Radeon X1900XT graphics card and compares it against the Radeon X850XT PE to see what has changed.

Manufacturer: Sapphire Technology
14 minutes & 41 seconds read time


For several years now, the name Sapphire has meant blisteringly fast ATI graphics. As a company dedicated to the ATI graphics chipset, Sapphire has earned a reputation for putting out products that not only meet factory standards, but often exceed them.

How well a manufacturer has done in the past makes no difference when a new product arrives for testing. Today's contestant is the Sapphire X1900XT graphics card. Using the newer ATI R580 chipset and fitted with 512MB of onboard memory, we're going to put it through its paces and see how it compares to graphics boards using the Radeon X8xx series chipset. We'll take a look at features and performance numbers and try to help you decide whether this product would make a worthy upgrade for your own enthusiast rig.

So relax for a bit as we take a closer look at the X1900XT graphics card from Sapphire. They don't give these boards away so we want to see if it's worth the entry fee!


The Spec Sheet

Yes, it is time for those pesky specifications again. While not everyone cares about this information, it is always best to have as much knowledge about a component as possible. That said, this is a quick rundown of the X1900 series product spec sheet.

Radeon X1900 Graphics Technology - Specifications

384 million transistors on 90nm fabrication process
48 pixel shader processors
8 vertex shader processors
256-bit 8-channel GDDR3 memory interface
Native PCI Express x16 bus interface

Ring Bus Memory Controller
512-bit internal ring bus for memory reads
Fully associative texture, color, and Z/stencil cache designs
Hierarchical Z-buffer with Early Z test
Lossless Z Compression (up to 48:1)
Fast Z-Buffer Clear
Optimized for performance at high display resolutions, including widescreen HDTV resolutions

Ultra-Threaded Shader Engine
Support for Microsoft DirectX 9.0 Shader Model 3.0 programmable vertex and pixel shaders in hardware
Full speed 128-bit floating point processing for all shader operations
Up to 512 simultaneous pixel threads
Dedicated branch execution units for high performance dynamic branching and flow control
Dedicated texture address units for improved efficiency
3Dc+ texture compression
High quality 4:1 compression for normal maps and two-channel data formats
High quality 2:1 compression for luminance maps and single-channel data formats
Complete feature set also supported in OpenGL 2.0

Advanced Image Quality Features
64-bit floating point HDR rendering supported throughout the pipeline
Includes support for blending and multi-sample anti-aliasing
32-bit integer HDR (10:10:10:2) format supported throughout the pipeline
Includes support for blending and multi-sample anti-aliasing
2x/4x/6x Anti-Aliasing modes
Multi-sample algorithm with gamma correction, programmable sparse sample patterns, and centroid sampling
New Adaptive Anti-Aliasing feature with Performance and Quality modes
Temporal Anti-Aliasing mode
Lossless Color Compression (up to 6:1) at all resolutions, including widescreen HDTV resolutions
2x/4x/8x/16x Anisotropic Filtering modes
Up to 128-tap texture filtering
Adaptive algorithm with Performance and Quality options
High resolution texture support (up to 4k x 4k)

Avivo™ Video and Display Platform
High performance programmable video processor
Accelerated MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding and transcoding
DXVA support
De-blocking and noise reduction filtering
Motion compensation, IDCT, DCT and color space conversion
Vector adaptive per-pixel de-interlacing
3:2 pulldown (frame rate conversion)
Seamless integration of pixel shaders with video in real time
HDR tone mapping acceleration
Maps any input format to 10 bit per channel output
Flexible display support
Dual integrated dual-link DVI transmitters
DVI 1.0 compliant / HDMI interoperable and HDCP ready*
Dual integrated 10 bit per channel 400 MHz DACs
16 bit per channel floating point HDR and 10 bit per channel DVI output
Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
Complete, independent color controls and video overlays for each display
High quality pre- and post-scaling engines, with underscan support for all outputs
Content-adaptive de-flicker filtering for interlaced displays
Xilleon™ TV encoder for high quality analog output
YPrPb component output for direct drive of HDTV displays
Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
Fast, glitch-free mode switching
VGA mode support on all outputs
Drive two displays simultaneously with independent resolutions and refresh rates
Compatible with ATI TV/Video encoder products, including Theater 550

In The Box

Our test subject on the bench today is the full retail version of the Radeon X1900XT graphics card. After the plastic is removed and the box emptied, this is what you can expect to see. Included are the card, a small instruction manual, a driver disk, the Sapphire overclocking utility, a video editing utility, a software DVD utility, all necessary cabling to make use of the VIVO capabilities of the card, a splitter with a PCI-E connector and two DVI adapters. One thing I like about Sapphire products is that they don't waste a lot of time on extra stuff. You might see the occasional game included with their retail bundle, but not a lot of fluff as a rule. After all, the odds are very good that you already have the games you like to play and are getting an upgrade card so you can better utilize them.

When you first look at this card, the first thing that comes to mind is BIG! I mean, this thing is just huge, but that is largely due to the 2x slot layout used to make room for the oversized cooler. This has been pretty much standard fare for ATI products since the X850 series hit the market. While it looks intimidating, it isn't as loud as you might think. A thermal diode built into the card allows the fan to hit higher RPMs only when temperatures get out of line. Even under intense gaming, I never heard the fan hit max speed except at boot-up. Yes, this monster will let you know it is awake, then after a few seconds slow back down to a reasonable speed - much like a Shuttle XPC at boot. Now don't get me wrong, it isn't anywhere near silent, it just isn't loud to the point where it becomes unbearable. Unless you have larger fans on your processor, you will know when this machine is running.

Since we have broken into the X1900 series, we have the luxury of the new R580 VPU. This brings to the table a second-generation graphics card capable of making full use of DirectX 9 Shader Model 3. It also includes 48 pixel shaders and 8 vertex shaders. For those interested in how this compares to the nVidia GeForce 7900 series, that chip has the same number of vertex shaders but only 24 pixel shaders. There is debate over the ratios used by the two manufacturers, but it all comes down to one thing: high performance.

Another nice feature of the R580 VPU is its support for HDR, or "High Dynamic Range" rendering, which does wonders for creating more natural lighting and shadow effects. This will come in particularly handy for those looking for an upgrade to handle the new TES: Oblivion game, as it makes use of this technology.

Turning the card over shows a pretty clean layout. In years past, many manufacturers put memory modules on the back side of their PCBs, but fortunately that doesn't happen here. While the back side is a convenient, out-of-the-way location, keeping those fast modules cool there is a nightmare. RAM sinks would pretty much be out of the question for many users, particularly those with a desire to run this card in a Crossfire dual graphics configuration.

The bracket area of the card is quite minimalist. It consists of two DVI ports and an S-Video connector. Then again, you will probably want all that room used for the cooler, as it is the whole reason for the 2-slot design. Let's take a peek at the back side of the cooling setup.

For those who are concerned about not having a digital monitor, fear not. In the photo of the box contents above, you may have noticed a couple of adapters. Simply attach one to the 15-pin D-Sub connector of your analog monitor and connect that to the graphics card. I played around on a CRT to test this before moving over to the 19" LCD for formal testing, and everything worked without a hitch.

If you're thinking that this beast should be heavy with that much copper in use, you would be 100% correct. Just as with processor coolers, copper conducts and dissipates heat better than aluminum. Since there are no heat pipes in use, the copper is a necessity in the cooling department.

Now that we've covered (albeit briefly) some of the high points of this little gem, let's take a look at how this thing performs. Since my main goal is an upgrade path, I'll compare this new generation VPU against the ATI Radeon X850XT PE. While not a dinosaur by any means, it will give us a good look at some performance numbers against a known workhorse, so we will have a better idea as to what the new kid on the block is capable of.

Benchmarks - Test Setup System and Methodology

Test System Setup and Methodology

When it comes to testing graphics cards, you have to take a look at what the board was designed to do. This usually falls into two categories: gaming or CAD. Since we're talking about a performance gaming card here, we'll be looking primarily at how it performs in both synthetic and real-world gaming tests.

To accommodate these goals, we're going to run a series of tests that consist of the following utilities and programs:

Futuremark 3DMark05
Futuremark 3DMark06
Quake III Arena
Quake 4
Unreal Tournament 2004 (full version)
Doom 3
Far Cry

All utilities and games have been patched to their most recent versions with the exception of Quake III Arena. I still use the originally released version, since anything after the first couple of patches breaks compatibility with the standard demo files. All tests will be conducted at 32-bit color depth so we can see those true colors in all their glory.
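For anyone wanting to reproduce the id-engine runs, these games are benchmarked from the in-game console using the engine's built-in timedemo facility. A typical Quake III Arena session looks like this (these are the standard engine console commands; the demo name is the stock Demo001 listed in the Quake III Arena section):

```
timedemo 1
demo demo001
```

The engine then plays the demo back as fast as the hardware allows and reports the average frame rate once playback finishes.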

The test system will be running Windows XP Professional with SP2, DirectX 9.0c and all critical updates installed. I am using the latest motherboard drivers as well as Catalyst 6.3.

Test System Setup
DFI LANParty UT nF4 Ultra-D Motherboard
AMD Athlon 64 FX-53 Processor
2x 512MB Mushkin PC3500 "Redline" Memory (Supplied by Mushkin)
Thermaltake PurePower 600-watt PSU (Supplied by Thermaltake)
Princeton LCD19D 19" LCD Digital Monitor
Western Digital 80GB SATA Hard Drive

Benchmarks - 3DMark05


Version and / or Patch Used: Build 120

3DMark05 is now the second newest version in the popular 3DMark "Gamers Benchmark" series. It includes a complete set of DX9 benchmarks which test Shader Model 2.0 and higher.

For more information on the 3DMark05 benchmark, we recommend you read our preview here.

Beginning with some synthetic benchmarks, we see a drastic improvement. While not nearly as graphics card biased as the previous version of the utility, it still shows a dramatic difference in performance levels. For those not wanting to work the numbers, we're looking at a 60% increase in score at 1024x768 and a 64% increase when running at 1280x1024. This is beginning to look very promising!

Benchmarks - 3DMark06


Version and / or Patch Used: Build 102

3DMark06 is the very latest version of the "Gamers Benchmark" from Futuremark. It expands on the tests in 3DMark05 by adding graphical effects using Shader Model 3.0 and HDR (High Dynamic Range lighting), which will push even the best DX9 graphics cards to the extreme.

3DMark06 also focuses not just on the GPU but on the CPU, using the AGEIA PhysX software physics library to effectively test single and dual-core processors.

Moving on to the latest 3DMark benchmark version shows an even greater increase in overall score. This is only partially due to pure power, however, as the X1900 supports HDR, which is tested in the 2006 version but not in the 2005 release. Just like older 3DMark utilities, if your graphics card cannot complete the entire spectrum of tests, your score will suffer noticeably.

This still doesn't take away from the huge gap present in scores, however. A quick check on the calculator shows an increase of 103% at 1024x768 and an increase of 114% at 1280x1024.

Just to check AA/AF performance, I went ahead and ran the 1280 test with 4xAA and 8xAF forced through the driver set to see what sort of numbers we would achieve when getting rid of the "jaggies". From the overall score of 4737 shown above, we dropped to 4146 with the AA/AF effects in place - a drop of roughly 12.5% from the non-enhanced score.
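For those who like to check the math, the drop is just the score delta divided by the baseline score; dividing by the lower AA/AF score instead would give about 14%, so watch which baseline you use. A quick sketch using the 3DMark06 scores above (the helper function is purely for illustration):

```python
# Percentage change from a baseline score to a new score.
# Negative values indicate a performance drop.
def pct_change(baseline, new):
    return (new - baseline) / baseline * 100.0

# 3DMark06 at 1280x1024: default run (4737) vs. 4xAA/8xAF forced (4146).
drop = pct_change(4737, 4146)
print(f"{drop:.1f}%")  # about -12.5% relative to the default score
```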

Benchmarks - Quake III Arena

Quake III Arena

Version and / or Patch Used: Default install
Timedemo or Level Used: Demo001

Quake III Arena is a real-world OpenGL benchmark that we have been using here at TweakTown for quite a while now because it has proven itself to be one of the best gaming benchmarks around for comparing a wide range of different products. Quake III Arena is getting very old, but it is still one of the best ways of testing video cards and PC systems for instability and peak performance, which is why we are still using it today.

Some folks still enjoy seeing what can be done with this oldie but goodie, so I am still including it in the test suite. I had thought that we had finally reached the limit of what a graphics board could do with this game engine, but it appears I was mistaken. Both resolutions tested show a solid 12% increase in frames per second.

As a side note, all in-game settings were at their absolute maximum during testing.

Benchmarks - Quake 4

Quake 4

Version and / or Patch Used: Default install
Timedemo or Level Used: necro666.demo

Quake 4 is one of the latest games to be added to our benchmark suite. It is based on the popular Doom 3 engine and as a result uses many of the features seen in Doom 3. However, Quake 4's graphics are more intensive than Doom 3's and should put more strain on different parts of the system.

Quake 4 is a bit more on the modern side of things, so it should give us an idea of how well the X1900XT can handle newer methods of rendering graphics. It doesn't take long to see a decent improvement, either: this game shows an increase of 17% at 1024x768 and 16% at 1280x1024. This goes a long way in letting us see how well the new kid on the block will handle more modern titles.

As an aside, I began playing a little with this benchmark. Since the X1900XT comes with an impressive 512MB of memory, I decided to use the "Ultra" setting within the game. This setting, just like the same setting in Doom 3, is designed for graphics cards that have 512MB or more of onboard memory. At 1024x768, the frame rate only dropped from 131.9 FPS to 128.4 FPS - a mere 3% drop in frames when using the massive texture files within the game.

I also wanted to see how AA/AF worked with a game using the Doom 3 engine, so I ran my normal 4xAA and 8xAF settings at 1280x1024. The score dropped from 126.7 to 110.1 FPS, roughly a 13% decrease. Even with this drop, the resulting 110 FPS is more than playable and also gives you a much smoother image.

Benchmarks - UT2004

Unreal Tournament 2004

Version and / or Patch Used: v3369
Timedemo or Level Used: ONS_Dria

Unreal Tournament 2004, or UT2004 for short, is the latest installment in the Unreal Tournament series. The full version of the game is based on DX9 (the demo only uses DX8.1, like UT2003), has had quite a big makeover and is a lot more intensive than its predecessor.

Unreal Tournament 2004 is the least graphics card sensitive game I have yet come across. It matters little what sort of graphics card you install; I have yet to see much of a difference in the frame rates produced. Even with the obvious horsepower advantage of the X1900 series board tested today, this still holds true.

Benchmarks - Doom 3

Doom 3

Version and / or Patch Used: v1.3
Timedemo or Level Used: demo1

Doom 3 is one of the most intensive games to hit our test lab to date. With our own custom timedemo we are able to give a realistic rating of what kind of FPS you will be achieving.

For more information on benchmarking Doom 3 we recommend you check out our extensive article regarding it here.

Doom 3 is still a popular game, and its engine is still utilized in several popular titles coming onto the market. Even though it shares its engine with Quake 4, we expect slightly better performance in the Doom 3 title, both because Quake 4 adds enhancements to the engine that were not around during the creation of Doom 3 and because of the different mix of indoor and outdoor maps used in the two games.

Looking at the results above, we see that the X1900XT is still able to deliver significant performance gains in this title. It is hard to turn your nose up at a 16% increase in frames at 1024x768 and an even more impressive 33% increase at 1280x1024. It is beginning to look like we've got a serious contender on our hands here.

Benchmarks - Far Cry

Far Cry

Version and / or Patch Used: v1.33
Timedemo or Level Used: PC Gamers Hardware Demo

There is no denying that Far Cry is currently one of the most graphics intensive games on the market. Utilizing PS2.0 technology (the latest versions support Shader Model 3.0 with DX9c) and offering an exceptional visual experience, it makes even some of the faster graphics cards struggle.

Far Cry is our last game to put to the test today. Using an innovative game engine, this title displays some amazing graphics of the lush tropical atmosphere used in the game. While the difference is not as large as in some tests, we are still capturing an additional 10% in frame rate compared to the X850 card. This is the smallest difference of all the primary benchmarks used, yet it still does a fair job of showing a noticeable gain in frames.

After this test was finished, I went back to the driver control console and forced AA/AF settings to give them a try on this game engine. I used the normal 4xAA and 8xAF settings and ran the same demo. The result was a drop in average frame rate from 97.53 to 95.35 FPS. This works out to only a 2% decrease in frames, a very small difference to be sure.

Final Thoughts

After the smoke clears and the dust settles, we find ourselves looking at a very impressive graphics card indeed. I generally use the "Rule of 10" when evaluating a viable upgrade. Simply put, if a new component can produce a minimum 10% increase in performance across the spectrum, it is worthy of consideration as an upgrade option. We see exactly this with the X1900XT and then some. With Far Cry being the only benchmark that showed as little as a 10% increase (not counting UT2004, which doesn't care what card is installed), we certainly have something to consider here.
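The "Rule of 10" is easy to express in code. Here is a minimal sketch using the rounded percentage gains reported throughout this review (UT2004 excluded, since it ignores the card installed; the function name is my own):

```python
# "Rule of 10": an upgrade is worth considering only if every benchmark
# shows at least a 10% performance increase over the old card.
def worthy_upgrade(gains_pct, threshold=10.0):
    return all(g >= threshold for g in gains_pct)

# Percentage gains over the Radeon X850XT PE reported in this review.
gains = {
    "3DMark05 1024x768": 60, "3DMark05 1280x1024": 64,
    "3DMark06 1024x768": 103, "3DMark06 1280x1024": 114,
    "Quake III Arena": 12,
    "Quake 4 1024x768": 17, "Quake 4 1280x1024": 16,
    "Doom 3 1024x768": 16, "Doom 3 1280x1024": 33,
    "Far Cry": 10,
}
print(worthy_upgrade(gains.values()))  # True: every gain meets the 10% bar
```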

With second generation support of SM3 and some eye candy loving HDR support included in this card, you're looking at something that will not only give awesome performance, but will also allow you to see the latest game titles in all their glory. Of course, if you happen to be one who loves the benchmark numbers, you won't be too disappointed either.

As far as pricing goes, I'm a little concerned about the XT model. While it runs about $100 USD cheaper than the X1800XT, it is only about $40 USD less than the vaunted X1900XTX. I would like to see a more aggressive price gap between the two X1900 models. Then again, I can see saving the few dollars and simply overclocking the XT to XTX speeds with relative ease. Oh, sorry, the price? About $450 is what you can expect to lay out for one of these things.

Bottom line... If you've been looking for a graphics card that falls squarely into the "High-End" arena, you will find several choices. If you're looking for a card that kicks ass, is cheaper than the top-end product on the market, and still manages to give you all the capabilities of its big brother, then let me introduce you to the Sapphire X1900XT. With performance that is simply amazing and support for the latest and greatest rendering techniques in use now, you won't be sorry when buying this little gem. It is loud, but it is strong.

- Pros
Excellent performance
512MB onboard memory
Second generation SM3 support
HDR allows for excellent lighting effects
AA/AF performance with minimal FPS loss

- Cons
Requires two slots
Rather loud

Rating - 9 out of 10 and TweakTown's "MUST HAVE" Best Performance Award!

