The system used for this preview consists of an Intel Core 2 Duo E6700 CPU on a 1,333MHz quad-pumped bus, an ASUS P5B Deluxe motherboard, 2GB of DDR2 RAM clocked at 800MHz, and a now rather old Radeon X1800 Pro graphics card. This card works well enough in most games, but once RyderMark was fired up it seemed like something out of the history books. The benchmark will run under the Windows Vista operating system, but it is only certified for Windows XP since "it does not work well in Vista due to poor GPU drivers."
There are three settings depending on how much graphics memory you have: 128MB, 256MB, and 512MB or more. These correspond to the size of the textures used; a graphics card without sufficient graphics memory will not be able to use the higher-resolution textures. Interestingly, we had some issues with tearing and missing textures at higher resolutions even though the 256MB option was selected, so it seems Candella Software still has some work to do before the final release of RyderMark.
One interesting thing is that there is a huge difference between 32-bit and 64-bit HDR lighting: the 64-bit option looks far more vivid than the 32-bit setting. Enabling features such as motion blur, shadowing and dynamic reflections really killed performance, as you will see in the graph below. The first test runs with no Anti-Aliasing or Anisotropic Filtering and with those three options disabled, while in the other two benchmark results those settings are enabled.
The two screenshots below show the difference between 32-bit and 64-bit HDR. The full-size versions are at 1680x1050 resolution for those who want to take a closer look at the details.