Game graphics have evolved at a breakneck pace over the past few decades. We're now at the point where certain materials, such as metal, glass, ceramic, or cloth, are rendered with accurate and believable detail. You instantly recognize the object you're looking at because the material has convincing depth and detail.
However, recreating the look and feel of a real-world material is time-consuming and computationally expensive, with the GPU doing the heavy lifting. If you've watched a recent game "compile shaders" after firing it up for the first time, this is why. The alternative would be watching the game stutter or freeze every time a new material or object enters a detailed 3D world.
At SIGGRAPH 2024, NVIDIA presented new technology called Real-Time Neural Appearance Models, a "complete real-time neural materials system" that aims to add film-like quality to real-time visuals while dramatically improving shading performance versus traditional rendering. Translation: In-game objects will look better than ever.
One of the examples presented by NVIDIA is a simple teapot made of ceramic with metallic properties and even fingerprints. "The AI material model accurately learns the details of the ceramic, the imperfect clear-coat glaze, fingerprints, smudges, and dust," NVIDIA writes. "Our neural model is capable of preserving these complex material properties while being faster to evaluate than traditional multilayered materials."
Considering that the reference textures can be as large as 16K by 16K, it's remarkable that the neural model can recreate that level of detail so efficiently.
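To make the idea concrete, here's a minimal sketch in plain NumPy of how a neural material of this kind can work: instead of fetching a huge stack of 16K material textures, the shader samples a small learned "latent" texture at the surface point and runs a tiny MLP that decodes those features into shading parameters. All sizes, names, and weights below are made-up illustrations (random values stand in for trained ones); they are not taken from NVIDIA's paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a tiny latent texture instead of a 16K x 16K parameter map.
LATENT_RES = 64     # latent texture resolution (vs. 16384 for the reference)
LATENT_DIM = 8      # learned features stored per texel
HIDDEN = 16         # hidden width of the small decoder MLP
OUT_PARAMS = 7      # e.g. albedo (3) + specular color (3) + roughness (1)

# Learned assets; random stand-ins here, trained per material in practice.
latent_texture = rng.standard_normal((LATENT_RES, LATENT_RES, LATENT_DIM))
W1 = rng.standard_normal((LATENT_DIM, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, OUT_PARAMS)) * 0.1
b2 = np.zeros(OUT_PARAMS)

def sample_latent(u, v):
    """Bilinearly sample the latent texture at UV coordinates in [0, 1)."""
    x, y = u * (LATENT_RES - 1), v * (LATENT_RES - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, LATENT_RES - 1), min(y0 + 1, LATENT_RES - 1)
    fx, fy = x - x0, y - y0
    top = latent_texture[y0, x0] * (1 - fx) + latent_texture[y0, x1] * fx
    bot = latent_texture[y1, x0] * (1 - fx) + latent_texture[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def decode_material(u, v):
    """Tiny MLP decoder: latent features -> shading parameters in [0, 1]."""
    z = sample_latent(u, v)
    h = np.maximum(W1.T @ z + b1, 0.0)             # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(W2.T @ h + b2)))  # sigmoid output

params = decode_material(0.25, 0.6)
print(params.shape)  # (7,) shading parameters for this surface point
```

The payoff is compression plus speed: here the latent texture holds 64 × 64 × 8 values instead of a full-resolution stack of parameter maps, and every shading point costs the same small fixed amount of MLP math.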
Even though it works as part of NVIDIA's real-time path tracer and uses Tensor Cores, the performance increase could lead to cutting-edge RT technology running faster and more efficiently on current GeForce RTX GPU hardware. And this isn't something that's GeForce RTX 4090-only, as NVIDIA's paper includes data captured with a GeForce RTX 2080 Ti.
The paper, which is quite technical, includes data showing that rendering times stay roughly constant as more materials are added to a scene, with shading performance up to 24 times faster than traditional shaders built from multiple layers of complexity. Yes, this could be the next DLSS-like game-changer for game graphics.
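A toy cost model helps show why render time can stay flat: an analytic multilayer material gets more expensive with every layer you stack, while a neural material is always one fixed-size MLP inference no matter how complex the learned appearance is. The op counts below are illustrative assumptions, not figures from NVIDIA's paper.

```python
# Illustrative cost model only; the op counts are assumptions, not measured data.

def layered_cost(num_layers, ops_per_layer=400):
    """Analytic multilayer material: cost grows with every added layer."""
    return num_layers * ops_per_layer

def neural_cost(latent_dim=8, hidden=16, out_params=7):
    """Neural material: one fixed-size MLP inference (multiply-adds),
    regardless of how many 'layers' of appearance the network has learned."""
    return 2 * (latent_dim * hidden + hidden * out_params)

for layers in (1, 4, 10):
    print(f"{layers:2d} layers: analytic={layered_cost(layers):5d} ops, "
          f"neural={neural_cost()} ops")
```

Under these assumptions the analytic cost scales from 400 to 4,000 ops as layers pile up, while the neural evaluation stays at 480 ops, which is the shape of the advantage the paper reports.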
Injecting neural networks into the rendering pipeline is fascinating and groundbreaking. With results like this, it's a clear sign that AI will become a crucial part of real-time graphics moving forward. If nothing else, path tracing in future games will look better and run substantially faster.
And if you're not a fan of the performance hit that comes with ray tracing, NVIDIA notes that its Real-Time Neural Appearance Models technology is also well suited, and more efficient, for creating "baked" lighting and material properties offline for real-time graphics running on lower-end hardware. So it will save game developers time, too: a win-win.