With AI prevalent in the high-performance computing space, and NVIDIA's cutting-edge hardware pushing the company's worth to new heights, CEO Jensen Huang began his Computex 2023 keynote with a slice of RTX-meets-AI: a glimpse at the potential future of gaming. It's called NVIDIA Avatar Cloud Engine (ACE), an AI model that "transforms games by bringing intelligence to non-playable characters (NPCs) through AI-powered natural language interactions."
Imagine a Cyberpunk 2077-style Night City open world brought to life in Unreal Engine 5, with stunning ray tracing effects, where a futuristic ramen bar is home to global illumination and lighting as breathtaking as anything seen in CD Projekt Red's new path-traced RT Overdrive mode.
The difference? The person behind the bar is brought to life with generative AI language models and can respond to your voice in a way reminiscent of real-world role-playing and, funnily enough, of the earliest days of gaming with tabletop Dungeons & Dragons.
Check it out.
Is it perfect? No, but it is a stunning glimpse at a future where NPCs are given backstories and personalities and can engage in realistic conversations based on player-generated input. No more dialogue trees, just an everyday conversation between an adventurer and a potential quest-giver. We could see this sort of tech making its way to The Elder Scrolls VI, part of a series that has always looked at adding dynamic realism to even the most inconsequential NPCs roaming around Tamriel.
"Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games," said John Spitzer, vice president of developer and performance technology at NVIDIA. "Building on our expertise in AI and decades of experience working with game developers, NVIDIA is spearheading the use of generative AI in games."
NVIDIA collaborated with startup Convai to create the Unreal Engine 5 demo, which, of course, leverages NVIDIA's most well-known bit of AI tech for gaming: DLSS. NVIDIA notes that developers will be able to make use of NVIDIA ACE for Games soon. Perhaps one of the most impressive parts of the demo is the low latency of the responses, which adds a definite dose of naturalism to the conversation.
ACE makes use of NVIDIA Omniverse Audio2Face, which generates facial expressions on NPCs to match voice and speech tracks. That tech is also set to appear in the highly anticipated S.T.A.L.K.E.R. 2: Heart of Chornobyl from GSC Game World, and the indie sci-fi thriller Fort Solis from Fallen Leaf.