Nvidia’s new generative AI will revolutionize game development

AI has been all the rage ever since OpenAI launched ChatGPT, and companies like Ubisoft are looking to incorporate AI technology into game development.

A new AI technology that Nvidia recently presented is expected to revolutionise the creation of character models. During the company’s keynote address at Computex 2023, Nvidia CEO Jensen Huang introduced the “Omniverse Avatar Cloud Engine”, or ACE, which can produce live, interactive AI NPCs with voiced conversations and facial animations.

ACE, which Nvidia describes as a “suite of real-time solutions”, lets game developers build interactive characters that react to player actions, complete with synchronised lip-sync, voices, and facial expressions.

In the demo, the player converses with the NPC using their own voice. The NPC interprets what the player says and directs them to a new side mission to complete.
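Nvidia has not published the demo’s internal dialogue logic, but conceptually the quest hand-off amounts to mapping what the player said to a scripted response. The sketch below is purely illustrative; every quest name and string in it is a made-up placeholder, not content from the demo.

```python
# Illustrative only: a toy intent-to-quest hand-off, not the demo's actual logic.
# All quest names and reply strings are hypothetical placeholders.

SIDE_QUESTS = {
    "missing_shipment": "Find the missing supply shipment for the ramen shop.",
    "local_trouble": "Check on the trouble brewing in the back alleys nearby.",
}


def handle_player_line(transcribed_text: str) -> str:
    """Return the NPC's reply, offering a side quest when the topic matches."""
    text = transcribed_text.lower()
    if "shipment" in text or "supplies" in text:
        return f"Actually, there is something you could do. {SIDE_QUESTS['missing_shipment']}"
    if "trouble" in text or "danger" in text:
        return f"Since you ask... {SIDE_QUESTS['local_trouble']}"
    return "Sorry, I don't know much about that."
```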

Nvidia built the tech demo in collaboration with Convai (Conversational AI for Virtual Worlds), with the goal of “helping optimise and integrate ACE for Games modules into an immersive and dynamic interaction with a non-playable character named Jin.” The demo is further enhanced by ray tracing, while NVIDIA DLSS 3 boosts performance.

Nvidia ACE can run locally or remotely in the cloud. The ACE package includes Nvidia’s NeMo tools for deploying large language models (LLMs), Riva for speech-to-text and text-to-speech, and other generative AI technologies.
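To make that flow concrete, here is a minimal sketch of how a conversational-NPC pipeline built from those pieces could be wired together. The `ConversationalNpc` class and every method on the injected services are hypothetical placeholders, not Nvidia’s actual NeMo or Riva APIs; the sketch only shows the order of the stages the article describes: speech-to-text, LLM response generation, text-to-speech, and facial animation.

```python
# Hypothetical sketch of an ACE-style conversational NPC pipeline.
# None of these classes or method signatures are Nvidia's real NeMo/Riva APIs;
# they only illustrate the order of the stages described above.

from dataclasses import dataclass


@dataclass
class NpcReply:
    text: str          # what the NPC says
    audio: bytes       # synthesized speech
    animation: dict    # lip-sync / facial animation data


class ConversationalNpc:
    def __init__(self, asr, llm, tts, animator, persona: str):
        self.asr = asr              # speech-to-text service (Riva-like, assumed)
        self.llm = llm              # large language model (NeMo-deployed, assumed)
        self.tts = tts              # text-to-speech service (assumed)
        self.animator = animator    # audio-driven facial animation (assumed)
        self.persona = persona      # background prompt describing the character

    def respond(self, player_audio: bytes) -> NpcReply:
        # 1. Transcribe the player's spoken line.
        player_text = self.asr.transcribe(player_audio)

        # 2. Generate an in-character reply with the LLM.
        reply_text = self.llm.generate(
            prompt=f"{self.persona}\nPlayer: {player_text}\nNPC:"
        )

        # 3. Synthesize the reply as speech.
        reply_audio = self.tts.synthesize(reply_text)

        # 4. Derive lip-sync and facial animation from the audio.
        animation = self.animator.animate(reply_audio)

        return NpcReply(text=reply_text, audio=reply_audio, animation=animation)
```

Whether the stages run on the local GPU or in the cloud is a deployment choice; the pipeline order stays the same either way.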

The demo, developed in Unreal Engine 5, also showcases Nvidia’s ray-tracing technology. The futuristic ramen shop is visually gorgeous and completely immersive, particularly when viewed in 4K.

The demo’s aesthetics are spectacular, but the dialogue between the player and the AI NPC is far from satisfactory. The AI character’s tone and facial expressions clearly still need improvement. Truth be told, even the player’s voice comes off as artificial and robotic.

The technology does, however, hold intriguing promise. According to The Verge, Nvidia’s VP of GeForce Platform, Jason Paul, said during the Computex pre-briefing that the technology can be used for more than one character and could theoretically allow AI NPCs to talk to each other, though he admitted he had not seen that put to the test.

As the technology matures, Nvidia ACE should see wider adoption. It is easy to picture Night City in Cyberpunk 2077 or Los Santos in GTA 5 populated with NPCs that players can interact and converse with. The technology could open up many new opportunities for creating immersive games and worlds, along with some interesting stories and original experiences.

Source: msn.com
