News & Analysis

NVIDIA and developers pioneer lifelike digital characters for games

9th January 2024
Paige West

NVIDIA has announced production microservices for its NVIDIA Avatar Cloud Engine (ACE), enabling developers in the gaming, tools, and middleware sectors to integrate generative AI models into the digital avatars in their games and applications.

The introduction of ACE microservices empowers developers to create interactive avatars utilising AI models like NVIDIA Omniverse Audio2Face (A2F) and NVIDIA Riva automatic speech recognition (ASR). A2F is designed to generate expressive facial animations from audio sources, while Riva ASR aids in building customisable multilingual speech and translation applications using generative AI.
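To illustrate how these pieces fit together, the sketch below shows the general flow such an avatar pipeline implies: player speech is transcribed (the role Riva ASR plays), a reply is generated, and the reply audio drives facial animation (the role Audio2Face plays). This is a conceptual Python sketch only; the function names, data types, and return values are hypothetical placeholders for illustration, not NVIDIA's actual Riva or Audio2Face client APIs.

```python
# Conceptual sketch of an ACE-style avatar turn: speech -> text -> reply -> animation.
# All service calls below are hypothetical stand-ins, NOT real NVIDIA client APIs.

from dataclasses import dataclass


@dataclass
class AnimationFrame:
    timestamp_ms: int
    blendshape_weights: dict  # e.g. {"jawOpen": 0.4, "mouthSmile": 0.1}


def transcribe_speech(audio_bytes: bytes) -> str:
    """Placeholder for an ASR microservice call (the role Riva ASR fills)."""
    return "Where can I find the blacksmith?"


def generate_reply(player_text: str) -> tuple[str, bytes]:
    """Placeholder for dialogue generation plus text-to-speech."""
    reply_text = "Head east past the market; you can't miss the forge."
    reply_audio = b"\x00" * 16000  # stand-in for synthesised PCM audio
    return reply_text, reply_audio


def animate_from_audio(audio_bytes: bytes) -> list[AnimationFrame]:
    """Placeholder for an Audio2Face-style call: audio in, facial animation out."""
    return [AnimationFrame(timestamp_ms=t * 33, blendshape_weights={"jawOpen": 0.3}) for t in range(3)]


def npc_turn(player_audio: bytes) -> None:
    text = transcribe_speech(player_audio)          # speech -> text
    reply_text, reply_audio = generate_reply(text)  # text -> reply text + audio
    frames = animate_from_audio(reply_audio)        # audio -> facial animation
    print(f"Player: {text}")
    print(f"NPC:    {reply_text} ({len(frames)} animation frames)")


if __name__ == "__main__":
    npc_turn(b"\x00" * 16000)  # stand-in for captured microphone audio
```

In a real integration, each placeholder would be replaced by a call to the corresponding microservice, with the animation frames streamed to the game engine's character rig.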

Several prominent developers, including Charisma.AI, Convai, Inworld, miHoYo, NetEase Games, Ourpalm, Tencent, Ubisoft, and UneeQ, are already incorporating ACE into their development processes.

Keita Iida, NVIDIA's Vice President of Developer Relations, commented: “Generative AI technologies are transforming virtually everything we do, and that also includes game creation and gameplay. NVIDIA ACE opens up new possibilities for game developers by populating their worlds with lifelike digital characters while removing the need for pre-scripted dialogue, delivering greater in-game immersion.”

The adoption of ACE and generative AI technologies is expected to revolutionise interactions between players and non-playable characters (NPCs) in games and applications. For instance, Zhipeng Hu, Senior Vice President of NetEase and head of its LeiHuo business group, stated: “For years NVIDIA has been the pied piper of gaming technologies, delivering new and innovative ways to create games. NVIDIA is making games more intelligent and playable through the adoption of gaming AI technologies, which ultimately creates a more immersive experience.”

Similarly, Tencent Games expressed enthusiasm for the advancement, saying: “This is a milestone moment for AI in games. NVIDIA ACE and Tencent Games will help lay the foundation that will bring digital avatars with individual, lifelike personalities, and interactions to video games.”

Historically, NPCs in games have offered only a limited set of pre-programmed responses and animations, resulting in short, transactional interactions that players often overlook. With generative AI, these characters can now exhibit more dynamic behaviours and interactions.

Purnendu Mukherjee, Founder and CEO at Convai, remarked: “Generative AI-powered characters in virtual worlds unlock various use cases and experiences that were previously impossible. Convai is leveraging Riva ASR and A2F to enable lifelike NPCs with low-latency response times and high-fidelity natural animation.”

NVIDIA's collaboration with Convai has enhanced the NVIDIA Kairos demo, which debuted at Computex, with new features and ACE microservices. In this updated version, NPCs are more interactive: they can converse among themselves, recognise and interact with objects, guide players, and navigate through worlds.

The Audio2Face and Riva automatic speech recognition microservices are currently available, allowing interactive avatar developers to integrate these models into their development workflows.
