NVIDIA ACE Technology: Transforming Digital Avatars with AI

2024-01-10 02:30:00

At CES 2024 today, NVIDIA announced NVIDIA ACE (Avatar Cloud Engine), a technology that allows game, tool and middleware developers to apply generative artificial intelligence (AI) models to digital avatars in games and applications.

With this technology, ordinary NPCs can become dynamic, interactive characters that initiate conversations or guide players toward new quests.

Previously, NVIDIA partnered with the startup Convai to demonstrate how game developers will soon be able to use NVIDIA ACE for Games to create NPCs. Convai, which focuses on cutting-edge conversational AI for games, uses ACE modules in its end-to-end real-time avatar platform.

NVIDIA said that through ACE microservices, game and digital entertainment developers can build interactive digital avatars using AI models such as NVIDIA Audio2Face (A2F), which generates rich facial animation from an audio source, and NVIDIA Riva automatic speech recognition (ASR), which enables customized multilingual speech and translation applications for digital avatars.
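The article describes a pipeline of microservices: player speech is transcribed (Riva ASR), a dialogue model produces the NPC's reply, and Audio2Face turns the spoken reply into facial animation. The following is a minimal sketch of that flow; every function name and payload here is a hypothetical stand-in, not NVIDIA's actual API.

```python
# Hypothetical sketch of an ACE-style NPC turn. The three stage functions
# are illustrative stubs standing in for real microservice calls.

def speech_to_text(audio_chunk: bytes) -> str:
    """Stand-in for a Riva ASR call: audio in, transcript out.
    A real deployment would stream audio to an ASR endpoint."""
    return "where can I find the blacksmith"

def generate_reply(transcript: str) -> str:
    """Stand-in for a dialogue model producing the NPC's line."""
    return f"You asked: '{transcript}'. The blacksmith is by the east gate."

def reply_to_facial_animation(reply_text: str) -> dict:
    """Stand-in for Audio2Face: in practice A2F consumes synthesized
    speech audio and returns facial-animation data (e.g. blendshapes)."""
    return {"viseme_frames": len(reply_text.split()), "duration_s": 2.5}

def npc_turn(audio_chunk: bytes) -> dict:
    """One conversational turn: ASR -> dialogue -> facial animation."""
    transcript = speech_to_text(audio_chunk)
    reply = generate_reply(transcript)
    animation = reply_to_facial_animation(reply)
    return {"transcript": transcript, "reply": reply, "animation": animation}
```

The design point is that each stage is an independent service, so a studio could swap in its own dialogue model while keeping the ASR and animation stages.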

Through this service, game developers will be able to create realistic characters without having to script dialogue in advance, drawing players deeper into the game world. Developers currently adopting ACE include miHoYo, NetEase Games, Ourpalm, Tencent, Ubisoft, UneeQ, Charisma.AI, Convai and Inworld.

Using a microphone, players can hold real-time conversations with AI characters through Riva ASR, producing immersive, dynamic dialogue.

The A2F model can generate facial expressions and synchronized mouth movements in real time from audio alone, and can also be used to create facial animation offline.
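To make the audio-to-animation idea concrete, here is a toy illustration that maps per-frame loudness of an audio signal to a "mouth open" value between 0 and 1. This is only a conceptual sketch: A2F itself uses a neural model that produces full facial animation, not a simple amplitude mapping.

```python
import math

def mouth_open_curve(samples, frame_size=160):
    """Toy audio-driven lip animation: split the signal into frames and
    map each frame's RMS amplitude to a 0..1 'mouth open' value.
    `samples` is a list of floats in [-1.0, 1.0]."""
    frames = []
    for start in range(0, len(samples), frame_size):
        window = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in window) / len(window))
        frames.append(min(1.0, rms))  # clamp so the value stays in range
    return frames
```

Silence yields a closed mouth (0.0 per frame), while louder frames open it wider; a real model would instead infer visemes and expressions from the phonetic content of the audio.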

NVIDIA said that game developers and startups have begun to use NVIDIA’s generative artificial intelligence technology in their workflows.

  • GSC Game World, one of Europe’s best-known game developers, is using Audio2Face in the upcoming S.T.A.L.K.E.R. 2: Heart of Chornobyl.
  • Independent game developer Fallen Leaf used Audio2Face to create facial animations for the characters in Fort Solis, a third-person science-fiction thriller set on Mars.
  • Charisma.AI, which uses artificial intelligence to power virtual characters, uses Audio2Face to animate them in its conversation engine.

#NVIDIA #ACE #technology #released #assist #generating #virtual #digital #avatars #Txnet
