Leading AI developers use NVIDIA’s suite of technologies to create realistic avatars and dynamic characters for everything from games to healthcare, financial services and retail applications.
GDC—NVIDIA today announced that leading AI application developers across a wide range of industries are using NVIDIA digital human technologies to create lifelike avatars for business applications and dynamic game characters. The results are on display at GTC, the global AI conference held this week in San Jose, California, and can be seen in technology demonstrations from Hippocratic AI, Inworld AI, UneeQ and more.
NVIDIA Avatar Cloud Engine (ACE) for speech and animation, NVIDIA NeMo™ for language, and NVIDIA RTX™ for ray-traced rendering are the building blocks that allow developers to create digital humans capable of interacting in natural language using AI, making conversations more realistic and engaging.
“NVIDIA offers developers a set of world-class AI-driven technologies for digital human creation,” said John Spitzer, vice president of developer and performance technology at NVIDIA. “These technologies can power the complex animations and conversational speech needed to make digital interactions feel real.”
World-class digital human technologies
The suite of digital human technologies includes AI-powered language, speech, animation and graphics:
- NVIDIA ACE — technologies that help developers bring digital humans to life, with facial animation powered by NVIDIA Audio2Face™ and speech powered by NVIDIA Riva automatic speech recognition (ASR) and text-to-speech (TTS). ACE microservices are flexible, allowing models to run in the cloud or on a PC depending on the capabilities of the local GPU, to ensure the best user experience (see the sketch after this list for how these pieces fit together).
- NVIDIA NeMo — an end-to-end platform that enables developers to deliver enterprise-ready generative AI models, with precise data curation, industry-leading customization, retrieval-augmented generation and accelerated performance.
- NVIDIA RTX — a set of rendering technologies, such as RTX Global Illumination (RTXGI) and DLSS 3.5, that enable real-time path tracing in games and applications.
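To make the flow between these building blocks concrete, the sketch below walks through a single conversational turn: speech recognition, language generation, speech synthesis and audio-driven facial animation. It is illustrative only; every function is a hypothetical placeholder standing in for the corresponding NVIDIA microservice (Riva ASR/TTS, a NeMo-customized LLM, Audio2Face), not an actual SDK call.

```python
"""Illustrative only: a minimal conversational-avatar loop wired the way the
building blocks above suggest. All function names are hypothetical placeholders
for the corresponding NVIDIA microservices, not real SDK calls."""

from dataclasses import dataclass
from typing import List


@dataclass
class Reply:
    text: str                # language model response to speak
    audio: bytes             # synthesized speech (TTS output)
    blendshapes: List[float]  # facial-animation weights driven by the audio


def transcribe(audio: bytes) -> str:
    """Placeholder for automatic speech recognition (e.g. NVIDIA Riva ASR)."""
    return "hello, can you help me?"


def generate_reply(prompt: str) -> str:
    """Placeholder for the language model (e.g. a NeMo-customized LLM,
    optionally using retrieval-augmented generation over enterprise data)."""
    return f"You said: {prompt} How can I assist further?"


def synthesize(text: str) -> bytes:
    """Placeholder for text-to-speech (e.g. NVIDIA Riva TTS)."""
    return text.encode("utf-8")  # stand-in for real audio samples


def animate(audio: bytes) -> List[float]:
    """Placeholder for audio-driven facial animation (e.g. NVIDIA Audio2Face)."""
    return [0.0] * 52  # stand-in for per-frame blendshape weights


def handle_turn(mic_audio: bytes) -> Reply:
    """One conversational turn: speech in -> language -> speech + animation out."""
    user_text = transcribe(mic_audio)
    reply_text = generate_reply(user_text)
    reply_audio = synthesize(reply_text)
    return Reply(reply_text, reply_audio, animate(reply_audio))


if __name__ == "__main__":
    turn = handle_turn(b"\x00\x01")  # fake microphone buffer
    print(turn.text)
```

In a real deployment, each placeholder would be replaced by a call to the corresponding cloud- or PC-hosted microservice, with the routing decision driven by the capabilities of the local GPU as described above.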
Building Blocks for Digital Humans and Virtual Assistants
To showcase the new capabilities of its digital human technologies, NVIDIA worked across industries with leading developers, such as Hippocratic AI, Inworld AI and UneeQ, on a series of new demonstrations.
Hippocratic AI created a safety-focused, LLM-powered, task-specific healthcare agent. The agent calls patients on the phone, follows up on care coordination tasks, gives pre-operative instructions, performs post-discharge management and much more. For GTC, NVIDIA collaborated with Hippocratic AI to extend its solution with NVIDIA ACE microservices, NVIDIA Audio2Face, NVIDIA Animation Graph and the NVIDIA Omniverse™ Streaming Client to show the potential of a generative AI healthcare agent avatar.
“Our digital assistants provide useful, timely and accurate information to patients around the world,” said Munjal Shah, co-founder and CEO of Hippocratic AI. “NVIDIA ACE technologies bring them to life with cutting-edge visuals and realistic animations that help them better connect with patients.”
UneeQ is an autonomous digital human platform specializing in creating AI-powered avatars for customer service and interactive applications. Its digital humans represent brands online, communicating with customers in real time to give them confidence in their purchases. UneeQ integrated the NVIDIA Audio2Face microservice into its platform and combined it with Synanim ML to create highly realistic avatars for better customer experience and engagement.
“UneeQ combines NVIDIA animation AI with our own Synanim ML synthetic animation technology to deliver real-time digital human interactions that are emotionally responsive and offer dynamic experiences powered by conversational AI,” said Danny Tomsett, founder and CEO of UneeQ.
Bringing dynamic non-playable characters to games
NVIDIA ACE is a suite of technologies designed to bring game characters to life. Covert Protocol is a new technology demonstration, created by Inworld AI in partnership with NVIDIA, that pushes the boundaries of what character interactions in games can be. Inworld’s AI engine integrates NVIDIA Riva for accurate speech synthesis and NVIDIA Audio2Face to deliver realistic facial performances.
Inworld’s AI engine takes a multimodal approach to non-playable character (NPC) performance, bringing together cognition, perception and behavior systems for an immersive narrative with stunning RTX-rendered characters set in a beautifully designed environment.
“The combination of NVIDIA ACE microservices and Inworld Engine allows developers to create digital characters capable of generating dynamic narratives, opening up new possibilities for how players can decipher, infer and play,” said Kylan Gibbs, CEO of Inworld AI.
Game publishers around the world are evaluating how NVIDIA ACE can improve the gaming experience.
Developers in healthcare, gaming, financial services, media and entertainment, and retail are adopting ACE
Top game and digital human developers are pioneering how ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and apps.
Developers and platforms adopting ACE include Convai, CyberAgent, Data Monsters, Deloitte, Hippocratic AI, IGOODI, Inworld AI, Media.Monks, miHoYo, NetEase Games, Perfect World, Openstream, OurPalm, Quantiphi, Rakuten Securities, Slalom, SoftServe, Tencent, Top Health Tech, Ubisoft, UneeQ and Union Avatars.
More information about NVIDIA ACE is available at https://developer.nvidia.com/ace. Platform developers can integrate the full suite of digital human technologies or individual microservices into their product offerings.
Developers can start their journey with NVIDIA ACE by applying to the early access program to get access to in-development AI models. To explore available models, developers can evaluate and access NVIDIA NIM, a set of easy-to-use microservices designed to accelerate the deployment of generative AI, for Riva and Audio2Face at ai.nvidia.com today.
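As a rough illustration of what evaluating a hosted microservice can look like, the snippet below posts a request to a placeholder endpoint. The URL, model identifier and payload fields are assumptions made for the example, not the documented Riva or Audio2Face NIM API; the exact request format is provided on each model’s page on ai.nvidia.com.

```python
"""Illustrative only: calling a hosted NIM-style microservice over REST.
The endpoint, model name and payload below are placeholders, not a real
NVIDIA API; copy the exact request format from the model's page."""

import os

import requests

API_KEY = os.environ.get("NVIDIA_API_KEY", "nvapi-...")    # key issued via ai.nvidia.com
ENDPOINT = "https://example.invalid/v1/speech/synthesize"  # placeholder URL

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "placeholder-riva-tts",            # hypothetical model identifier
        "text": "Hello! How can I help you today?",
        "voice": "en-US-female-1",                  # hypothetical voice name
    },
    timeout=30,
)
response.raise_for_status()

# Assumes the service returns raw audio bytes; a real response may instead be JSON.
with open("reply.wav", "wb") as f:
    f.write(response.content)
```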