
Nvidia is using AI to turn game characters into chatbots


Tech demos at GDC show how generative AI is being used to voice, animate, and write narratives for video game characters.


Nvidia is showing off how developers have started using its AI “digital human” tools to voice, animate, and generate dialogue for video game characters. At the Game Developers Conference on Monday, the company released a clip of Covert Protocol, a playable tech demo that showcases how its AI tools can allow NPCs to respond in unique ways to player interactions, generating new responses that fit the live gameplay.

In the demo, players take on the role of a private detective, completing objectives based on conversations with AI-powered NPCs. Nvidia claims that each playthrough is “unique,” with players’ real-time interactions leading to different game outcomes. John Spitzer, Nvidia’s vice president of developer and performance technologies, says the company’s AI tech “may power the complex animations and conversational speech required to make digital interactions feel real.”

Covert Protocol was built in collaboration with Inworld AI, an AI gaming startup, and uses Nvidia’s Avatar Cloud Engine (ACE) technology — the same tech that powered the futuristic ramen shop demo that Nvidia released last May. The new Covert Protocol demo doesn’t show how effective these AI-powered NPCs are for real gameplay, instead showing a selection of clips of NPCs spitting out different voice lines. The line delivery and lip-syncing animations both feel robotic, as though an actual chatbot were talking at you through the screen.

Inworld says it’s planning to release Covert Protocol’s source code “in the near future” to encourage other developers to adopt Nvidia’s ACE digital human tech. Inworld also announced a partnership with Microsoft in November 2023 to help develop Xbox tools for creating AI-powered characters, stories, and quests.

Nvidia’s Audio2Face tech was also showcased in a clip of the upcoming MMO World of Jade Dynasty, which demonstrated a character lip-syncing to both English and Mandarin Chinese speech. The idea is that Audio2Face will make it easier to ship games in multiple languages without manually reanimating characters for each one. Another clip, of the upcoming action melee game Unawake, demonstrates how Audio2Face can be used to create facial animations during both cinematics and gameplay.

These tech demos may be enough to convince game developers to experiment with adding AI-powered NPCs to their titles, but conversationally, at least, things don’t seem to have progressed much: the characters in Covert Protocol don’t feel any more like “real people” than those in the previous Kairos demos. None of that is likely to soothe disgruntled video game voice actors who are concerned about how AI adoption will affect their careers and livelihoods.

Jess Weatherbed
