NVIDIA’s AI has already changed gaming, and it’s still heating up

Since introducing the GeForce RTX 20 Series GPUs and their built-in Tensor Cores, NVIDIA has pushed the possibilities of AI in gaming to the fore. A growing number of features and games now use AI to render their worlds and deliver high-quality graphics. And with the latest NVIDIA ACE technology, it’s clear that NVIDIA plans to make AI even more integral to gaming.

The move to AI started with NVIDIA DLSS (Deep Learning Super Sampling), which lets players enjoy high-fidelity visuals without the frame-rate hit that typically comes with them. NVIDIA achieved this by training a neural network on high-resolution gameplay so it learns how to reconstruct high-quality frames from low-resolution input, with the Tensor Cores running that inference in real time. Now DLSS multiplies resolution and frame rate with AI.
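To see what that reconstruction step replaces, here is a toy illustration of the naive alternative: nearest-neighbor upscaling, where each low-resolution pixel is simply duplicated. Nothing here resembles the real DLSS model; it only shows the gap a trained network is meant to fill.

```python
def upscale_nearest(frame, factor=2):
    """Naive nearest-neighbor upscaling: each pixel becomes a factor x factor block.
    DLSS's advantage is that a trained network fills in this detail far more
    intelligently than simple duplication can."""
    out = []
    for row in frame:
        # Repeat each pixel horizontally...
        wide = [px for px in row for _ in range(factor)]
        # ...then repeat the widened row vertically.
        out.extend([list(wide) for _ in range(factor)])
    return out

low_res = [[10, 20],
           [30, 40]]
print(upscale_nearest(low_res))
```

Duplication preserves no detail beyond what the low-resolution frame contains; a learned upscaler infers plausible high-frequency detail instead.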

Similarly, NVIDIA Freestyle and RTX Remix offer tools to enhance game visuals with AI. Freestyle lets users apply filters or even convert the output of non-HDR games to HDR in real time. Meanwhile, RTX Remix does for in-game assets what DLSS did for output resolution: it allows modders to take asset files and intelligently convert them into higher-quality versions, producing fast remasters of classic games.

While DLSS, NVIDIA Freestyle, and RTX Remix are all changing the visual experience of games, NVIDIA ACE is intended to change the way people interact with games themselves.

NVIDIA ACE puts an AI toolset behind in-game NPCs (non-player characters). Just as the power of large language models has brought chatbots to life in profound ways, NVIDIA ACE can bring a new level of dynamism to NPCs, helping gamers find more engagement during their games.

NVIDIA ACE’s Riva speech and Audio2Face animation models enable seamless interaction between players and NPCs. Riva takes voice prompts from players and transcribes them for the game. AI-powered NPCs respond in natural language that can be processed by Audio2Face to synchronize facial animations with the in-game character’s speech. Developers can run this entire pipeline, from user input to NPC response, in the cloud or even locally on the user’s PC.
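The loop described above — player audio in, transcription, language-model response, facial animation out — can be sketched in plain Python. Everything in this sketch is a hypothetical stand-in: the function names, data types, and logic are illustrative only, not the actual Riva or Audio2Face APIs.

```python
from dataclasses import dataclass

@dataclass
class NPCReply:
    text: str            # natural-language response from the NPC's language model
    visemes: list        # mouth-shape keys an Audio2Face-style animator would use

def transcribe(audio: bytes) -> str:
    """Stand-in for Riva speech recognition: player audio -> text."""
    # A real ASR model would decode a waveform; here we just decode bytes.
    return audio.decode("utf-8")

def generate_response(player_text: str) -> str:
    """Stand-in for the NPC's language model."""
    return f"You said: '{player_text}'. Welcome, traveler!"

def animate(reply_text: str) -> list:
    """Stand-in for Audio2Face: derive one toy 'viseme' per word."""
    return ["mouth_open" if w[0].lower() in "aeiou" else "mouth_closed"
            for w in reply_text.split()]

def npc_pipeline(audio: bytes) -> NPCReply:
    """Run the full loop: audio -> transcript -> reply -> animation keys."""
    text = generate_response(transcribe(audio))
    return NPCReply(text=text, visemes=animate(text))

reply = npc_pipeline(b"any rumors in town?")
print(reply.text)
```

As the article notes, each stage of this pipeline can run in the cloud or locally; in a real integration, each stub above would be a network or SDK call rather than a pure function.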

NVIDIA NeMo provides pre-trained models and frameworks that let developers build the language models serving as the brains behind NPCs. Developers can choose from the Nemotron family of pre-trained language models and add guardrails (the safety system that keeps a language model on track and appropriate) using NeMo Guardrails. The frameworks make it easy to optimize, tune, and deploy these models to NVIDIA GPUs in the cloud.
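A guardrail is essentially a filter layered around the model's inputs and outputs. As a toy illustration of the idea (this is not NeMo Guardrails itself; the rules, names, and canned replies are invented):

```python
# Invented example rules: topics this NPC should never discuss.
BLOCKED_TOPICS = {"real-world politics", "cheat codes"}

def within_rails(message: str) -> bool:
    """Return True if the message avoids all blocked topics."""
    lowered = message.lower()
    return not any(topic in lowered for topic in BLOCKED_TOPICS)

def guarded_reply(player_text: str, model) -> str:
    """Apply input and output rails around a language-model call."""
    if not within_rails(player_text):          # input rail
        return "Let's keep our talk to matters of the realm."
    reply = model(player_text)
    if not within_rails(reply):                # output rail
        return "I had best not speak of that."
    return reply

# A stand-in "model" for demonstration:
fake_model = lambda text: "The blacksmith is up the hill."
print(guarded_reply("Where is the blacksmith?", fake_model))
```

Real guardrail frameworks express such rules declaratively and handle far subtler cases than keyword matching, but the wrap-the-model structure is the same.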

As complicated as it may seem, AI will play a huge role in the future of gaming, and NVIDIA ACE is part of that. To help you follow the latest developments, NVIDIA has created the AI Decoded series, which you can check out to stay up to date on what’s in store at the intersection of gaming and AI.
