A realtime game audio primer

By Scott Simon

July 20th 2012 at 10:00AM

iZotope's Scott Simon offers advice on realtime audio processing, an approach vital to Forza 4

Creating an immersive and cinematic gaming experience is on the mind of every producer. In many ways, the game industry parallels the movie industry, with huge studio budgets, massive development teams, and sophisticated tools.

As with any Hollywood movie, one of the most critical steps in creating a game is the process of matching the audio effects to their respective on-screen visual representations.

Also like Hollywood, when it comes to audio, each game has its own set of challenges due to the scarce resources left over after all the visual components are realised. One option for publishers is to use realtime audio Digital Signal Processing (DSP) to dynamically improve the audio's realism, enhance intensity, and elevate the gaming experience.

The advantage of realtime audio processing is its ability to generate a greater variety of sounds (vital to engaging gameplay) and to shape the sound in response to what's happening in the game. Realtime processing can also stretch a limited number of audio samples by giving each one more dimensions, easing the pressure on your audio budget.
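To make that concrete, here is a toy Python sketch of the idea, not any particular engine's API: one stored waveform (represented here as a plain list of floats) yields a fresh variant each time it is triggered, by randomising pitch and gain at playback.

```python
import random

def vary(sample, pitch=1.0, gain=1.0):
    """Resample by linear interpolation (pitch) and scale (gain)."""
    n = int(len(sample) / pitch)
    out = []
    for i in range(n):
        pos = i * pitch
        j = int(pos)
        frac = pos - j
        a = sample[min(j, len(sample) - 1)]
        b = sample[min(j + 1, len(sample) - 1)]
        out.append(gain * ((1 - frac) * a + frac * b))
    return out

# One stored sample becomes a new runtime variant on every trigger:
base = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
variant = vary(base, pitch=random.uniform(0.9, 1.1), gain=random.uniform(0.8, 1.0))
```

A real engine would do this per audio frame against PCM buffers, but the principle is the same: one asset in memory, many sounds at runtime.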

Diving In

How does this work in practice? It all begins at the creative level. In pre-production, audio leads assemble concepts for how to shape the audio experience. An audio lead or audio director is typically given a spreadsheet that outlines memory and CPU budgets.

To remain within that budget, audio departments have historically been tasked with cutting large numbers of files, often reducing the number of sound file variations. To keep the game running efficiently, one might be asked to stream a single sound from the game's disc as opposed to the 20 originally planned.

From a creative standpoint, those audio asset variants had a purpose. Without realtime processing, an audio lead might be asked to randomise parameters such as pitch or volume across multiple sound files in an attempt to create more realistic gameplay.

The problem here is that multiple samples loaded simultaneously occupy more memory and cannot respond dynamically while the game is being played, so the game audio lead has to anticipate every possible sound. Here is where realtime audio DSP can help: new technology lets designers add filters, distortions, and reverbs that react to in-game actions as they happen, creating a more cinematic experience.

Let's take a step back and look at what a recorded sound is – a static snapshot of a sound source at a given time. Before realtime audio DSP, you had to decide up front how you wanted that sound to take shape. For example, you would record the sound of rain required for a scene that takes place in a jungle. When your character ducks under an umbrella for cover, your rain should sound different, right?

Of course. Sound designers would take that rain file and create multiple variants to help sell the scene. This means that for one scene, you might have a dozen different rain files loaded into the system's memory. Realtime audio DSP lets you load a single file and apply filtering, reverb, distortion, and so on as the scene demands.
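The umbrella example can be sketched with a minimal one-pole lowpass filter in Python – a toy stand-in for the kind of realtime filtering an engine would apply; the sample data and alpha values are purely illustrative.

```python
def lowpass(signal, alpha):
    """Simple one-pole lowpass; higher alpha (0..1) means more muffling."""
    out, prev = [], 0.0
    for x in signal:
        prev = alpha * prev + (1 - alpha) * x
        out.append(prev)
    return out

rain = [0.8, -0.6, 0.9, -0.7, 0.5, -0.8]    # stand-in for a rain sample
open_air = lowpass(rain, alpha=0.1)          # nearly unfiltered
under_umbrella = lowpass(rain, alpha=0.8)    # high frequencies damped
```

One rain file in memory, two (or twenty) acoustic situations at runtime, driven by a single parameter.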

Let's Talk Tools

As middleware engines have become ubiquitous for many facets of game development, audio middleware has played a key role. Not only have audio middleware engines created a unified toolset, but the platform has also opened the marketplace for third-party audio developers. Manufacturers like iZotope, McDSP, and AudioEase are developing widely used in-game tech, the likes of which had previously been reserved for post-production desktop systems.

Let's explore what's available and how it helps. Distortion is a guitarist's dream, but also a key asset for sound designers. Many of us know distortion as the sound that made Jimi Hendrix famous, but it's much more: that 'walkie-talkie' effect or boom-box radio sound is easily created with distortion.

Imagine this scene – you're playing a first-person shooter and your squad leader is giving you tactical instructions. In the huddle, your team leader is directly in front of you, but when you head off to your assignment you're given instructions by walkie-talkie. There's no need to recreate and render dialogue tracks as the game experience changes: make use of the realtime parameter controls for distortion and/or box modellers to get the desired effect.
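A crude version of that walkie-talkie treatment is just overdrive plus hard clipping; this Python sketch is illustrative only, not any shipping engine's DSP.

```python
def walkie_talkie(signal, drive=5.0, limit=0.3):
    """Overdrive the signal, then hard-clip it into a narrow range;
    the flattened waveform reads as a cheap radio speaker."""
    return [max(-limit, min(limit, drive * x)) for x in signal]

clean_line = [0.5, -0.4, 0.2, -0.6]      # stand-in for a dialogue sample
radio_line = walkie_talkie(clean_line)   # same line, 'over the radio'
```

In practice `drive` and `limit` would be the realtime parameters the article describes – tied to game state, so the same dialogue asset plays clean in the huddle and clipped over the radio.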

Space

Reverb refers to the way sounds reflect off surrounding objects. The brain is brilliant at decoding this information, using it to cue where you are in relation to those objects. Today, audio teams have an amazing tool called 'convolution reverb'. To simplify, these algorithms convolve a sound with a recorded impulse response of a real space, seamlessly placing your sound within that space.
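Under the hood, convolution reverb multiplies-and-accumulates the dry signal against the impulse response (IR). Here is a direct, unoptimised Python version – real engines use FFT-based convolution for speed, and the IR below is a made-up toy, not a measured space.

```python
def convolve(dry, impulse_response):
    """Direct convolution: each input sample triggers a scaled copy of the IR."""
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out

# A toy IR: direct sound followed by two decaying echoes
ir = [1.0, 0.0, 0.4, 0.0, 0.15]
wet = convolve([1.0, 0.0, 0.0], ir)
```

Swap in an IR recorded at Wembley Arena or the Grand Canyon and the same dry asset takes on that space's acoustic signature.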

What problems does this solve in-game? Haven't you always wanted to hear what your voice sounds like inside the Grand Canyon or Wembley Arena? Fewer files often mean better memory allocation, and the result is also more convincing gameplay.

As your character leaves Wembley Arena and enters the Grand Canyon, real-time parameter changes help to create a smooth transition and make gameplay more interactive.
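One way such a transition might be driven is a linear parameter ramp evaluated frame by frame; the decay-time figures here are invented for illustration.

```python
def ramp(start, end, frames):
    """Linearly interpolate a DSP parameter over a number of frames."""
    if frames <= 1:
        return [end]
    step = (end - start) / (frames - 1)
    return [start + step * i for i in range(frames)]

# e.g. sweep a reverb decay time from 'arena' (2.5 s) to 'canyon' (6.0 s)
decay_curve = ramp(2.5, 6.0, frames=8)
```

Feeding each frame's value to the reverb avoids the audible jump a hard cut between two pre-rendered files would produce.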

When it's crunch time and the team adds a new object to a scene that needs matching sounds, realtime DSP gives your audio lead another lever to pull, letting them respond quickly by making new sounds out of existing ones. The realtime audio tools available in middleware engines such as Audiokinetic's Wwise and Firelight's FMOD Studio are now commonplace for audio engineers.

Today, with the transition to realtime audio DSP at the game level, we no longer have to live in the past. Manufacturers like iZotope and McDSP are helping studios perfect the in-game experience in many ways, from reducing memory usage and CPU cycles to helping studios like Turn 10 faithfully recreate dozens of amazing vehicles for their racing games, as heard in Forza Motorsport 4.

Exciting audio experiences that match Hollywood-style epics are no longer a concept of the future.