Often treated by developers as secondary to visual flair, quality audio design is just as important in delivering an engaging virtual experience. We ask aural authorities for their advice
If you’ve ever attempted to play a game with the sound off, you’ll understand just how important effective audio design is. From the patter of footsteps over pavement to the soft ‘thunk’ announcing that a bullet has found its mark, sound effects are both immersive and informative, complementing gameplay mechanics, narrative and visuals alike.
It’s very easy to point the finger at poor audio design – sword combat would quite literally lose its edge if the clash of blades was accompanied by a dull thud rather than the expected sharp metallic tang – but pinning down what makes SFX fit is a harder task.
“Quality starts by getting the universally known sounds right,” offers Ben Minto, senior audio director for Star Wars Battlefront at EA DICE. “This means the sounds for items and actions that we are all familiar with in our day-to-day lives.
“It is also essential to keep the whole audio experience together; keep the sounds existing within the same space, with care taken about how sound travels in that space, how loud and bassy it is, and how it relates to other sounds. A well-structured, consistent and readable language that informs the player enables them to play better, which in turn makes them feel that the game is better.”
CD Projekt Red senior audio programmer Colin Walder agrees that “attention to detail is super important”.
“All the little things add up, even if by themselves they might not seem that significant,” he expands. “That’s what can really bring an interactive world to life. Also, being reactive and responding to context – it sounds kind of obvious but it really is the challenge with interactive audio to go beyond a simple or straightforward implementation and manage to use audio creatively.”
“These last couple of years we’ve put a great emphasis on the interactiveness of the whole soundscape,” elaborates Krzysztof Lipka, senior sound designer at the Witcher III studio. “For us, great quality audio requires tons of small chunks of SFX that are ‘in-between’ and controlled in real time. It’s really important to remember that the fewer static sounds you do, the better the outcome will be.”
"The fewer static sounds you do, the better the outcome will be."
Krzysztof Lipka, CD Projekt Red
As with any element of game development, the evolution and arrival of increasingly advanced tools has allowed audio designers to do even more with the soundscape of virtual worlds.
“The huge steps in audio tech in the last few years, especially in middleware, have led to more in-game prototyping,” observes Supermassive Games audio director Barney Pratt. “This means we aim to get the sounds into the game much more quickly. Across the industry, linear DAWs are still the staple for producing assets. However, we increasingly prototype and iterate on audio systems at runtime. This is not only nice to do but actually a must-have, due to much closer integration of audio with in-game parameters.”
Regarding his own software selection when imbuing horror game Until Dawn with an atmospheric effects track, Pratt reveals: “The process normally starts in a DAW with an audio pre-vis. Pro Tools is our DAW of choice but there are some increasingly exciting options in the likes of Nuendo and Reaper.
“We always design sounds with the final integration in mind. We’ll record new SFX whenever we can, but also have a library of sounds that is always being added to. Sound Forge is also a staple for various processes around sample editing – the batch process has saved our bacon on more than one occasion. We’ll add the samples to the engine via audio middleware, and then add various mix attributes, in-game controls and other tweaks to get those sounds working well in the game. We try to set up a film-style premix bussing structure, so that when new assets are added they essentially mix themselves.”
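Pratt’s idea of a premix bussing structure – where a new asset dropped onto an existing bus inherits the mix rather than needing hand-balancing – can be sketched as a gain hierarchy. This is only an illustrative model with hypothetical bus names, not Supermassive’s actual setup or any particular middleware’s API:

```python
# Illustrative sketch of a film-style premix bus hierarchy: an asset's
# final level is the sum (in decibels) of every bus gain above it, so a
# new asset routed to an existing bus inherits the mix automatically.

class Bus:
    def __init__(self, name, gain_db=0.0, parent=None):
        self.name = name
        self.gain_db = gain_db
        self.parent = parent

    def effective_gain_db(self):
        # Walk up the chain, summing gains (multiplication in linear terms).
        total = self.gain_db
        node = self.parent
        while node is not None:
            total += node.gain_db
            node = node.parent
        return total

master = Bus("master", gain_db=-3.0)
sfx = Bus("sfx", gain_db=-6.0, parent=master)
foley = Bus("foley", gain_db=-2.0, parent=sfx)

# A newly added footstep asset routed to 'foley' needs no manual mixing:
print(foley.effective_gain_db())  # -11.0
```

Rebalancing a whole category then means changing one bus gain rather than touching every asset, which is exactly why new assets “essentially mix themselves”.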
Lipka echoes Pratt’s sentiment that an effective batch management tool is indispensable, especially on a title of epic proportions such as The Witcher III.
“In addition to Wwise and our in-house engine tools we use Nugen Audio LMB (Loudness Management Batch),” he says. “It’s brilliant for batch volume management on huge numbers of files. That’s what we’ve used to even out all our VOs in the Witcher series – more than 270k files in total.
“Regarding DAWs, most of us use Reaper with Waves plugins on board – that’s not a rule though. We try to work on the same software and plugins, but that isn’t a hard requirement. Most of the time the sound design is done ITB (In The Box) but occasionally we do use hardware. For recording, we often use AKG 414s, Oktava MK-012s and Sennheiser MKH 8040s. For monitoring, the whole audio department uses ADAM A7X and A5X monitors.”
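The core job Lipka describes – evening out levels across a huge batch of VO files – is done at CD Projekt Red with Nugen Audio LMB, a dedicated loudness tool. As a crude stand-in for the idea (not Nugen’s algorithm, which works to broadcast loudness standards rather than simple RMS), the principle looks like this:

```python
# Toy sketch of batch loudness evening: measure each file's RMS level
# and apply whatever gain brings it to a shared target. Real tools like
# Nugen LMB use perceptual loudness (e.g. EBU R128), not plain RMS.
import math

def rms_db(samples):
    """RMS level of a list of float samples, in dBFS."""
    mean_sq = sum(s * s for s in samples) / len(samples)
    return 10.0 * math.log10(mean_sq) if mean_sq > 0 else float("-inf")

def normalise_to(samples, target_db):
    """Scale samples so their RMS hits target_db."""
    gain_db = target_db - rms_db(samples)
    gain = 10.0 ** (gain_db / 20.0)
    return [s * gain for s in samples]

quiet_take = [0.05, -0.05, 0.05, -0.05]  # stand-in for one quiet VO file
evened = normalise_to(quiet_take, target_db=-20.0)
print(round(rms_db(evened), 1))  # -20.0
```

Running such a pass over a folder rather than one file at a time is what makes 270k lines of dialogue tractable.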
Håkan Dalsfelt is an audio designer and programmer at Coldwood, known for platforming title Unravel.
“The latest generation of consoles gives us a much larger audio processing budget,” he states. “You can play around a lot more with real-time DSPs instead of editing the source files offline. A couple of years ago you would get a knock on the shoulder from an upset programmer if you added some fancy effect to the audio, like a simple reverb.”
While the latest console hardware has opened new avenues for audio designers, Unravel creative director Martin Sahlin says that Coldwood stuck to more traditional methods when producing its twee title’s soundscape.
“We're fairly old-school when it comes to audio and tech,” he suggests. “Most sound effects are from sound libraries – we don't have the resources to record custom sounds for everything, and nowadays there are a lot of high-quality libraries that are really good value. Håkan recorded the ambience with a Zoom H2n that he brought on hiking and fishing trips. We've always used Fmod as our sound engine, and we’ve got a great in-house level editor that we use for positioning and tweaking the audio. It's nice to have control over a sound all the way from source file to when it's used in-game.”
While the growing ambition of triple-A releases has resulted in a greater workload for audio designers, Minto says games’ post-release lifecycle can actually present as much – if not more – of a challenge.
“It’s very rare these days to produce a gold disc,” he says, referring to the finality of a project. “Patches, DLC and expansions allow the scope and scale of a game to increase exponentially over its lifetime, whilst the base game itself can easily have hundreds of variants of its key item within the game.
“This increase in scope and complexity has driven us to use a more inheritance-based modular system when sound designing, with sounds being constructed at runtime, versus previously being rendered from within a DAW. Using inheritance allows for a tree-structured approach so that we can make changes at higher levels which then propagate down to all ‘children’ without having to modify each child patch independently.
“Completing final sound design outside of the runtime environment meant making a commitment whilst making those sounds, whereas by keeping the elements of the sound separate we can alter the content, mixing and processing based on, for example, fully variable physical parameters or more abstract game parameters at runtime. The choice to follow this path for our core sound design has been essential considering we usually ship more content after launch than we do for the initial release.”
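Minto’s inheritance tree can be pictured as a parameter hierarchy: a child patch falls back to its parent for any value it doesn’t override, so one edit near the root reaches every descendant. The sketch below is a generic illustration with hypothetical node and parameter names, not DICE’s actual tooling:

```python
# Hedged sketch of inheritance-based sound patch data: a child node
# falls back to its parent for any parameter it does not override, so
# edits high in the tree propagate to all 'children' automatically.

class SoundNode:
    def __init__(self, name, parent=None, **params):
        self.name = name
        self.parent = parent
        self.params = params  # only this node's overrides

    def get(self, key):
        # Walk up the tree until a node defines the parameter.
        node = self
        while node is not None:
            if key in node.params:
                return node.params[key]
            node = node.parent
        raise KeyError(key)

weapons = SoundNode("weapons", lowpass_hz=18000)
blaster = SoundNode("blaster", parent=weapons)
heavy_blaster = SoundNode("heavy_blaster", parent=blaster, pitch=0.8)

# One edit at the root reaches every child that hasn't overridden it:
weapons.params["lowpass_hz"] = 12000
print(heavy_blaster.get("lowpass_hz"))  # 12000
```

With hundreds of variants of a key item, this is the difference between one edit and hundreds.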
"There has been an awakening to the potential of audio as a narrative tool."
Colin Walder, CD Projekt Red
Sound is but one part of a whole, needing to complement a title’s visuals, narrative, themes, mechanics and more. Audio designers must similarly work closely with their counterparts in coding, writing and art to ensure every element harmonises.
“Try to think about your general soundscape in the very early stages of your project,” advises Lipka. “Quite often it helps to know what’s in the designers’ heads and how they picture their elements and the game as a whole. The more interaction between devs, the better.”
Sahlin proposes “forcing your co-workers to enable audio during production”.
“The input you get from colleagues is invaluable, since it's easy to become a bit blind/deaf when you’re working on a large project with lots of unique sound effects,” he warns.
“Communicate with the game team as much as possible,” agrees Pratt. “As audio is often the last discipline in the line, it’s easy to miss important changes and conversations. If you want time to make something sound good you need to pre-empt new game content, especially during the finalling process. Get it right and your sounds can even be in the game before the animation or VFX.”
Building on Pratt’s suggestion that audio can outpace many other aspects of development, Minto outlines a strategy for creators to maximise their pipeline.
“Good audio is ‘cheap’ compared to almost all other disciplines,” he states. “Even so, plan for audio to account for seven per cent of your total man months on a project. Most of those months come towards the end of the project, rather than in post-production as with most linear formats.”
He adds that artists working in other areas should be able to communicate with audio specialists regarding all aspects of a project, as audio’s clout can often turn weaker parts into a stronger whole.
“Don’t tell your audio team your solution, explain your problem,” he tells devs. “Audio can sometimes offer a cheap solution to your problem, so make them aware of all areas of the game that aren’t quite coming together and just need that something extra to get them working.
“Listen. Play the game with the sound on. Try to get everyone on the team to listen. Don’t worry about your feedback being in ‘audio speak’ and don’t try to suggest the solution, just describe how the audio makes you feel or what your expectations are – too much, too little, muddy and so on.”
In line with the growing prominence of emergent gameplay in expansive open-world titles, audio is similarly evolving to reflect the expanding diversity of interactions now possible in virtual experiences.
“Games in general are leaning towards more interactivity in musical design,” explains Marcin Przybyłowicz, principal composer on The Witcher III. “I feel like history is doing a full circle on our watch – there was iMuse, which was the pivotal achievement of interactivity and adaptiveness during the MIDI-based era of game music, now we’re seeing development of the same philosophy in contemporary recorded soundtracks. On top of that, there are experiments with procedurally generated music in games.”
“Dynamic soundtracks are more or less the norm now, which is nice,” adds Dalsfelt. “Obviously that translates to audio playing a bigger part in the storytelling – we can use it to really reflect the gameplay and enhance the experience.”
However, it is an entirely new sector – virtual reality – that is especially shaking up the way that sound is implemented and utilised.
“HRTF (Head-Related Transfer Function) systems and binaural encoding are sure to make further impacts on the quality of the audio experience, especially in VR,” predicts Pratt. “Audio processing will be pushing the CPU harder and harder as advances are made in 3D environmental audio. In the same way that impulse response reverbs made a major leap in quality, object-based audio systems will see major advances in 3D reverbs, reflections and refraction.”
“Mixing in VR offers great opportunities to play with the effect of sound proximity to the player. We’ll see an increased attention to detail in terms of sound positioning – and much more complex integration methods – as audio needs to intricately replicate real-world audio behaviours. Audio will be much more closely integrated into the design phase of the project and less of a post-production focus.”
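The HRTF systems Pratt mentions model how a sound reaching each ear differs with direction. A full HRTF is a measured set of filters, but the simplest of its cues – interaural time difference – can be approximated with the classic Woodworth spherical-head formula. This sketch assumes a nominal 8.75 cm head radius and is an illustration of the cue, not a production spatialiser:

```python
# Binaural cue sketch (not a full HRTF): the Woodworth spherical-head
# approximation of interaural time difference (ITD) for a far-field
# source at a given azimuth, assuming a head radius of 8.75 cm.
import math

HEAD_RADIUS_M = 0.0875
SPEED_OF_SOUND = 343.0  # m/s in air

def itd_seconds(azimuth_rad):
    """Arrival-time difference between the two ears, in seconds."""
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (
        math.sin(azimuth_rad) + azimuth_rad)

# A source 90 degrees to one side arrives at the far ear roughly
# two thirds of a millisecond late:
print(round(itd_seconds(math.pi / 2) * 1000, 2))  # 0.66 (ms)
```

Real HRTF rendering layers level differences and frequency-dependent filtering on top of this delay, which is where the CPU cost Pratt predicts comes from.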
“Everything has to feel just right, and the position of the audio has to be spot on, or there’ll be a big disconnect,” Sahlin cautions of the realism vital to VR’s magic. “Ambience and such will all have to be done in 3D.”
"Instead of always playing catch up to film, now games audio is defining the future for other platforms.”
Ben Minto, EA DICE
Walder is less certain of virtual reality’s transformative effect on audio design.
“I’m not sure that it will change the way we do things much, except that we can focus more on our ideal surround-sound vision of the game,” he states. “In many ways, the tech is still catching up with us.
“The most interesting thing will be custom engine changes and enhancements that teams come up with to find new ways to implement audio. It doesn’t even have to be technologically complex; sometimes I see people come up with a technique that’s really simple but hasn’t been tried before and works really well.”
Lipka agrees: “I’m very skeptical about VR. Rather, it’s going to be a steady evolution of storytelling through audio.”
Yet Minto sees virtual reality’s 3D audio as key to bringing games in line with – and then surpassing – the immersive nature of film, the medium games are most often compared to.
“VR/spatial audio is something that film and other linear mediums have had to extend into as the technologies come online,” he says. “Whereas, for games, the converse is true. That helicopter overhead in the game world already knows the position where it exists and what sound it should emit from that position; it’s only because of the limitations of the playback systems available to us that previously we have had to compress the 3D soundscape down.
“When first learning about Dolby Atmos and other object-based audio systems, it really did feel like the tide was turning. Instead of always playing catch up to ‘film’, now something that already existed within games was defining the future for other platforms.”
All that’s been discussed so far comes back to a single fact: games are getting better at telling more important stories, in which audio has played – and will continue to play – a major role.
“In the industry in general there has been a gradual awakening to the potential of audio as an important narrative tool – that it should be thought of throughout the process of building stories with games rather than just something that gets tacked on at the end,” says Walder. “We’ll start to see more games doing audio that goes beyond simply ‘X event triggers sound Y’ and being really creative with mixing and sound selection to support and even drive narrative.
“I see a great bias towards storytelling through audio,” observes Lipka. “Nowadays more and more games tend to show increasingly complex and mature storylines, full of nonlinearities and even real-life dramas. It’s clear that such an approach forces the industry to put more emphasis on sound as one of the main means of conveying narrative.”
“It’s a very exciting trend because as a byproduct it produces better tools for us sound designers and draws experienced movie sound engineers or composers into games. Recent technical novelties unify movie and game mixing environments greatly, making the transition even less painful over time. Nowadays, it’s not so uncommon to see a great movie composer or sound designer’s name in a major triple-A game’s credits.
“Hopefully in the future such trends will make the industry evolve and push games even further into new regions or directions.”
Barney Pratt, Supermassive: For Until Dawn: Rush of Blood we put together a system of sounds for turbulence on the ears which varied depending on how fast you were travelling and the orientation of the head. The only thing that gave us the sharp, close, air-ripping sound of close-up wind turbulence was the sound of blowing directly onto a mic. Of course, this is something audio guys spend most of their careers trying desperately to avoid.
AS REAL AS IT GETS
Ben Minto, DICE: Back in the summer of 2006, whilst I was still at Criterion, I was working with Chris Sweetman on the pre-production for the audio for Black 2. As a player, you could interrogate a suspect and to drive the point home, you could force your drawn pistol into his mouth.
We couldn’t find anything remotely suitable in the library and faking it wasn’t working. So, I lost the coin toss – and a part of my tooth. We never found the sight that got snapped off the airsoft pistol as it tore into the top of my mouth. We only did it the once and it sounded great, complete with my muffled agony cry and choking.
This story is a great reminder that sometimes there is nothing quite like the ‘real’ sound. It’s also how my wife persuaded me to partially drown myself for a Battlefield 4 scene...
DANCING WITH DEATH
Colin Walder, CD Projekt Red: Synchronising music with combat in Blood and Wine started with a conversation between myself and our composer Marcin about what music features we would like if anything was possible; one of them was to have tight sync between the combat and the music since Geralt is described in The Witcher books as being such an accomplished fighter he looks like he’s dancing.
We went away and came up with a way to do it by syncing the AI attacks to the music. It took some convincing of the AI designers that it wasn’t going to break their combat design, but once we showed them how it would work they got really excited about it too.
In the end we didn’t get to take it as far as we would have liked due to time constraints, but we managed to implement it on a couple of the bosses in Blood and Wine so considering that we started with an ‘impossible’ idea, I think that’s pretty cool.
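The core mechanism Walder describes – holding AI attacks until they land on the music – amounts to quantising an attack request to the next beat boundary. This is a hypothetical sketch of the idea only, not CD Projekt Red’s implementation:

```python
# Hypothetical sketch of the 'attacks on the beat' idea from Blood and
# Wine: instead of firing immediately, an AI attack request is deferred
# to the next musical beat boundary.
import math

def next_beat_time(request_time, bpm, music_start=0.0):
    """Time (seconds) of the first beat at or after request_time."""
    beat_len = 60.0 / bpm
    beats_elapsed = (request_time - music_start) / beat_len
    return music_start + math.ceil(beats_elapsed) * beat_len

# At 120 BPM (one beat every 0.5 s), an attack requested at t=1.30 s
# is held until the beat at t=1.50 s:
print(next_beat_time(1.30, 120))  # 1.5
```

The design cost is the small, variable delay added to each attack – which is presumably why it took some convincing of the AI designers that combat would still feel responsive.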
BURIED IN YOUR WORK
Martin Sahlin, Coldwood: Håkan [Dalsfelt, audio designer] had a bit of a (funny) scare during the Unravel project. He needed to record a zipper sound for a backpack, and we don't really have a proper studio at the office. To find a silent place he decided to use the ‘go fish’ room, where we’ve stored all the empty cardboard boxes, old hardware and other junk that we’ve accumulated over the last couple of years.
When he sat there, he accidentally started a junk-avalanche that made it impossible to open the door from inside. Luckily, there were still colleagues at the office that could come to the rescue.
All this week, Develop is taking a deeper look into sound and music in video games through our Audio Special.