No longer just a programming niche, deferred rendering is becoming an increasingly popular technique on consoles too. As an option in the armoury of the programmer, deferred rendering has been around for a while. Released in 2007, PC shooter S.T.A.L.K.E.R. was one of the first commercial titles to make use of the technique.
Developer GSC Game World explained it was the ideal choice in the case of that particular game because of its lower geometry and pixel-processing requirements, and lower CPU overhead, compared to a traditional forward shading engine.
Something else that further encouraged its adoption has been the work carried out by Sony’s internal R&D team to standardise deferred rendering as something that can be used on PlayStation 3.
As well as being widely distributed within Sony studios, the results of this labour have also found their way into some high profile multi-format games. Collaboration with Rockstar resulted in its use in the RAGE technology that powers GTA IV, for example. Media Molecule’s LittleBigPlanet and Guerilla’s Killzone 2 are two Sony-backed games that make the most out of the additional control and sophistication it enables in terms of game lighting.
“Because you project your lights into the scene as a post-process, you’re not lighting any pixels that are hidden behind any other pixels,” says Jan-Bart van Beek, art and animation director at Guerilla, describing one of the advantages that convinced the studio to make the early decision to use deferred rendering in Killzone 2.
He points out there are some subtle advantages in terms of art process too. “Because you take all the lighting calculations out of your shaders, it makes them a lot less complicated. This means your artists can create the shaders, not programmers. We used Maya’s shading editor to make our game shaders. And because the cost of these shaders is low, you can create specific looks for specific objects instead of having to use general templates.”
Of course, the headline advantage of deferred rendering remains the number of lights you can use. “Effectively you can have an infinite number of lights as opposed to about four in a normal shader, because the cost is related to the number of pixels you’re lighting, not the number of lights,” van Beek says. For example, the heaviest scenes in Killzone 2 involve several hundred lights.
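The idea van Beek describes can be sketched in a few lines: a geometry pass writes each visible surface’s position and normal into a G-buffer (hidden surfaces never survive the depth test), and a lighting pass then accumulates any number of lights over only those visible pixels. This is a toy CPU illustration with hypothetical names, not Guerilla’s actual implementation:

```python
# Illustrative CPU sketch of deferred lighting. All names are
# hypothetical; real engines do this on the GPU with render targets.

# G-buffer: per-pixel position and normal, written once during the
# geometry pass with depth testing, so occluded surfaces never appear.
WIDTH, HEIGHT = 4, 4
gbuffer = [
    {"pos": (x, y, 0.0), "normal": (0.0, 0.0, 1.0)}
    for y in range(HEIGHT) for x in range(WIDTH)
]

# Any number of point lights; total cost scales with the pixels each
# light touches, not with how many lights exist in the scene.
lights = [{"pos": (1.0, 1.0, 2.0), "intensity": 1.0},
          {"pos": (3.0, 2.0, 1.5), "intensity": 0.5}]

def shade(pixel, light):
    """Simple diffuse (N.L / distance^2) term for one light."""
    lx, ly, lz = (light["pos"][i] - pixel["pos"][i] for i in range(3))
    dist = (lx * lx + ly * ly + lz * lz) ** 0.5
    ndotl = max(0.0, (lx * pixel["normal"][0] + ly * pixel["normal"][1]
                      + lz * pixel["normal"][2]) / dist)
    return light["intensity"] * ndotl / (dist * dist)

# Lighting pass: accumulate every light into every visible pixel.
framebuffer = [sum(shade(px, lt) for lt in lights) for px in gbuffer]
```

Adding a third, tenth or hundredth light only extends the `lights` list; the geometry is never re-rendered, which is why the per-light cost stays bounded by the pixels it illuminates.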
There’s no such thing as a free lunch, however, and the inherent disadvantages of deferred rendering mean it won’t replace standard forward rendering anytime soon.
“The biggest visual issue with deferred rendering is the lack of a really good, universal approach for integrating translucencies,” says Dag Frommhold of German middleware provider Trinigy. The company is currently working on a deferred element for its Vision engine, which will provide clients with a hybrid solution that seems to be the only way to solve the problem.
Guerilla’s van Beek says a similar approach was used in Killzone 2, with a forward renderer introduced to handle transparency effects such as water. Similarly, it uses an offscreen renderer to deal with particle effects, which are then composited back.
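The hybrid approach both studios describe comes down to the same recipe: light the opaque scene through the deferred path, then draw translucent surfaces in a conventional forward pass blended over the result, since a G-buffer stores only one surface per pixel. A minimal sketch, with hypothetical function names rather than either engine’s real pipeline:

```python
# Hypothetical sketch of a hybrid renderer: deferred for opaque
# geometry, forward alpha blending for translucent layers.

def deferred_pass(opaque_colour):
    # Stand-in for the G-buffer fill + lighting passes; here it just
    # returns the already-lit opaque pixels unchanged.
    return list(opaque_colour)

def forward_pass(background, surfaces):
    # Classic back-to-front alpha blending for translucent surfaces,
    # which a one-surface-per-pixel G-buffer cannot represent.
    out = list(background)
    for colour, alpha in surfaces:  # assumed sorted back to front
        out = [alpha * colour + (1.0 - alpha) * dst for dst in out]
    return out

opaque = [0.2, 0.4, 0.6]        # lit opaque framebuffer (3 pixels)
water = [(0.5, 0.3)]            # one translucent layer: (colour, alpha)
frame = forward_pass(deferred_pass(opaque), water)
```

The translucent layer never touches the G-buffer; it only reads the finished opaque image, which is why the two paths can coexist in one frame.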
Tim Sweeney, architect of the Unreal engine, highlights further visual challenges of using deferred rendering. “It’s faster for large numbers of lights and shadows, but the drawbacks are increased video memory usage, and artistic limitations as you force all objects to be rendered with the same material model,” he says. “Unreal Engine 3 has an extremely flexible and artist-extensible material system, so we didn’t want to constrain this unnecessarily.”
Another issue is anti-aliasing. “Anti-aliasing is a key to the rendering quality of Gears of War,” Sweeney explains. “If you look closely, you’ll see that all static and dynamic lighting is anti-aliased with multisample anti-aliasing, so moving to a pure deferred rendering approach would be a step backward.”
Significantly though, UE3 does use some deferred elements, re-using z- and colour-buffers to enable techniques that would otherwise be impractical, such as velocity-buffered motion blur. And Sweeney is happy to concede that future hardware architectures might encourage the further use of deferred rendering. “I expect we’ll see developers inventing ever-cooler deferred techniques,” he says. “But the constraints assure that it won’t become the predominant rendering scheme, at least within this console generation.”
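Velocity-buffered motion blur is itself a deferred idea: alongside colour, each pixel stores a screen-space velocity, and a post-process averages colour samples along that vector. The following one-dimensional toy (an assumed simplification, not UE3’s implementation) shows the principle:

```python
# Illustrative 1-D velocity-buffered motion blur: every pixel carries a
# velocity, and the blur samples backwards along it. Hypothetical code.

def motion_blur(colour, velocity, taps=4):
    width = len(colour)
    out = []
    for x in range(width):
        acc = 0.0
        for t in range(taps):
            # Step backwards along this pixel's stored velocity,
            # clamping samples to the framebuffer edges.
            offset = (velocity[x] * t) // taps
            sx = min(width - 1, max(0, x - offset))
            acc += colour[sx]
        out.append(acc / taps)
    return out

colour = [0.0, 0.0, 1.0, 0.0, 0.0]   # one bright pixel
velocity = [0, 0, 2, 0, 0]           # only that pixel is moving
blurred = motion_blur(colour, velocity)
```

Static pixels (zero velocity) pass through untouched, while the moving pixel is smeared along its motion vector, all without re-rendering any geometry.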