Ninja Theory explains how best to use NaturalMotion's Morpheme animation system
A major part of the gameplay in Enslaved: Odyssey to the West involves lead characters Monkey and Trip interacting with each other, and it was clear early on in development that we wanted perfect synchronisation between our two main protagonists.
Character synchronisation was not something Morpheme supported out of the box, so we worked with the support team at NaturalMotion to come up with a solution that met our requirements:
We needed the synchronisation to be perfect in any individual frame
It also needed to blend seamlessly, over a short transition time, between synchronised and non-synchronised areas of the Morpheme network
Finally, it had to be as efficient as possible
What we developed was a custom Morpheme node that we called the Sync Node.
For two characters that were to be synchronised there was always a ‘primary’ and ‘slave’ character. Taking Monkey carrying Trip as an example, Monkey was in that case the primary as he was doing the carrying and actual player-controlled movement, while Trip was the slave character as she simply had to play the equivalent carried animations at the correct weights.
To achieve this, we author the primary character in Morpheme:connect as normal, but we put the area of the network that is to be synchronised inside its own state machine to encapsulate everything. In our example, this was a sub-network for Monkey’s movement while carrying Trip.
For the slave character we place a single sync node inside an equivalent state machine in that character’s network. We give this sync node the same name as the state machine on the primary character, allowing us to match up the state machine with the sync node.
Having both networks’ synchronised areas inside state machines allows us to blend to and from the synchronised areas of the network, thereby meeting our second requirement as well.
The Sync Node
The sync node itself is very simple in terms of what it is and does: it is a node with many animation-node inputs. For every animation node in the primary character's synchronised state machine, the equivalent animation is hooked up to the sync node in the slave character's network. The names of the animation nodes hooked up to the sync node match the names of the animation nodes on the primary, again so they can easily be matched up at runtime.
During runtime the sync node is told what animations to play, the time at which to evaluate each one and the blend weight for each animation. The sync node then kicks off a custom task that gets all the required animation inputs evaluated. The task then takes the resulting transform buffers from those evaluated animations and blends them together accordingly, resulting in a synchronised pose for the slave character.
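The blend performed by the sync node's custom task can be sketched as follows. This is a hypothetical illustration, not Morpheme's actual API: the `EvaluatedInput` and `BlendInputs` names are invented, and each "transform buffer" is reduced to one float per bone for brevity.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One evaluated animation input to the sync node: a pose sampled at the
// time copied from the primary, plus the blend weight copied from the primary.
struct EvaluatedInput {
    std::vector<float> pose;   // one value per bone, for brevity
    float weight;              // blend weight extracted from the primary's network
};

// Blend the evaluated inputs into the slave's output pose. Weights are
// normalised so the result stays well defined even after low-weight
// inputs have been culled.
std::vector<float> BlendInputs(const std::vector<EvaluatedInput>& inputs)
{
    const size_t boneCount = inputs.front().pose.size();
    float totalWeight = 0.0f;
    for (const EvaluatedInput& in : inputs)
        totalWeight += in.weight;

    std::vector<float> out(boneCount, 0.0f);
    for (const EvaluatedInput& in : inputs)
        for (size_t bone = 0; bone < boneCount; ++bone)
            out[bone] += in.pose[bone] * (in.weight / totalWeight);
    return out;
}
```

Because the inputs were sampled at exactly the times the primary used, the blended result matches the primary's pose semantics frame for frame.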
So how do we get those times and blend weights to the slave's sync node based on what the primary character is doing? There are a number of things we have to do for it all to come together and work.
The first thing we have to do is get the two characters evaluated in the correct order. We need to evaluate the entirety of the primary character’s network first, so that everything is up to date for when we extract the animation times and blend weights for the slave. The Morpheme integration with Unreal Engine 3 allows multiple networks to be evaluated in parallel using the PPU and SPUs on PS3, or multiple threads on Xbox 360. It does this by assigning networks to ‘buckets’, where networks in the same bucket will get evaluated on the same SPU, PPU or thread.
We therefore move the slave character into the same bucket as the primary character and make sure the slave comes after the primary in the array of networks to evaluate. This ensures the two synchronised characters are evaluated serially, with the primary done first, while maintaining parallel evaluation for the other characters. When we move the slave into the same bucket as the primary, we can also choose to move another active character into the slave's original bucket for improved load balancing where appropriate.
After synchronisation we move the slave character back to its original bucket and rebalance the other buckets.
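The bucket shuffle can be sketched like this, assuming a simple array-of-networks model where networks in the same bucket run serially, in array order, and different buckets run in parallel; the `Network` struct and `BeginSync` function are invented names, not the Morpheme/UE3 integration's real types.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// A network and the evaluation bucket it is assigned to. Networks sharing a
// bucket are evaluated serially in array order; buckets run in parallel.
struct Network { std::string name; int bucket; };

// Move the slave into the primary's bucket, directly after it in the array,
// so the primary is always fully evaluated before the slave.
void BeginSync(std::vector<Network>& nets, const std::string& primaryName,
               const std::string& slaveName)
{
    auto find = [&](const std::string& n) {
        return std::find_if(nets.begin(), nets.end(),
                            [&](const Network& net) { return net.name == n; });
    };
    Network slave = *find(slaveName);
    slave.bucket = find(primaryName)->bucket;   // same bucket => serial order
    nets.erase(find(slaveName));
    nets.insert(find(primaryName) + 1, slave);  // slave runs right after primary
}
```

Restoring the original assignment after synchronisation, and promoting another character into the slave's vacated bucket for load balancing, would follow the same pattern in reverse.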
When a Morpheme network is evaluated it generates a lot of temporary data, such as the sample times for animation nodes, the blend weights for blend nodes, and the percentages through transitions.
Most of this temporary data is relevant only to a particular frame and is deleted once the network has been evaluated. For the primary character, however, we need some of the data inside the synchronised sub-network so we can process it to work out the blend weights for the animations, as well as to get the sample times from the animation nodes.
To keep hold of this data, the Morpheme runtime provides a mechanism for registering an interest in pieces of temporary data by saying we want access to that piece of data post-update. After the ‘update connections’ phase of the network evaluation has worked out the active tree of the network, a sync manager in the network goes through any active ‘sync’ state machines in its network and recursively goes down those branches of the active tree registering an interest in the relevant data. For example, if we encounter a blend node in the active tree we register an interest in its blend weight, or if we get to a leaf animation node we register an interest in its sample time.
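The retention mechanism can be pictured as a keyed pool of per-frame data where registered entries survive the end-of-frame cleanup. This is a minimal sketch under that assumption; `TempDataPool` and its methods are invented for illustration and do not reflect the Morpheme runtime's real interfaces.

```cpp
#include <cassert>
#include <map>
#include <set>
#include <string>

// Per-frame attribute data keyed by name. Entries that something has
// registered an interest in survive the post-update cleanup; everything
// else is thrown away once the network has been evaluated.
class TempDataPool {
public:
    void Set(const std::string& key, float value) { data_[key] = value; }
    void RegisterInterest(const std::string& key) { retained_.insert(key); }
    bool Get(const std::string& key, float& out) const {
        auto it = data_.find(key);
        if (it == data_.end()) return false;
        out = it->second;
        return true;
    }
    // Called once the network has been evaluated.
    void EndFrameCleanup() {
        for (auto it = data_.begin(); it != data_.end();) {
            if (retained_.count(it->first)) ++it;
            else it = data_.erase(it);
        }
        retained_.clear();
    }
private:
    std::map<std::string, float> data_;
    std::set<std::string> retained_;
};
```

The sync manager's recursive walk over the active tree would call `RegisterInterest` for each blend weight, transition percentage, or sample time it needs to read back after evaluation.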
After this, the primary network is evaluated and upon completion, we then process the sync state machines in the primary. For each active animation node below that state machine we get its sample time from the temporary data and store it. To work out the blend weights we start with a weight of 1.0 and perform a recursive walk through the sync state machine’s children, splitting the weight accordingly if we encounter blend nodes or transitions (again, by reading the temporary data for a blend node’s weight or a transition’s percentage through).
Once we reach an animation node we have the blend weight for that animation and store it. As an optimisation, we can cull animations from the list that have insignificant weights to avoid evaluating and blending them on the slave.
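The recursive weight walk can be sketched as follows, assuming two-way blend nodes for simplicity; the `Node` layout and `GatherWeights` function are hypothetical stand-ins for the real network traversal, and transitions would split the weight the same way using their percentage-through value.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <string>
#include <vector>

// A simplified network node: either a leaf animation or a two-way blend
// whose weight has been read from the retained temporary data.
struct Node {
    enum Type { Anim, Blend2 } type;
    std::string name;          // animation name (leaves only)
    float weight = 0.0f;       // blend weight towards children[1] (Blend2 only)
    std::vector<Node> children;
};

// Walk down from the sync state machine with a starting weight of 1.0,
// splitting the weight at each blend node and storing the final weight
// when a leaf animation is reached. Insignificant weights are culled so
// the slave never has to evaluate or blend those animations.
void GatherWeights(const Node& node, float weight,
                   std::map<std::string, float>& out, float epsilon = 0.001f)
{
    if (weight < epsilon) return;               // cull insignificant animations
    if (node.type == Node::Anim) {
        out[node.name] = weight;                // leaf: store the final weight
        return;
    }
    GatherWeights(node.children[0], weight * (1.0f - node.weight), out, epsilon);
    GatherWeights(node.children[1], weight * node.weight, out, epsilon);
}
```

The resulting name-to-weight map, together with the stored sample times, is exactly what the slave's sync node copies in the next step.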
At this point the slave network begins evaluating. When the ‘update connections’ phase encounters the sync node, the sync node copies the stored animation times and blend weights from the primary and stores them as temporary data on itself. Then, as mentioned earlier, during evaluation the sync node’s custom task gets all the required animations evaluated and blends them together. The sync node has then produced a perfectly synchronised pose for the slave.
To summarise, synchronisation was a key part of the gameplay animation requirements and we were pleased with the solution that we arrived at. Customisations like this, as well as all the standard Morpheme features and functionality, allowed us to bring Enslaved to life in the way we envisioned. Going forward, we will continue to work with NaturalMotion on improved character synchronisation.