With recent improvements to hardware and software, plus an increasing emphasis on realtime, interactive performance capture, we could be on the cusp of a golden age for motion capture technology. Jem Alexander investigates recent advances in mo-cap technology

Motion CaptVRe

Motion capture, and the way it is used in game development, is improving rapidly. No longer used solely by animators to record and store an actor’s performance, the technology is expanding into new areas. Those working with it on a daily basis are excited to see where this might lead.

Technology’s inevitable march forward means that motion capture can now occur in realtime, with an actor’s movements being instantly reflected in a game. Not only does this benefit animators by streamlining their process, but it also opens doors to other applications, like virtual reality. The HTC Vive, Oculus Rift and PlayStation VR all take advantage of motion capture technology to allow players to interact with virtual worlds. The tracking in these headsets is not as advanced as that used in an animation studio, perhaps, but with time it can only improve.

“The biggest advance in mo-cap, in my opinion, is linked to the VR push,” says Alexandre Pechev, CEO of motion capture middleware provider IKinema.

“With the advances in VR hardware, motion capture technologies have moved to our living rooms and offices. Mo-cap will inevitably become part of our everyday life.”

Motion capture studio Audiomotion’s managing director, Brian Mitchell, agrees. “The fact we can stream live data through to game engines has had a massive effect. Matched with VR, this means developers can really let loose with their creativity,” he says.
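Mitchell’s point about streaming live data is worth unpacking. At its simplest, a realtime pipeline is a steady flow of per-frame joint transforms pushed from the capture system into the engine. The sketch below is a minimal illustration of that loop, assuming a hypothetical JSON-over-UDP packet format, port number and engine callback; real studios use their vendors’ own streaming SDKs and protocols rather than anything like this.

```python
import json
import socket

# Hypothetical setup: the capture server broadcasts one JSON packet per frame,
# e.g. {"Hips": [x, y, z, qw, qx, qy, qz], "Spine": [...], ...}.
# The host, port and packet layout here are illustrative, not a real protocol.
MOCAP_HOST, MOCAP_PORT = "0.0.0.0", 9763

def stream_mocap_frames(apply_pose):
    """Receive mo-cap frames over UDP and hand each one straight to the
    engine via the supplied callback, so the character moves as the actor does."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((MOCAP_HOST, MOCAP_PORT))
    while True:
        packet, _addr = sock.recvfrom(65535)
        frame = json.loads(packet)   # joint name -> world transform
        apply_pose(frame)            # e.g. drive the game skeleton this frame

# Usage (hypothetical engine object):
# stream_mocap_frames(lambda frame: engine.skeleton.set_pose(frame))
```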

It seems that only recently have the many incremental advances in motion capture technology culminated in a great leap forward for the industry.

“I think the biggest development this year has been the jump in combined technologies and approaches in realtime full performance capture,” says Derek Potter, head of product management at Vicon Motion Systems. “There’s been steady progress over the past five years in the area of realtime full performance capture; however, what we’ve seen previously is progress on single fronts.

“What we find really exciting about the past year is seeing different developments coming together. It feels like this is moving these types of captures from being investigative to being more fully realised and production ready.”

It’s not just the games industry that is enjoying these developments. Motion capture is the same wherever it is used and everyone is learning from one another. “Different industries use the same game engines for realtime rendering, the same mo-cap hardware and the same solving and retargeting technologies for producing the final product – animation in realtime,” says IKinema’s Pechev. “We are already at a point where the entertainment sector is sharing hardware, technologies, approaches and tools.”
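The “solving and retargeting” Pechev refers to is the step that maps a captured performance onto a character whose proportions differ from the actor’s. Below is a heavily simplified sketch of that idea, assuming a plain dictionary of per-joint rotations and a made-up name map; production retargeters also compensate for differing bone lengths, rest poses and joint limits.

```python
# Hypothetical name map between a capture skeleton and a game character rig.
JOINT_MAP = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftArm": "upperarm_l",
    "LeftForeArm": "lowerarm_l",
    "LeftHand": "hand_l",
}

def retarget_pose(source_pose: dict, joint_map: dict = JOINT_MAP) -> dict:
    """Copy local joint rotations (quaternions) from the captured skeleton onto
    the character rig, joint by joint, skipping anything the rig doesn't have."""
    return {joint_map[name]: rotation
            for name, rotation in source_pose.items()
            if name in joint_map}
```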

Dr. Colin Urquhart, CEO of Dimensional Imaging, sees this as a good thing for everyone, especially since some areas of the entertainment industry have been at it a little longer than games. “The use of helmet-mounted camera systems for full performance capture was pioneered on movie projects such as Avatar and Planet of the Apes, but is now becoming widespread for use on video game projects,” he says. “People see how effective this technology is in movies and expect the same effect in a video game.”

Full performance capture like this, which simultaneously records body and facial movements, leads to much more realistic actions and expressions, something that can really affect your immersion in a world or your feelings for a character. Peter Busch, VP of business development at Faceware Technologies, a motion capture company that focuses on facial animation, says that “characters in today’s games are already pushing the realism envelope when it comes to facial performances, but that will only increase with time. Look for more realistic facial movement and, in particular, eye movement, in the games of tomorrow”.

“It’s one thing to watch an animated character in a game,” Busch continues. “It’s quite another to interact with one. Today, we’re able to interact with characters animated in realtime via live performances or kiosks at theme parks. It’s rudimentary, but it’s effective.

“Tomorrow, we’ll be able to interact with player-driven characters or AI-driven avatars, in game, in realtime. Imagine saying something to a character in a game and having them respond to you as they would in real life. This will change the face of games.”

Virtual reality will be a huge beneficiary of these improvements to facial animation, as developers scramble their way out of the uncanny valley. Classic NPCs feel significantly more like dead-eyed mannequins within a VR environment and improvements in this area could go a long way towards truer immersion and deeper connections with characters and worlds.

“The key to an engaging experience, especially in a new medium such as VR, is connecting users with a powerful story and characters by drawing upon the emotional, character-driven, and nuanced performance from the faces of the actors,” says Busch. “[With today’s technology] studios are able to capture the entire facial performance, including micro expressions, and display those subtleties in convincing ways.”

Vicon’s Derek Potter is in agreement and suggests that improvements in motion capture could mitigate the effect of the uncanny valley, if not remove it completely. “Nothing immerses like the eyes,” he says. “The one thing that each new generation of console has provided is more power. More power means better, more realistic visuals. Motion capture provides the ability to transpose ‘truer’ movements onto characters.

“The mind is a brilliant machine and one of the things that it does amazingly well is to let us know when something ‘doesn’t look quite right’. This lessens the immersiveness of the gaming experience. What excites us about the next generation of consoles is the increasing ability to render realistic graphics, combined with the motion capture industry’s progress in capturing finer and more realistic movements to let us all sink a little deeper into the game. I think this is amazing, both as someone working in motion capture and as a gamer.”

As motion capture technology and software continue to advance, we’re seeing a drop in the price of mo-cap solutions. “I think it’s easier for developers to incorporate motion capture into their projects than ever,” says Dimensional Imaging’s Urquhart.

“Mo-cap used to be a tool used exclusively by big studios or developers on big budget projects. This is not the case anymore. There are many tools on the market for capturing an individual’s face and body performance. Mo-cap tech doesn’t have to be a premium product.”

This accessibility helpfully comes at a time when even small indie developers might be looking into motion capture solutions, thanks to the recent release of consumer virtual reality devices. “VR controllers act as a realtime mo-cap source of data,” says IKinema’s Pechev. “Using this to see your virtual body reacting to your real movements changes completely the quality of immersion.”

“Using mo-cap to see your own avatar allows you to actually feel the environment as well as see it,” says Audiomotion’s Mitchell. “It really does turn your stomach when you see your own foot moving closer to the cliff edge.”
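What Pechev and Mitchell describe, turning a handful of tracked points (headset and two controllers) into a moving virtual body, comes down to inverse kinematics. Below is a sketch of one small part of that solve: a standard analytic two-bone IK that places an elbow so the avatar’s hand lands on a tracked controller. The joint names and the pole “hint” are illustrative assumptions; full-body middleware such as IKinema solves the whole skeleton at once, not a single limb like this.

```python
import numpy as np

def two_bone_ik(shoulder, target, pole, upper_len, lower_len):
    """Place the elbow so the arm (shoulder -> elbow -> hand) reaches 'target'
    (e.g. a tracked VR controller). 'pole' is a hint point the elbow bends
    towards; it is assumed not to lie on the shoulder-target line."""
    to_target = target - shoulder
    axis = to_target / np.linalg.norm(to_target)
    # Clamp the reach so the triangle inequality holds (no over/under-stretch).
    dist = np.clip(np.linalg.norm(to_target),
                   abs(upper_len - lower_len) + 1e-6,
                   upper_len + lower_len - 1e-6)

    # Law of cosines: angle at the shoulder between the arm and the target line.
    cos_shoulder = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    shoulder_angle = np.arccos(np.clip(cos_shoulder, -1.0, 1.0))

    # Build the bending plane from the target direction and the pole hint.
    pole_dir = pole - shoulder
    pole_dir -= axis * np.dot(pole_dir, axis)   # remove the component along the arm
    pole_dir /= np.linalg.norm(pole_dir)

    # Elbow sits 'upper_len' from the shoulder, rotated off the target line
    # towards the pole; the hand lands on the (clamped) target.
    elbow = shoulder + upper_len * (np.cos(shoulder_angle) * axis +
                                    np.sin(shoulder_angle) * pole_dir)
    hand = shoulder + axis * dist
    return elbow, hand
```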

With motion capture now more powerful, accessible and interactive than ever, where does the industry go from here?

“The demand for mo-cap is definitely going to increase over the next few years and games projects will require better and better acting talent and direction to meet this demand,” says Dimensional Imaging’s Urquhart.

“Improving on the realism means increasing the fidelity of motion capture, particularly of facial motion capture. Using techniques such as surface capture instead of marker based or facial-feature based approaches can help with this.”

Faceware’s Peter Busch has plans in mind to solve the issues inherent in facial motion capture, particularly for VR. “The next frontier is creating even more realistic and immersive experiences across the board, including new experiences like realtime user-driven VR characters that are realistic in every aspect, right down to the user’s facial expressions,” he says.

“Current cameras are insufficient for capturing true social VR. There is no way to capture a VR user’s entire face while playing VR. Our interactive team is actively developing hardware and software solutions that we’re planning to launch this coming year.”

Vicon’s Derek Potter believes there’s still plenty of work for providers to do to make mo-cap the best it can be. “I think there are three ways that providers can help in continuing to push motion and performance capture forward,” he says. “[Companies] like Vicon need to continue to do what we’ve been doing steadily each year, which is to continue to improve the fundamentals of the motion data provided. The accuracy, reliability, overall quality and speed of the data are all vital.

“Secondly, as motion capture matures as a technology, we need to see it not only as its own entity but also as part of a bigger machine. We need to continue our efforts to integrate and provide more open access to the data provided by the system. The third thing we need to do is to listen to some bright lights in the field who have been pushing techniques forward recently. The technology needs to be pliable enough to fit the new and evolving needs that performance capture demands.”

Meanwhile, IKinema’s Alexandre Pechev won’t be happy until motion capture technology is perfected. “[I want] better and simpler (ideally non-invasive) types of motion capture combined with realtime solvers that automatically improve issues and deliver perfect animation,” he says, ending with some blue-sky thinking that in the past would have been a better fit for a sci-fi novel than a game development magazine.

“My dream is to see a mo-cap system that reads brainwaves to deliver the muscle movements of all bones. Maybe one day…”
