nDreams CEO Patrick O'Luanaigh discusses what metrics to analyse when making a virtual reality game

Virtual Analytics: Tracking user behaviour in VR

At nDreams, like many studios, we use analytics to help develop our games. But because we’re working exclusively on VR titles, we are able to explore how analytics works in virtual reality, and play with some new methods that only VR allows.

We track performance metrics to provide accurate and timely feedback on our efforts to optimise our levels for a VR experience, and to provide objective feedback on the fuzzier side of development: player experience and game design.

We’ve been using analytics in one form or another throughout the development of The Assembly, and have also used it to provide design feedback on our GearVR titles, Perfect Beach and Gunner. Analytics gives us hard data to draw on when deciding where to concentrate our efforts, and provides a history of the project through a timeline of performance metrics for each of the levels.

The danger of assumptions

VR design is quite different from traditional game design, and challenging for those of us who think we already know how to make games. Because years of game development experience no longer guarantee a perfect, simple solution, we have to rely on objective measures to check the assumptions we make. This can be difficult.

There have been times when things that are obvious to us – and work when we play the game without a headset on – suddenly become strange, disorienting, or simply overlooked once you put the headset on. For example, we found that people were not activating items in the world once they became available, because those items had been locked out while a voiceover was playing.

For some reason, when people were wearing the HMD, they would notice that an item was not interactive during the voiceover and assume that it was not interactive at all. Discovering this took time, and analytics were helpful in proving it was happening. We added this to our design toolkit and now, instead of locking out interactive objects, we prefer to keep them hidden, or to make the lock itself visible and tangible. Otherwise, we make it very obvious that interaction is available through some visible change to the interaction point, such as lighting up a button when it can be used.
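To illustrate the approach (this is a minimal sketch, not nDreams’ actual implementation), an interactable can be modelled so it is never silently locked: the hypothetical InteractionState enum below keeps an item either hidden, visibly locked, or clearly available.

```python
from enum import Enum, auto

class InteractionState(Enum):
    HIDDEN = auto()          # item not yet revealed to the player
    VISIBLY_LOCKED = auto()  # lock shown, so the player knows to come back later
    AVAILABLE = auto()       # highlighted (e.g. a lit button) and usable

class Interactable:
    """Hypothetical interactable that never refuses input without showing why."""
    def __init__(self):
        self.state = InteractionState.HIDDEN

    def on_voiceover_started(self):
        # Rather than quietly locking an already-visible item, show the lock itself.
        if self.state == InteractionState.AVAILABLE:
            self.state = InteractionState.VISIBLY_LOCKED

    def on_voiceover_finished(self):
        if self.state == InteractionState.VISIBLY_LOCKED:
            self.state = InteractionState.AVAILABLE

    def can_activate(self) -> bool:
        return self.state == InteractionState.AVAILABLE
```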

Gaze data

The Assembly is a content-rich game, and content takes a lot of time to generate. No studio has a bottomless budget for content, so we needed to make sure we were spending ours wisely. Analytics can help us decide what to prioritise using heat maps generated from gaze data: they give the content creators an idea of which areas of the game are looked at the most, and therefore which should be given the most attention.

Using this data, developers can ensure these areas are of a consistent quality and are given the strictest performance tests. Gathering this data can be time-consuming without analytics, but even with analytics a lot of the data will be noisy and prone to situational effects, such as people putting the HMD down and leaving it looking directly at something of no importance at all.
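As a rough illustration of how gaze hits might be aggregated into a heat map, here is a small Python sketch; the cell size, idle threshold and sample format are assumptions for the example, not our pipeline. It bins gaze points into a grid and discards long stationary runs, such as a headset left on a desk.

```python
from collections import Counter

CELL_SIZE = 1.0    # metres per heat-map cell (illustrative)
IDLE_LIMIT = 30.0  # seconds; a longer run on one cell suggests an idle HMD

def gaze_heatmap(samples):
    """Aggregate (timestamp, x, y) gaze hits into a coarse 2D grid,
    dropping long stationary runs that look like an unattended headset."""
    heat = Counter()
    run_cell, run_start, run_hits = None, None, 0
    for t, x, y in samples:
        cell = (int(x // CELL_SIZE), int(y // CELL_SIZE))
        if cell != run_cell:
            # Flush the previous run unless it lingered suspiciously long.
            if run_cell is not None and (t - run_start) < IDLE_LIMIT:
                heat[run_cell] += run_hits
            run_cell, run_start, run_hits = cell, t, 0
        run_hits += 1
    if run_cell is not None:
        heat[run_cell] += run_hits  # keep the final run; no end time to judge it by
    return heat
```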

We can also use this to see if players are noticing key items and elements in particular levels, or walking by, completely oblivious to them!

Moving to VR means the rules change, so we have to re-check things that most people take for granted. Notifications or goals that are normally shown on the user interface have to be placed differently. Having one analytic for when a notification is fired, and another for when it is acted upon, can show that when no-one is chaperoning the experience, game events can “go missing”.
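A hedged sketch of how those two analytics could be paired up after the fact, to estimate how many notifications went missing; the event names and timeout here are hypothetical, not our actual schema.

```python
def notification_outcomes(events, timeout=20.0):
    """events: time-ordered (timestamp, name, notification_id) tuples, where
    name is 'notification_fired' or 'notification_acted'."""
    fired, acted = {}, set()
    for t, name, nid in events:
        if name == "notification_fired":
            fired.setdefault(nid, t)          # remember when it first appeared
        elif name == "notification_acted" and nid in fired:
            if t - fired[nid] <= timeout:     # acted on within the window
                acted.add(nid)
    missed = set(fired) - acted
    return {"fired": len(fired), "acted": len(acted), "missed": sorted(missed)}
```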

In VR, visual notifications can easily be overlooked. Capturing this type of design flaw, and other VR design issues, has been difficult with analytics. Mostly we have caught these issues by quietly observing players, but an analytic for when someone looks down, paired with another for whenever there is an on-screen prompt to press a button, could have been informative: people still look down to see where on the pad they need to press, even though they cannot see the pad.

We found that when people were playing in VR, their experience was more drawn out. Simple analytics for level start and each of the sub-goals give an objective timeline of gameplay. We found that people spent a lot of time just looking around in VR, which led us to build some spaces that added very little to the gameplay but offered more spectacle. They were warmly received – players love to explore.
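As an illustration, a few lines of Python can turn level-start and sub-goal events into that timeline; the event names are assumed for the example.

```python
def level_timeline(events):
    """events: time-ordered (timestamp, name) tuples such as
    ('level_start', ...) and ('subgoal_reached_archives', ...).
    Returns seconds elapsed from level start to each subsequent event."""
    timeline, start = [], None
    for t, name in events:
        if name == "level_start":
            start, timeline = t, [(0.0, name)]   # reset on each fresh attempt
        elif start is not None:
            timeline.append((t - start, name))
    return timeline
```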

Analytics cannot replace user testing, as it doesn’t capture how players feel. The two should be used in conjunction where possible, for instance to confirm whether players did or did not notice the tricks we employ. Sometimes user testing is the only way of capturing something that happens out of the blue, such as a player reacting very vocally to a situation, or understanding why they gave up on something.

In these early days of VR, talking with players after they have tested the game is important for design considerations at the lowest level. It’s a good idea to have your own rapid turnaround on new analytics events and on getting data back. We created our own system so we could analyse data as it was coming in. Once a game is published, we will rely on third-party analytics solutions, but having the flexibility to add a new analytic, generate a new query, and get a report on it all in under an hour, is valuable to us.
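The sketch below shows the shape of such a system in miniature – a single SQLite table plus ad-hoc SQL – purely to illustrate why the turnaround can be so fast; it is not our actual tooling, and the event and column names are made up.

```python
import json, sqlite3, time

# Minimal in-house event store: one table, queryable with plain SQL, so a new
# event type needs no schema change and a new query is a single statement.
db = sqlite3.connect("analytics.db")
db.execute("""CREATE TABLE IF NOT EXISTS events
              (ts REAL, session TEXT, name TEXT, payload TEXT)""")

def record(session, name, **payload):
    db.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
               (time.time(), session, name, json.dumps(payload)))
    db.commit()

# Example: how often is a hypothetical 'door_opened' event seen per session?
record("session-001", "door_opened", level="archives")
rows = db.execute("""SELECT session, COUNT(*) FROM events
                     WHERE name = 'door_opened' GROUP BY session""").fetchall()
print(rows)
```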

Checking frame rates

Most people working in VR are highly concerned about performance – hitting a consistent frame rate in every scenario and at every possible location in your environment is vital. Having a core set of frame-timing metrics is important for any studio working on a game with as much content as The Assembly.

It would be impossible for a tester to work through the whole game, gaze at every area and note down frame rates, so we set up an automated profiling service that plays through the game and records frame metrics such as CPU time, GPU time, draw calls, and other engine specifics. These are all fed back into an internal page that our content creators use to decide how they will approach fixing any performance issues in their levels. This mirrors performance profiling on many traditional titles, but we had to modify our process to capture these metrics for VR on machines without HMDs attached, while still getting as close to reality as possible.
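As a simplified illustration of the kind of report that page might consume, the Python sketch below flags capture points whose worst frame exceeds a budget; the CSV columns and the 11.1ms budget are assumptions for the example, not our actual targets.

```python
import csv
from collections import defaultdict

FRAME_BUDGET_MS = 11.1  # roughly 90fps; illustrative target only

def over_budget_locations(capture_path):
    """Read a per-frame capture (assumed CSV columns: location, cpu_ms, gpu_ms,
    draw_calls) from an automated playthrough and flag locations that blow the budget."""
    worst_cpu = defaultdict(float)
    worst_gpu = defaultdict(float)
    with open(capture_path, newline="") as f:
        for row in csv.DictReader(f):
            loc = row["location"]
            worst_cpu[loc] = max(worst_cpu[loc], float(row["cpu_ms"]))
            worst_gpu[loc] = max(worst_gpu[loc], float(row["gpu_ms"]))
    return {loc: (worst_cpu[loc], worst_gpu[loc])
            for loc in worst_gpu
            if max(worst_cpu[loc], worst_gpu[loc]) > FRAME_BUDGET_MS}
```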

In the beginning our baseline was simply to aim for double the frame rate, but now we use a fake HMD mode that puts the same stress on the CPU and GPU by forcing a large field of view, rendering twice, and applying post-render distortion. We use the data from the traditional rendering runs along with the data from the stereo render tests to build a profile of what affects stereo rendering the most for our game. We now have a set of project-specific guidelines on what things cost, and best practices for achieving the best results on our target platforms.
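To show the idea, here is a small Python sketch that ranks locations by how much the stereo run inflates GPU time relative to the mono run; the location names and figures are made up purely for illustration.

```python
def stereo_cost_ranking(mono_ms, stereo_ms):
    """Compare per-location GPU times from a traditional (mono) run and a
    fake-HMD stereo run, ranked by how much stereo inflates the cost."""
    ranking = []
    for loc, mono in mono_ms.items():
        if loc in stereo_ms and mono > 0:
            ranking.append((stereo_ms[loc] / mono, loc))
    return sorted(ranking, reverse=True)

# Hypothetical figures: the corridor scales worst here, so it gets optimised first.
mono = {"atrium": 5.2, "corridor": 4.0, "lab": 6.1}
stereo = {"atrium": 9.8, "corridor": 9.5, "lab": 10.9}
for ratio, loc in stereo_cost_ranking(mono, stereo):
    print(f"{loc}: stereo costs {ratio:.1f}x mono")
```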

Analytics and metrics are essential to making a polished game, and VR is no different in this respect. In these early days, user testing and debriefing players is still a highly valuable exercise. Knowing what to analyse in VR can be tricky, as there are many metrics that experienced developers no longer think about that turn out to be critical. But VR also offers new opportunities to track where players are looking, what catches their attention, how they’re moving their hands, and so on. We hope that taking advantage of these new opportunities will make a real difference as we move forwards.
