How Frontier is bringing Elite: Dangerous to VR

By James Batchelor

April 11th 2014 at 12:15PM

David Braben discusses what the studio is doing to adapt its upcoming game to virtual reality

Why did you decide to add VR support to Elite: Dangerous?
It was something requested by backers, but also something we wanted to do too. We have said from the start that we are writing this game for ourselves, and collectively we were excited by VR.

How did this affect the overall development of the game? What did you have to take into account when adding VR functionality?
It hasn’t especially affected the game’s development because it was such a natural fit. We had to bear in mind that most people would still play the game without VR. From our alpha test, nearly ten per cent of players played using Oculus Rift – it was a surprise to us that it was so high, given the kit that is out now is essentially a developer kit.

When supporting VR we looked at how we could embrace the extra functionality – so things like displays that pop up when you look at them add to the richness of the experience.

You're already reimagining a classic games franchise, but what challenges does reimagining it for VR add?
In my mind at least, I was already sitting in a cockpit and flying a spaceship with both the original Elite and with Frontier. VR brings the reality of it one step closer, so it was not really a challenge; very little ‘reimagining’ was needed.

What learnings could you use from your career so far to make VR development easier?
One of the first steps to VR is stereo vision, and this is something we have supported before at Frontier – in RollerCoaster Tycoon 3 back in 2004 – using red/green glasses. I think the key thing, though, is to make it feel natural. VR is slightly different to passive display devices like TVs, and in some ways is more akin to touch screen, as it is part display, part input device, and it is vital to remember this when making design decisions. 

What learnings actually hindered you? What practices did you have to discard because they didn't apply?
Some things that work fine on a TV screen, like third-person cameras, chase cameras etcetera, make no sense in VR.

Why does space simulation seem to be such a popular genre among the early VR demos?
I think it is a category that makes a lot of sense. The key thing is that the player expects to be seated, so there isn’t the awkward disconnect between avatar motion and player motion that can in some cases make you feel sick. I’m surprised we haven’t seen flight simulators as they should work very well too.

The genre lends itself to motion sickness: is there a danger that this could hold back VR from realising its full potential? What are you doing to compensate?
I think VR can lend itself to motion sickness, but it doesn’t have to. For me, VR works just fine most of the time, but if my in-game persona’s movement doesn’t match mine, I do feel strange. For example, looking over the balcony in Oculus felt wrong, as I naturally leaned out over the edge but the avatar didn’t match the motion (though Crystal Cove should help with this). Similarly, in the excellent “The Deep” demo on Morpheus, when the shark crashes into the cage and the whole cage is smashed backwards, I didn’t feel any movement as the cage swung, and I felt a bit odd.

In Elite: Dangerous all the player’s motions are reflected in the world. If the ship is knocked a little in a particular direction, the player’s head should not immediately move with it – in a similar way to pulling a tablecloth from under crockery, the crockery doesn’t move – but should then be gradually restored to its position, as that is what would happen in reality, and it gives the player’s inner ear time to accommodate. You can see the same effect in slow-motion footage of car crashes: the crash test dummies’ heads keep moving in the direction the car was travelling for a brief moment.
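The technique Braben describes can be sketched as follows – a minimal, hypothetical illustration only, not Frontier’s actual code. It assumes the head initially keeps its world position when the ship is knocked (so it acquires an offset equal to the ship’s displacement), then eases back toward the seat position along an exponential curve; the function names and the `restore_rate` constant are invented for the example.

```python
import math

def update_head_offset(offset, dt, restore_rate=4.0):
    """Exponentially restore the simulated head offset toward zero.

    The head does not snap with the ship when it is knocked; instead
    its offset from the seat position decays smoothly over time,
    giving the inner ear time to accommodate.
    """
    # Exponential decay: offset shrinks by a factor of e^(-rate * dt)
    return offset * math.exp(-restore_rate * dt)

def simulate_knock(impulse, steps=60, dt=1.0 / 60.0):
    """Ship is displaced by `impulse`; the head lags, then catches up.

    Returns the head offset at each frame of a one-second simulation.
    """
    offset = impulse  # head keeps its world position at first
    history = []
    for _ in range(steps):
        offset = update_head_offset(offset, dt)
        history.append(offset)
    return history
```

With `restore_rate=4.0`, the offset falls to under two per cent of the original knock after one second – the “tablecloth” lag followed by a gradual restore.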

Is there anything you're not quite able to accomplish with VR for Elite that you hope to be able to as the technology evolves?
Sound is a key part of the experience and one of the current challenges is whether to rotate the soundscape with the player’s head, or keep it locked to the ship. Both are arguably correct – depending on whether the player is wearing headphones (when we should rotate it) or the player is listening on a room-based 5.1 or 7.1 system (when we should not – it should stay locked to the ship, in other words in the same direction that the player’s body is facing).
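The distinction above can be expressed as a small routine – a hedged sketch, not Frontier’s implementation; the function name and parameters are hypothetical. With headphones, the soundscape must counter-rotate by the player’s head yaw so sounds stay fixed in the world; with room speakers, source directions stay locked to the ship.

```python
def sound_source_angle(source_yaw_ship, head_yaw, output="headphones"):
    """Yaw of a sound source in the listener's reference frame.

    source_yaw_ship: source direction in ship space, degrees
    head_yaw: the player's head rotation relative to the ship, degrees
    output: "headphones" (worn on the head) or "speakers" (room-fixed)
    """
    if output == "headphones":
        # Headphones rotate with the head, so counter-rotate the
        # soundscape to keep sources fixed relative to the ship/world.
        return (source_yaw_ship - head_yaw) % 360.0
    # Room speakers already sit in the ship/body frame: leave as-is.
    return source_yaw_ship % 360.0
```

For example, a source dead to the ship’s right (90°) should be heard straight ahead through headphones once the player turns their head 90° toward it, but remains at 90° on a 5.1 system.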

Resolution is also a factor. The change going from the 1280 x 720 display (i.e. 640 x 720 for each eye) of the earlier Oculus headsets to the 1920 x 1080 ones (960 x 1080) is very significant. It really needs the latter to be able to use menus well, and we have designed our interfaces with at least this VR resolution in mind.

Where do you see VR going in the future? Will it become as abundant as smartphones, as people have suggested? Why/why not?
I do see smartphones migrating into wearable tech, and this is a part of it, but it requires applications to migrate well too. It could well become ubiquitous, with ever better combined VR/AR as Google have already shown, but that will take a good few years to fully mature.

In the short term I see VR resolutions increasing further – to perhaps 4K (i.e. ~2K x 2K per eye) – and better accommodation for people who normally wear glasses.