Develop discovers what’s next for user interface design and development
There are few elements that govern gameplay as resolutely as user interfaces. They are the frameworks which give users the means to interact with computer entertainment.
Done right, they can elevate surprisingly simple games to the echelons of Top Ten lists and make the classics, years after their original release, feel as fresh as the day they first arrived.
But done wrong, the reputation of a game can be irreparably damaged. No amount of beautiful character animations or bombastic weapon effects can stand up to a vocal group of players frustrated by patchy or uninspired UI implementation.
The influx of motion control technology on consoles has again put user interface and experience at the heart of game design and development challenges.
And it’s not only new interfaces on consoles that developers must think about. Touch-screen computers, such as smartphones and tablets, present a whole new frontier of learning to be undertaken by today’s UI experts and game designers.
To get a sense of what the future holds for UI technology in games, Develop spoke to several UI tools specialists and developers about its strengths, challenges and trends.
One of the confounding problems with user interface and user experience is the multiplicity of definitions attached to them.
For the purposes of this feature, we will approach user interface and experience in terms of inputs and outputs – anything a user must do to interact with the system, be that at the hardware level with buttons, gesture or touch control, or at the software level with game menus, GUIs, HUDs, gameplay controls, in-game video tutorials and so forth.
UI THEN AND NOW
Cast your mind back to the 1980s and the birth of home consoles. Back then, games consoles such as the NES and Master System came with controllers limited to six buttons, giving games designers a slim palette around which to build entire user experiences.
Fast-forward to the present day, and user interface has become the main battleground for the hardware titans of the games business and the wider electronics industry.
Nintendo was the first to make serious gains with gesture-based interfaces with the launch of the Wii – the family-friendly console that sought to break down the barrier to entry caused by ‘overcomplicated’ gamepads.
The addition of Sony’s PlayStation Move and Microsoft’s Kinect has ensured that gesture control is here to stay. The latter had the news media convinced that the Star Trek-style Holodeck of the future was just around the corner for home entertainment.
Clearly, those predictions were a tad overzealous. Still, phenomenal progress has been made in UI design in recent times.
Kinect and other motion control devices represent a sizeable shift in how users interact with entertainment. But the hardware itself is only the beginning. Nicholas Atamas, a senior programmer at Epic Games, says the significance of UX must be accepted more widely among developers if we are to see advancements.
“The biggest challenge is user experience literacy,” he says. “Many games developers come into the industry saying, ‘I want to work on user-facing stuff’. When you ask them for specifics, you almost never hear excitement about building user experiences, traditional or otherwise. Only a select few have immersed themselves in the wealth of really valuable UX information that is out there.
“Many programmers think it is somehow beneath them, but when you ask them to design a simple UI on paper, they cannot do it. No amount of tech – short of an AI that designs the game’s user experience for you – is going to overcome that.”
The issue that Atamas raises about UX literacy is one reason why middleware tools facilitating the creation of user interfaces have become a necessity for many in the development community.
Scaleform is one of the most popular solutions, used in over one thousand products worldwide. Autodesk acquired it in 2011 for a reported $36 million. Autodesk Gameware’s senior manager of product marketing Greg Castle says he has witnessed the “increased adoption” of Scaleform as gesture and voice control grow in popularity.
The UI maker is a good barometer for where the wind is blowing in this often overlooked sector of game development. The job of UI tools providers has never been more important as developers hurry to find their feet with the latest user inputs.
Naturally, one of the key expectations from middleware is the flexibility to handle a multitude of emerging user interfaces. Alas, multi-platform, multi-discipline UI tools cannot automatically solve all UI challenges, as Crytek senior programmer Dean Classen explains.
“From the design side, to ensure a great interface on every platform and device, the ideal solution would be to design a completely unique UI for each,” offers Classen.
“Unfortunately, the investment required to achieve that is usually not feasible so we need to find the best overlap and re-use as much as possible.
“From the programming side of things, one of the main challenges is managing all the specific code for each platform and developing it in a way that is flexible enough not to require a major change every time interface design changes – which happens all the time.”
Knowing what you can achieve with the tools you have and the time afforded to you are the deciding factors. The way users interact with input devices, software menus and gameplay itself must be carefully and thoroughly considered.
Methods Crytek employs to keep its projects – and its clients’ – up to date include providing a game library, results from organised focus tests and third-party reviews of the current product it’s working on. Obvious though it may sound, listening to user feedback about UI is something to take seriously, particularly for those working on the newest forms of UI technology.
LIGHTS, CAMERA, INTERACTION
One of the problems that emerging user interfaces – such as gesture and touch control – present is a lack of recognised standards.
It’s taken developers years to master gamepad inputs. But over time controllers themselves have become more ergonomic and familiar control configurations have been adopted as cross-platform standards.
Experienced players are the audience most comfortable with these standards. They are used to gamepad inputs changing depending on genre, the context of the gameplay and the camera perspective they are experiencing it from.
Atamas agrees that familiar interfaces serve users’ needs best.
“Sometimes the best UI is one that is really familiar to most people; it is built out of familiar components and it gets out of your way while subtly calling your attention to all the right information,” he says.
“Incidentally, these are often the cheaper types of UIs to build well.”
The differences between Kinect, Move and the Wii are broad enough to make the creation of universal UI standards seem somewhat laughable. And Autodesk’s Castle stresses another point that complicates the switch to gesture and touch interfaces.
“The biggest challenge we see is adopting UI for various screen sizes and development platforms,” he states.
“The UI of a game on a TV with a controller versus a mobile with a touch screen is very different, and if you’re going to build a framework on which to develop UI, it needs to have ever increasing variety and flexibility.”
However, Player Research founder Graham McAllister believes that breaking standards should perhaps be encouraged when it comes to gameplay.
“If standards for motion and gesture controls were to exist, they would have to describe the best possible methods for interaction,” he says.
“To get there requires detailed analysis not only of controller interaction, but also of people. Without these standards, studios may design less-than-perfect solutions which cause repeated frustration, and push players away from the core game experience.”
The challenges for gesture technologies are numerous, but in the grand scheme they are another step towards more seamless technology that is both ergonomic in design and better integrated into our lives.
For the immediate future, UI experts at Epic, Autodesk and Crytek are in agreement that one technology is emerging as the next phase of UI development.
“The most prominent emerging technology is wearable displays,” offers Epic’s Atamas.
“The promise of a low-power, high-resolution, personalised display is going to make augmented reality very appealing. It will be really interesting to see the UI paradigms that emerge to support augmented reality.
"I expect that in the not-too-distant future we will be able to view all our work and gaming on a single wearable display device. While the UI paradigms will be largely some evolution of existing ones, we are going to be spending even more time looking at UI.”
Castle asserts that wearable technology will “cause a significant disruption in how we interact with gaming and technology at large”. He adds: “It’s early to identify exactly how this will manifest, but when game developers can register movements as inputs without a sensor you open up a whole new realm of possibilities for mobile gaming.”
Looking further into the future, another technology already being experimented with is brain control interfaces. NeuroSky and Emotiv are two of the companies that have produced headsets which read the user’s concentration levels and brainwaves and interpret them as input commands.
Crytek senior programmer Classen sees this as the most exciting development along with head-tracking VR goggles, which have been implemented in racing titles and action games.
“They’re getting more responsive and easier to integrate into small projects all the time,” he says enthusiastically.
Brain control is still at a point where it’s impressive the technology works at all, but the same can’t be said for motion control.
“The introduction of touch-less 3D interfaces was acclaimed as a huge revolution, but the magic that we all saw in those technologies didn’t hide the limitations forever,” surmises Jeremie Kletzkine, business development manager at PrimeSense.
“The lack of precision is a show-stopping restriction for many potential applications.”
The avenues for UI are plentiful. But Kletzkine’s comments, and others here, are a reminder that the original promise of gesture control is yet to be realised.
With the number of digital platforms still expanding, the consolidation of UI tools could indeed help developers. But equally there are those who feel pre-baked solutions and standardised control configurations will hamstring creativity.
Multi-touch, gesture control, brain control… the technology is evolving faster than people can become familiar with it.
UI will continue to fascinate and captivate, but, as Kletzkine says, the true challenge is in making the technology feel invisible.