Intel Developer Blog: Softtalkblog looks at what developers can learn from making the way we interact with devices more natural

Learning from the perceptual computing pioneers

As users demand a more intuitive and immersive experience from their devices, perceptual computing is at the forefront of many developers’ minds. The idea of Ultrabooks, smartphones and tablets perceiving a user’s actions through hand gestures, speech recognition and face tracking is at an exciting stage, and seven contestants taking part in the Intel Ultimate Coder Challenge are learning lessons on behalf of the industry, figuring out what works and what doesn’t.

One contestant, Lee Bamber, wants to rewrite the rule book on teleconferencing software. Dissatisfied with the apps available, he’s looking to develop the first of a long line of high-fidelity, super-accurate perceptual apps. As it stands, his app lifts the person out of their bedroom and places them in a virtual boardroom. Between coding sessions lasting more than 30 hours, Lee has shared some great personal insights into his journey. When using voice control in your perceptual computing app, he suggests tying core commands to longer, more distinctive words, so the speech engine is less likely to confuse them. He notes: “Using a word like ‘Return’ can produce a zillion variants, but a word like ‘Conference’ only returns one or two variants. Choosing the right command words in your app is essential if you want to avoid frustrated users.”
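To make that advice concrete, here is a minimal sketch in Python (not Lee’s actual code; every function and command name in it is hypothetical) of dispatching recognised speech to commands keyed on distinctive words:

```python
# A minimal sketch of Lee's command-word advice: bind long, distinctive
# words to actions so the speech engine has fewer near-matches to
# confuse. All names here are hypothetical.

def start_conference() -> None:
    print("Entering the virtual boardroom...")

def leave_conference() -> None:
    print("Leaving the conference.")

# Distinctive multi-syllable words ("conference") yield only one or two
# recognition variants; short common words ("return") can yield many,
# so we avoid them as triggers.
COMMANDS = {
    "conference": start_conference,
    "disconnect": leave_conference,
}

def handle_utterance(text: str) -> None:
    """Dispatch a recognised utterance to the first matching command."""
    lowered = text.lower()
    for word, action in COMMANDS.items():
        if word in lowered:
            action()
            return

handle_utterance("open the conference please")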

Eskil Steenberg of Quel Solaar is taking on a big issue in the Ultimate Coder Challenge. He is writing a piece of open source software that makes it easier for developers to use the diverse hardware available to them. He wants to make it possible for hardware vendors to experiment with new hardware configurations, without forcing developers to rewrite apps in order to take advantage of them.
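What that kind of abstraction might look like in practice: below is a speculative sketch, again in Python and purely illustrative (it is not Quel Solaar’s actual API), in which the app codes against a capability interface and vendors supply interchangeable implementations behind it:

```python
# A speculative sketch of the idea Eskil describes: apps ask for
# capabilities (e.g. hand tracking) rather than specific devices, so
# vendors can ship new hardware behind the same interface. All names
# here are hypothetical.
from abc import ABC, abstractmethod

class HandTracker(ABC):
    """Capability interface the app codes against, device-agnostic."""
    @abstractmethod
    def hand_position(self) -> tuple[float, float, float]:
        ...

class VendorXTracker(HandTracker):
    """A vendor plugs in its own implementation without app changes."""
    def hand_position(self) -> tuple[float, float, float]:
        return (0.0, 0.0, 0.5)  # stubbed sensor reading

def app_frame(tracker: HandTracker) -> None:
    x, y, z = tracker.hand_position()
    print(f"cursor at ({x}, {y}, {z})")

app_frame(VendorXTracker())  # swap in a new tracker; the app is unchanged
```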

Contestant number three, Peter O’Hanlon, is creating a gesture- and touch-driven photo editing app called Huda. While he doesn’t expect it to be finished by the end of the competition, a new take on a classic and popular type of app holds a lot of long-term potential. The app will be especially interesting for Ultrabook users, who, according to Peter, can use touch to ‘accomplish pretty much everything’. He has been designing around the input capabilities actually available to the user, something worth keeping in mind at every stage of any development process.

The Sixense team is made up of Danny, directing the project; Chip, the designer; Dan, providing the art for the app; and Ali, the code wrangler. Their app, a story game featuring a wolf and three little pigs (Surfer Pig, Lumberjack Pig and Suburban Pig), is coming together with intricate scenes and cute characters with real popularity potential. Judge Sascha Pallenberg points out that having an attractive character really helps people connect with your app.

Infrared5 is developing its game Kiwi Catapult Revenge, and acknowledges face tracking as one of its biggest challenges. The team has faced a number of hiccups along the way, and it will be interesting to see the similarities and differences between their initial vision of the game and the finished app.

The Simian Squared team is creating a virtual pottery wheel that lets users sculpt digital clay into beautiful works of art with their hands. Physical gestures and motions will mould, manipulate and paint the clay. An interesting challenge for the team has been recreating the textured clay itself, but in tackling this problem they discovered that perceptual computing isn’t about sticking exactly to reality. Bending the rules of perspective, light or shadow for impact can make the app feel better and (funnily enough) more natural for the user.

Finally, Code Monkeys have taken their soon-to-be-launched game, Stargate Gunship, and are using the Ultimate Coder Challenge to turn it into a perceptual game that fully immerses the player. Their initial aims of adding hand controls for firing, voice commands to switch weapons and gaze capture for targeting have thrown up some interesting challenges along the way. For instance, when deciding which gesture to use for firing, they realised it needed to be physically comfortable: in testing, the team found that holding their arms in the air to shoot quickly became tiring, so they are seeking an alternative action.

In their video blogs, the Code Monkeys team were asking all the right questions from the word ‘Go’: What do we want from perceptual computing? What do we expect from it? And most importantly, what do we want that we don’t expect? In perceptual computing, vision doesn’t just mean eye tracking.

We’re nearing the end of the Challenge, and one team or contestant will soon be crowned Ultimate Coder. Who do you expect to win? And what lessons have you learned by following them?
