Intel Developer Blog: Softtalkblog looks at HTML5, augmented reality and perceptual computing
The Game Developers Conference 2013 proved to be a great week for the coder community. With the Intel Perceptual Computing Developer Day, new developer tools and a number of contests and challenges, devs were treated to the latest technologies designed to help them reach their potential.
Perceptual computing lets machines recognise human interactions through streamlined control mechanisms such as facial recognition, voice commands and gesture swipes, providing a responsive experience tailored to the user.
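To make that idea concrete, here is a minimal, hypothetical sketch of how an app might route recognised perceptual inputs (gestures, voice commands, face events) to its own callbacks. The event names and the dispatcher API are illustrative assumptions for this post, not the Intel Perceptual Computing SDK's actual interface.

```python
# Hypothetical sketch: routing recognised perceptual inputs to app callbacks.
# Event names and the dispatcher API are illustrative, not the Intel SDK's.

from typing import Callable, Dict


class PerceptualDispatcher:
    """Maps recognised inputs (gestures, voice, faces) to app handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], None]] = {}

    def on(self, event: str, handler: Callable[[dict], None]) -> None:
        # Register a callback for an event such as "gesture.swipe_left".
        self._handlers[event] = handler

    def dispatch(self, event: str, data: dict) -> bool:
        # Invoke the matching handler, if any; report whether one fired.
        handler = self._handlers.get(event)
        if handler is None:
            return False
        handler(data)
        return True


# Example wiring: a swipe gesture advances a photo gallery.
actions = []
dispatcher = PerceptualDispatcher()
dispatcher.on(
    "gesture.swipe_left",
    lambda data: actions.append(("next_photo", data["confidence"])),
)
dispatcher.dispatch("gesture.swipe_left", {"confidence": 0.92})
```

The point of the sketch is that, from the app's perspective, a swipe or a spoken command becomes just another input event, exactly like a mouse click or key press would be.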
As perceptual computing is an increasingly hot topic among developers, it’s hardly surprising that the release of the production version of Intel’s Perceptual Computing SDK at GDC got tongues wagging.
When used alongside the Creative Interactive Gesture Camera, devs can use the SDK to fully integrate perceptual computing into next-generation Ultrabooks and PCs. And, excitingly for the coder community, it is readily available to anyone looking to build a commercial app with those capabilities.
The Intel Perceptual Computing Developer Day was held on the first day of the San Francisco conference. It gave coders the chance to try out the SDK for themselves. A room full of developers enjoyed the close-range hand and finger tracking, speech recognition, facial analysis, and augmented reality functions - and began planning next-gen apps for Windows 8 and Ultrabooks.
For an insight into the possibilities of perceptual computing, Intel brought Treasure Cave, a demo previously used only for internal testing, along to GDC. It created quite a stir, and demonstrated how a camera can essentially translate reality into the virtual world.
A few weeks ago, I looked at the Ultimate Coder Challenge, in which seven developers had seven weeks to create apps that utilised the new capabilities of the Intel Perceptual Computing SDK.
With their industry-leading experience and knowledge, all seven coders made their presence known at GDC, both in the Intel booth and in presentations on perceptual computing at the GDC Theatre.
Evidently, the interactions between a computer and its user have moved beyond the traditional mouse-and-keyboard setup. The gaming potential for this technology is immense – and clear.
But let’s think further than this. What about biometrics? Could perceptual computing be used to read a face for security purposes, for example? How about leveraging perceptual computing alongside speech and augmented reality? Let me know what potential you see in perceptual computing in the comment box below.
HTML5 was another prominent topic at GDC, especially with regard to the new HTML5 Development Environment. As a cloud-based, cross-platform HTML5 application development interface, it provides a faster turnaround for the development process.
Coders can use the HTML5 Development Environment to build an app and host it on a number of software platforms quickly. It's easy to use, free to get started with, and runs entirely in the web browser, where you can create, test and debug your app. You can read more about getting started with it here.
This is just a small selection of the lessons learned at GDC 2013. There are lots of PDFs of talks, tech sessions and workshops here.
HTML5, augmented reality and perceptual computing inspired the coders at GDC. How do you see them affecting your future apps?
This blog post is written by Softtalkblog, and is sponsored by the Intel Developer Zone, which helps you to develop, market and sell software and apps for prominent platforms and emerging technologies powered by Intel Architecture.