Via Nexus
-----
The Kinect for Windows team has announced that Microsoft is releasing PC-specific Kinect hardware in 2012, along with a Kinect SDK that will enable developers to build all sorts of apps for Kinect. Could motion-based interfaces eventually overtake touchscreens as the most intuitive way to interact with computers?

Microsoft earlier announced the beta release of the Kinect for Windows SDK, which enabled hackers and enthusiasts to build programs that take advantage of the Kinect's motion-sensing capabilities. Just recently, Kinect for Windows GM Craig Eisler announced that the company's commercial program will launch in 2012, and that it will involve not only software but also new Kinect hardware designed specifically for personal computers. The new hardware will take advantage of what PCs can do differently, and will address the drawbacks of repurposing a Kinect sensor that was designed for Microsoft's Xbox 360 console.
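To give a sense of what the beta SDK already exposes, here is a minimal sketch that polls the native skeleton stream and reads a tracked user's right-hand position. It assumes the C++ NuiApi surface of the Kinect for Windows SDK; the function and constant names follow that API, but error handling is pared down, so treat it as an illustration rather than production code.

```cpp
#include <windows.h>
#include <NuiApi.h>
#include <cstdio>

int main()
{
    // Start the Kinect runtime with skeletal tracking enabled.
    if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON))) return 1;
    HANDLE skeletonEvent = CreateEvent(nullptr, TRUE, FALSE, nullptr);
    NuiSkeletonTrackingEnable(skeletonEvent, 0);

    // Poll a handful of frames and print the right-hand position of any
    // tracked user, in skeleton-space meters.
    for (int i = 0; i < 100; ++i)
    {
        NUI_SKELETON_FRAME frame = {0};
        if (FAILED(NuiSkeletonGetNextFrame(100, &frame))) continue;

        for (int s = 0; s < NUI_SKELETON_COUNT; ++s)
        {
            const NUI_SKELETON_DATA& skel = frame.SkeletonData[s];
            if (skel.eTrackingState != NUI_SKELETON_TRACKED) continue;

            Vector4 hand =
                skel.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT];
            printf("right hand: x=%.2f y=%.2f z=%.2f\n",
                   hand.x, hand.y, hand.z);
        }
    }

    NuiShutdown();
    return 0;
}
```

Feeding joint positions like these into gesture recognizers is exactly the kind of work the commercial program is meant to support.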
In particular, the benefits of the upcoming hardware will include the following:
- A shorter USB cable to ensure reliable communication between the Kinect sensor and a broad range of computer hardware. A small dongle will be included as a USB hub, so that other USB peripherals can work alongside the Kinect.
- A new “Near Mode” that lowers the minimum sensing distance from the usual 6 to 8 feet down to 40 to 50 cm, so that users sitting in front of their computer can manipulate on-screen elements through movement even at close range (see the sketch after this list). Firmware optimizations will also make the Kinect more sensitive and responsive. Microsoft hopes these changes will lead to new applications such as virtual keyboards, hand-gesture controls, and even facial expressions used to manipulate programs on a computer.
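To make the Near Mode change concrete, the sketch below shows how a depth stream might be switched into the close-range mode. It assumes the native NuiApi surface and the NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE flag as shipped in the released SDK; since this article predates the final release, treat the exact names as illustrative.

```cpp
#include <windows.h>
#include <NuiApi.h>

int main()
{
    // Attach to the first Kinect sensor and initialize it for depth data.
    INuiSensor* sensor = nullptr;
    if (FAILED(NuiCreateSensorByIndex(0, &sensor))) return 1;
    if (FAILED(sensor->NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH))) return 1;

    // Open a 640x480 depth stream.
    HANDLE frameEvent = CreateEvent(nullptr, TRUE, FALSE, nullptr);
    HANDLE depthStream = nullptr;
    if (FAILED(sensor->NuiImageStreamOpen(
            NUI_IMAGE_TYPE_DEPTH, NUI_IMAGE_RESOLUTION_640x480,
            0, 2, frameEvent, &depthStream)))
        return 1;

    // Request Near Mode so depth readings begin around 40 cm instead of
    // the default range. The Xbox 360 sensor rejects this request, which
    // is one reason the PC-specific hardware matters.
    sensor->NuiImageStreamSetImageFrameFlags(
        depthStream, NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE);

    // ... wait on frameEvent and read frames with
    // NuiImageStreamGetNextFrame here ...

    sensor->NuiShutdown();
    sensor->Release();
    return 0;
}
```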
Microsoft is also launching the Kinect Accelerator, a new business incubator run through Microsoft BizSpark, through which the company will actively assist 10 companies in using the Kinect as a development platform. Microsoft sees businesses and end users taking advantage of Kinect well beyond gaming: this could include special apps for people with disabilities, special needs, or injuries, and could open the floodgates to other gesture- and motion-based UI engineering.
Will Kinect be the next big thing in user-experience engineering after multi-touch screens and accelerometers?