The new SDK for the Kinect sensor is now available for free download. This brings with it a whole ton of upgrades for those who want to make their computers more aware of their surroundings. There are new sensor modes and all kinds of good stuff. There is also a sizeable gallery of sample programs which you can just play with. This makes it worth a look even if you don’t intend to write any programs for the sensor, but just want to get a feel for the kinds of wonderful things it can do.
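And if you do fancy writing some code, getting data out of the sensor only takes a few lines. Here’s a minimal sketch using the managed v1 SDK (the Microsoft.Kinect assembly) that opens the depth stream and prints the depth at the centre pixel; error handling is trimmed for brevity, so treat it as a starting point rather than a finished program.

```csharp
// Minimal sketch: find a connected Kinect, stream depth frames,
// and report the distance at the centre of the image.
using System;
using System.Linq;
using Microsoft.Kinect;

class DepthDemo
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) return;

        sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
        sensor.DepthFrameReady += (o, e) =>
        {
            using (DepthImageFrame frame = e.OpenDepthImageFrame())
            {
                if (frame == null) return;
                short[] pixels = new short[frame.PixelDataLength];
                frame.CopyPixelDataTo(pixels);
                // Each value packs millimetre depth plus a player index
                // in the low bits; shift to get the raw distance.
                int mm = pixels[pixels.Length / 2]
                         >> DepthImageFrame.PlayerIndexBitmaskWidth;
                Console.WriteLine("Depth at centre: " + mm + " mm");
            }
        };
        sensor.Start();
        Console.ReadLine();   // stream until Enter is pressed
        sensor.Stop();
    }
}
```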
The highlight, which I’m really looking forward to playing with, is “Kinect Fusion”. This lets you use the sensor as a kind of handheld 3D scanner. You wave the Kinect around a scene and the program builds up a 3D model of what is in front of it. You’ll need a fairly beefy graphics card in your PC to make it work quickly (it uses the power of the GPU to crunch the scene data), but the results look really impressive. I’m really looking forward to printing little plastic models of me that I can give as Christmas presents…
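To give a flavour of what’s involved, here’s a rough C# sketch of the Fusion pipeline as exposed by the managed Microsoft.Kinect.Toolkit.Fusion API that ships with the SDK. I’m recalling the names from the SDK samples, so treat the exact signatures as approximate and check the Kinect Fusion samples in the gallery for the authoritative version.

```csharp
// Rough sketch of the Kinect Fusion loop (managed API, SDK 1.7-era).
// Names and signatures are approximate; see the SDK samples.
using Microsoft.Kinect;
using Microsoft.Kinect.Toolkit.Fusion;

class FusionSketch
{
    Reconstruction volume;
    FusionFloatImageFrame depthFloat = new FusionFloatImageFrame(640, 480);

    public FusionSketch()
    {
        // 256 voxels per metre in a 384^3 volume (assumed parameter order).
        var p = new ReconstructionParameters(256, 384, 384, 384);

        // ReconstructionProcessor.Amp runs the reconstruction on the GPU;
        // this is where the "beefy graphics card" requirement comes from.
        volume = Reconstruction.FusionCreateReconstruction(
            p, ReconstructionProcessor.Amp, -1, Matrix4.Identity);
    }

    // Call once per depth frame as you wave the sensor around.
    public void Integrate(DepthImagePixel[] depthPixels)
    {
        // Convert raw depth to metres, clipped to the sensor's range.
        volume.DepthToDepthFloatFrame(depthPixels, depthFloat,
            FusionDepthProcessor.DefaultMinimumDepth,
            FusionDepthProcessor.DefaultMaximumDepth, false);

        // Track the camera against the volume so far, then merge
        // the new frame into the reconstruction.
        volume.ProcessFrame(depthFloat,
            FusionDepthProcessor.DefaultAlignIterationCount,
            FusionDepthProcessor.DefaultIntegrationWeight,
            volume.GetCurrentWorldToCameraTransform());
    }

    // Export a mesh, e.g. to send to a 3D printer.
    public Mesh Finish() { return volume.CalculateMesh(1); }
}
```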
Here is a Kinect-like device from ASUS that comes with an SDK based on OpenNI (a standard set of libraries for handling natural interaction from different kinds of devices). It can be an interesting alternative to the Microsoft Kinect for research projects, and ships with examples in C++, C# and Java; a minimal capture sketch follows the specifications below.
Specifications
Power consumption: below 2.5 W
Distance of use: between 0.8 m and 3.5 m
Field of view: 58° horizontal, 45° vertical, 70° diagonal
Sensors: RGB camera, depth sensor, 2 microphones
Depth image size: VGA (640×480) at 30 fps; QVGA (320×240) at 60 fps
RGB image resolution: SXGA (1280×1024)
Platform: Intel x86 and AMD
OS support: Windows XP/Vista/7 (32/64-bit); Ubuntu 10.10 Linux (x86, 32/64-bit); Android (by request)
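For a feel of what OpenNI code looks like, here is a minimal C# sketch that reads one depth frame through the OpenNI 1.x .NET wrapper. I’m reconstructing the wrapper’s identifiers from memory of the C++ API they mirror, so treat the exact class and method names as approximate.

```csharp
// Minimal OpenNI 1.x sketch (.NET wrapper): create a context, add a
// depth generator, wait for a frame, and report its size.
// Identifier names assumed to mirror the C++ API; approximate.
using System;
using OpenNI;

class OpenNiDepthSketch
{
    static void Main()
    {
        Context context = new Context();
        DepthGenerator depth = new DepthGenerator(context);

        context.StartGeneratingAll();
        context.WaitOneUpdateAll(depth);   // block until a fresh depth frame

        DepthMetaData md = new DepthMetaData();
        depth.GetMetaData(md);             // fill-style call, as in C++
        Console.WriteLine("Depth frame: {0}x{1}", md.XRes, md.YRes);

        context.StopGeneratingAll();
    }
}
```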
Finger mouse: 3Gear uses depth-sensing cameras to track finger movements. (Image: 3Gear Systems)
Microsoft's Kinect, a 3-D camera and software for gaming, has made a big impact since its launch in 2010. Eight million devices were sold in the product's first two months on the market as people clamored to play video games with their entire bodies in lieu of handheld controllers. But while Kinect is great for full-body gaming, it isn't useful as an interface for personal computing, in part because its algorithms can't quickly and accurately detect hand and finger movements.
Now a San Francisco-based startup called 3Gear has developed a gesture interface that can track fast-moving fingers. Today the company will release an early version of its software to programmers. The setup requires two 3-D cameras positioned above the user, to the right and left.
The hope is that developers will create useful applications that will expand the reach of 3Gear's hand-tracking algorithms. Eventually, says Robert Wang, who cofounded the company, 3Gear's technology could be used by engineers to craft 3-D objects, by gamers who want precision play, by surgeons who need to manipulate 3-D data during operations, and by anyone who wants a computer to do her bidding with a wave of the finger.
One problem with gestural interfaces—as well as touch-screen desktop displays—is that they can be uncomfortable to use. They sometimes lead to an ache dubbed "gorilla arm." As a result, Wang says, 3Gear focused on making its gesture interface practical and comfortable.
"If I want to work at my desk and use gestures, I can't do that all day," he says. "It's not precise, and it's not ergonomic."
The key, Wang says, is to use two 3-D cameras above the hands. They are currently rigged on a metal frame, but eventually could be clipped onto a monitor. A view from above means that hands can rest on a desk or stay on a keyboard. (While the 3Gear software development kit is free during its public beta, which lasts until November 30, developers must purchase their own hardware, including cameras and frame.)
"Other projects have replaced touch screens with sensors that sit on the desk and point up toward the screen, still requiring the user to reach forward, away from the keyboard," says Daniel Wigdor, professor of computer science at the University of Toronto and author of Brave NUI World, a book about touch and gesture interfaces. "This solution tries to address that."
3Gear isn't alone in its desire to tackle the finer points of gesture tracking. Earlier this year, Microsoft released an update that enabled people who develop Kinect for Windows software to track head position, eyebrow location, and the shape of a mouth. Additionally, Israeli startup Omek, Belgian startup SoftKinetic, and a San Francisco startup called Leap Motion—which claims its small, single-camera system will track movements to a hundredth of a millimeter—are all jockeying for position in the fledgling gesture-interface market.
Small satellites capable of docking in orbit could be used as "space building blocks" to create larger spacecraft, says UK firm Surrey Satellite Technology. Not content with putting a smartphone app in space, the company now plans to launch a satellite equipped with a Kinect depth camera, allowing it to locate and join with other nearby satellites.
SpaceX's Dragon spacecraft is the latest of many large-scale vehicles to dock in space, but joining small and low-cost craft has not been attempted before. Surrey Satellite Technology's Strand-2 mission will launch two 30 cm-long satellites on the same rocket, then attempt to dock them by using the Kinect sensor to align them in 3D space.
"Once
you can launch low cost nanosatellites that dock together, the
possibilities are endless - like space building blocks," says project
leader Shaun Kenyon. For example, it might be possible to launch
the components for a larger spacecraft one piece at a time, then have
them automatically assemble in space.
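Strand-2's actual guidance software isn't public, but the core idea of depth-based alignment is simple to illustrate: back-project each depth pixel into 3D through a pinhole camera model and take the centroid of the visible points as a crude estimate of the target's relative position. The C# sketch below is a hypothetical toy, not the mission's algorithm; the focal length and all names are placeholders.

```csharp
// Toy illustration only (not Strand-2's real guidance code):
// estimate a target's relative position from a depth image by
// back-projecting pixels through an assumed pinhole model and
// averaging the resulting 3D points.
static class DepthAlignSketch
{
    public struct Vec3 { public double X, Y, Z; }

    public static Vec3 EstimateTargetPosition(ushort[] depthMm, int width, int height)
    {
        const double focalPx = 575.0;            // placeholder focal length, pixels
        double cx = width / 2.0, cy = height / 2.0;
        double sx = 0, sy = 0, sz = 0;
        int n = 0;

        for (int v = 0; v < height; v++)
        {
            for (int u = 0; u < width; u++)
            {
                double z = depthMm[v * width + u] / 1000.0;  // mm -> metres
                if (z <= 0) continue;                        // no depth reading
                // Back-project pixel (u, v) at depth z into camera space.
                sx += (u - cx) * z / focalPx;
                sy += (v - cy) * z / focalPx;
                sz += z;
                n++;
            }
        }

        // Centroid of all visible points: a crude relative-position
        // estimate a controller could drive toward zero lateral offset.
        if (n == 0) return new Vec3();
        return new Vec3 { X = sx / n, Y = sy / n, Z = sz / n };
    }
}
```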