Tuesday, March 19, 2013

Kinect and Leap Motion in Love

Natural User Interfaces have been a trend for a few years now, and the development of new hardware at affordable prices is helping to push that movement into the consumer market. The Kinect is one example; the soon-to-be-available Leap Motion and the announced MYO device are others.

I'm one of the lucky people who applied for and actually got a Leap Motion developer device before its release. I also have a Kinect for Windows. Both devices use computer vision with various sensors, combined with machine learning for object recognition. The Kinect for Windows recognizes skeletons with joints at a distance of 80 - 400 cm, or 40 - 300 cm in Near Mode, which makes the Kinect perfect for mid-range distances but not for close PC interaction. The Leap Motion, on the other hand, is made for close range and detects hands, fingers and pen-like objects very precisely. That's why I wanted to combine them and build a proof-of-concept hybrid solution that leverages both the Kinect and the Leap Motion to get the best of both worlds (close and mid distance).
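To illustrate the basic idea, here is a minimal sketch of how the two data sources could be arbitrated per frame: prefer the Leap Motion's precise close-range hand data when it is available and fall back to the Kinect's skeleton hand joint otherwise. The wrapper functions readLeapHand and readKinectHand are hypothetical placeholders (the stubs below stand in for the real SDK calls); only the fallback logic is the point here.

```cpp
// Minimal sketch of the hybrid idea, not the actual demo code.
// readLeapHand() and readKinectHand() are hypothetical wrappers around
// the two SDKs; the stubs below just stand in for the real calls.
#include <cstdio>

struct HandSample {
    float x, y, z;   // position in millimetres (sensor-specific origin)
    bool fromLeap;   // true if the sample came from the Leap Motion
};

// Hypothetical wrapper: returns true and fills 'out' while the Leap
// tracks a hand inside its close-range field of view.
bool readLeapHand(HandSample& out) {
    (void)out;
    return false;    // stub - replace with Leap SDK calls
}

// Hypothetical wrapper: returns true and fills 'out' while the Kinect
// tracks a skeleton hand joint (roughly 80 - 400 cm away).
bool readKinectHand(HandSample& out) {
    (void)out;
    return false;    // stub - replace with Kinect for Windows SDK calls
}

// Per frame, pick the best available source: Leap for close range,
// Kinect as the mid-range fallback.
bool readBestHand(HandSample& out) {
    if (readLeapHand(out))   { out.fromLeap = true;  return true; }
    if (readKinectHand(out)) { out.fromLeap = false; return true; }
    return false;            // no hand visible to either sensor
}

int main() {
    HandSample hand;
    if (readBestHand(hand))
        std::printf("Hand at (%.0f, %.0f, %.0f) mm from %s\n",
                    hand.x, hand.y, hand.z,
                    hand.fromLeap ? "Leap Motion" : "Kinect");
    else
        std::printf("No hand tracked by either sensor\n");
    return 0;
}
```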

In the video below I demo my proof of concept, which is based on the Kinect for Windows SDK 1.7 with its Interactions and Controls components, plus the Leap Motion developer device and its beta SDK.
By the way, the Kinect for Windows SDK 1.7 is now publicly available, and the bits are nothing less than awesome.
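For those curious about the Leap side, this is roughly what receiving hand data from the Leap's C++ API (Controller, Listener, Frame, Hand) looks like. It is a small illustrative sketch, not the code behind the demo, and the beta SDK used here may differ slightly from later releases.

```cpp
#include <iostream>
#include "Leap.h"

// Prints the palm position of the first tracked hand on every frame.
class HandListener : public Leap::Listener {
public:
    virtual void onFrame(const Leap::Controller& controller) {
        const Leap::Frame frame = controller.frame();
        if (frame.hands().isEmpty())
            return;
        const Leap::Hand hand = frame.hands()[0];
        const Leap::Vector palm = hand.palmPosition();  // millimetres
        std::cout << "Palm at (" << palm.x << ", " << palm.y << ", "
                  << palm.z << ") with " << hand.fingers().count()
                  << " fingers visible" << std::endl;
    }
};

int main() {
    Leap::Controller controller;
    HandListener listener;
    controller.addListener(listener);  // onFrame runs on a Leap thread

    std::cin.get();                    // keep tracking until Enter is pressed
    controller.removeListener(listener);
    return 0;
}
```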




In case you are wondering about infrared interference between the Kinect and the Leap Motion: there is some, but I didn't notice any serious effect on the recognition capabilities of either the Kinect or the Leap.
Below is a picture of the Leap Motion as seen through the Kinect's infrared stream. The Leap Motion has three light sources, but those are not very bright.



Some might say the Leap Motion is a competitor to the Kinect for Windows; I see it more as a nice addition to our developer toolset and the beginning of a story: Kinect ♥ Leap Motion.
