The Flash way
Some background info: with as3kinect you can access the skeleton and depth map data from OpenNI directly in Flash, and track 14 points of the body in real time. What you don't get yet are gestures (with the exception of a touch gesture), so we had to do some custom math to recognize things like "swipe", "push", "pull" and so on. The NITE middleware is intended to detect such gestures, but its gesture detection features are not used in as3kinect (yet).
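To give an idea of what that "custom math" looks like, here is a minimal sketch of a swipe detector. The class name, thresholds and units are our own assumptions, not part of as3kinect: a swipe fires when the tracked hand point travels far enough along the x axis within a short time window.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical swipe detector: fires when the hand's x position moves
// more than MIN_DISTANCE within WINDOW_MS. Thresholds are illustrative.
public class SwipeDetector {
    private static final long WINDOW_MS = 500;      // look-back window
    private static final float MIN_DISTANCE = 0.3f; // assumed scale: meters

    private static final class Sample {
        final long t; final float x;
        Sample(long t, float x) { this.t = t; this.x = x; }
    }

    private final Deque<Sample> samples = new ArrayDeque<>();

    /** Feed one hand x sample; returns "swipeleft", "swiperight" or null. */
    public String update(long timeMs, float x) {
        samples.addLast(new Sample(timeMs, x));
        // drop samples older than the look-back window
        while (timeMs - samples.peekFirst().t > WINDOW_MS) {
            samples.removeFirst();
        }
        float dx = x - samples.peekFirst().x;
        if (dx > MIN_DISTANCE) { samples.clear(); return "swiperight"; }
        if (dx < -MIN_DISTANCE) { samples.clear(); return "swipeleft"; }
        return null;
    }
}
```

Clearing the sample buffer after a hit is a cheap way to avoid firing the same swipe several frames in a row.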
There is an amazing video demonstrating a browser plugin called DepthJS, but that was all we could find built with open web techniques. So here is what we did: David and Dan built a page with HTML5 WebSockets and the jQuery HTML5 Carousel. The client simply connects to the server and listens for messages like "push", "swipeleft" and "swiperight".
For the server side, we wanted to use Processing and therefore needed a Java binding for the Kinect signal. We found Simple-OpenNI for Processing and added the "push" and "swipe" detectors ourselves. To install it, you first need the Kinect drivers, OpenNI and NITE; we had success following the SensorKinect guide.
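The "push" detector works along the other axis: in OpenNI's convention z is the distance from the sensor, so a push is simply the hand's z value dropping fast. A minimal sketch, again with our own hypothetical names and thresholds rather than a Simple-OpenNI API:

```java
// Hypothetical push detector: fires when the hand's z distance to the
// sensor drops by more than MIN_DELTA_Z within WINDOW_MS, i.e. the hand
// is thrust toward the camera. Thresholds are illustrative.
public class PushDetector {
    private static final float MIN_DELTA_Z = 200f; // mm toward the sensor
    private static final long WINDOW_MS = 400;     // max time for the thrust

    private long lastTime = -1;
    private float lastZ;

    /** Feed one hand z sample (mm); returns true when a push is detected. */
    public boolean update(long timeMs, float z) {
        boolean push = lastTime >= 0
                && timeMs - lastTime <= WINDOW_MS
                && lastZ - z > MIN_DELTA_Z;
        lastTime = timeMs;
        lastZ = z;
        return push;
    }
}
```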
To provide the data over WebSockets, we needed a WebSocket implementation on the Java side. At first we found only bloated solutions that brought Tomcat or even JBoss into the game and would have made it needlessly complicated to just open a socket connection. After wasting some time trying to "quickly" implement it ourselves (the handshake is really pretty freakish), we finally found the Java-WebSocket library, which suited our needs perfectly.
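To illustrate one of the fiddly handshake details a library spares you: in the standardized protocol (RFC 6455), the server must answer the client's `Sec-WebSocket-Key` header with a `Sec-WebSocket-Accept` value computed by appending a fixed GUID, SHA-1 hashing, and base64 encoding. A sketch using only the JDK (class and method names are our own):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class Handshake {
    // Fixed GUID defined by the WebSocket spec (RFC 6455)
    private static final String GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11";

    /** Compute the Sec-WebSocket-Accept value for a client's key. */
    public static String acceptKey(String secWebSocketKey) throws Exception {
        MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
        byte[] digest = sha1.digest(
                (secWebSocketKey + GUID).getBytes(StandardCharsets.US_ASCII));
        return Base64.getEncoder().encodeToString(digest);
    }
}
```

And that is only the easy part; framing, masking and the earlier draft handshakes are where it gets really freakish.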
The code we created is available on GitHub.
Here is a shaky demo video:
Edit: Asus will soon bring a NUI device for PCs to market. Thx for the hint, @sandro!