Kinect is a motion sensing input device by Microsoft for the Xbox 360 video game console and Windows PCs. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands. The project is aimed at broadening the Xbox 360's audience beyond its typical gamer base. Kinect competes with the Wii Remote Plus and PlayStation Move with PlayStation Eye motion controllers for the Wii and PlayStation 3 home consoles, respectively. A version for Windows was released on February 1, 2012.

Why another Kinect report, when Greg Borenstein has recently published a book on Kinect, Making Things See?

There are three reasons:

1. I would like to focus more on practical software library issues: comparing latency across the various existing libraries, possible future updates (e.g. skeleton tracking without the user recognition phase, mouth tracking, and finger tracking), and wish lists.

2. At the time the book was being written, the sound localization and voice recognition libraries were not open to the public. It is only a matter of time before they are, there is a great deal of research going on among Kinect hackers, and I would like to discuss these possibilities.

3. Plenty of working examples and code optimized for ITPiers.

First project: Skeleton Tracking analysis, a graph of each joint's XYZ position

Synapse: works far better than OSCeleton, at least on OS X 10.6.

What is Synapse? http://synapsekinect.tumblr.com/post/6305020721/download

Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC.
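
To give a feel for the protocol, here is a minimal Processing sketch (using the oscP5 and netP5 libraries) that polls Synapse for the right hand and prints its position. The ports and address patterns follow my reading of the Synapse docs, so verify them against the page above:

```java
// Minimal sketch: listen to Synapse joint data over OSC.
// Ports and address patterns are assumptions from the Synapse docs.
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress synapse;

void setup() {
  osc = new OscP5(this, 12345);                 // Synapse sends joint data here
  synapse = new NetAddress("127.0.0.1", 12346); // Synapse listens here
}

void draw() {
  // Synapse stops sending a joint unless you re-request it every few seconds.
  if (frameCount % 60 == 0) {
    OscMessage keepAlive = new OscMessage("/righthand_trackjointpos");
    keepAlive.add(1);                           // 1 = body-relative coordinates
    osc.send(keepAlive, synapse);
  }
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/righthand_pos_body")) {
    float x = m.get(0).floatValue();
    float y = m.get(1).floatValue();
    float z = m.get(2).floatValue();
    println("right hand: " + x + ", " + y + ", " + z);
  }
}
```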

What is OSCeleton? https://github.com/Sensebloom/OSCeleton

OSCeleton is a proxy that sends skeleton information collected from the Kinect sensor via OSC, making it easier to use input from the device in any language / framework that supports the OSC protocol.
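
For comparison, a minimal oscP5 sketch for OSCeleton's /joint messages. The argument layout (joint name, user id, x, y, z) comes from the OSCeleton README, and 7110 is its default port; the units of the coordinates depend on the flags OSCeleton is started with:

```java
// Minimal sketch: receive OSCeleton skeleton data over OSC.
import oscP5.*;

OscP5 osc;

void setup() {
  osc = new OscP5(this, 7110);   // OSCeleton's default output port
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/joint")) {
    String joint = m.get(0).stringValue(); // e.g. "r_hand", "head"
    int user     = m.get(1).intValue();
    float x      = m.get(2).floatValue();
    float y      = m.get(3).floatValue();
    float z      = m.get(4).floatValue();
    println("user " + user + " " + joint + ": " + x + " " + y + " " + z);
  }
}
```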

My first attempt at drawing the XYZ position of every joint, to see the accuracy and responsiveness of Kinect skeleton tracking. All programmed in Max 6:

Download the actual patch here: Kinect Graph

Inside of p body.joints: simply double-click on the patcher.

Inside of p joint.hits: simply double-click on the patcher. In the patcher you can see whether the current joint has moved UDLRFB (up, down, left, right, front, back); based on the movement, a yellow light will blink at the assigned location.
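
Roughly, the hit logic amounts to comparing each joint's current position against its previous one and firing when the change along one axis dominates. Here is that idea sketched in Processing-style code; the threshold is an illustrative value, not the one used in the Max patch:

```java
// Sketch of the UDLRFB idea from p joint.hits: compare a joint's current
// position to its previous one and report the dominant direction when the
// per-axis change exceeds a threshold.
float px, py, pz;              // previous joint position
final float THRESH = 0.05f;    // movement threshold (illustrative)

String hitDirection(float x, float y, float z) {
  float dx = x - px, dy = y - py, dz = z - pz;
  px = x; py = y; pz = z;
  String dir = null;           // null means "no hit this frame"
  if (abs(dx) >= abs(dy) && abs(dx) >= abs(dz) && abs(dx) > THRESH) {
    dir = dx > 0 ? "right" : "left";
  } else if (abs(dy) >= abs(dz) && abs(dy) > THRESH) {
    dir = dy > 0 ? "up" : "down";
  } else if (abs(dz) > THRESH) {
    dir = dz > 0 ? "back" : "front";  // signs depend on the coordinate system
  }
  return dir;
}
```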

The depth map is visualized here using color gradients from white (near) to blue (far).

This infrared image shows the laser grid Kinect uses to calculate depth.

The Kinect sensor is a horizontal bar connected to a small base with a motorized pivot and is designed to be positioned lengthwise above or below the video display. The device features an "RGB camera, depth sensor and multi-array microphone running proprietary software", which provide full-body 3D motion capture, facial recognition and voice recognition capabilities.

What is OpenNI?

OpenNI is an open source API that is publicly available at http://www.OpenNI.org.

OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API that enables communication with both:

1. Vision and audio sensors (the devices that "see" and "hear" the figures and their surroundings).

2. Vision and audio perception middleware (the software components that analyze the audio and visual data recorded from the scene, and comprehend it). For example, software that receives visual data, such as an image, and returns the location of the palm of a hand detected within the image.
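
To make this concrete, here is a minimal depth-image sketch using simple-openni, the Processing wrapper around OpenNI that Making Things See also uses. It assumes the wrapper and the OpenNI/NITE binaries are installed:

```java
// A minimal sketch, assuming the simple-openni Processing wrapper is
// installed. It opens a depth generator and draws the depth image.
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();       // ask OpenNI for a depth generator
}

void draw() {
  context.update();            // fetch the next frame from the sensor
  image(context.depthImage(), 0, 0);
}
```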

Using the Kinect Depth Camera with OpenCV

Sam Muscroft has successfully incorporated the raw depth, RGB depth map, and RGB output from the Kinect sensor into an OpenCV project using the Windows CL NUI Platform.

RGB output, as you'd expect, is stored in an 8-bit, 3-channel matrix. Depth needs to be stored in a 16-bit, single-channel matrix.

He found the easiest way to output the data (depth & RGB) was to create an image header of the appropriate bit depth and number of channels and populate it with the data returned from the open-source Kinect API.
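
As a sketch of those matrix types (not Muscroft's actual code), here is how they would look in OpenCV's Java bindings; the empty buffers stand in for frames that would come from the Kinect API in practice:

```java
// Sketch of the matrix types described above, using OpenCV's Java bindings:
// 8-bit/3-channel for RGB, 16-bit/1-channel for depth. 640x480 is the
// Kinect's default frame size.
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;

public class KinectMats {
  public static void main(String[] args) {
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

    Mat rgb   = new Mat(480, 640, CvType.CV_8UC3);  // 8-bit, 3 channels
    Mat depth = new Mat(480, 640, CvType.CV_16UC1); // 16-bit, 1 channel

    // Copy one frame's worth of raw data into the matrices; in a real
    // project these buffers would be filled by the Kinect API.
    byte[]  rgbFrame   = new byte[480 * 640 * 3];
    short[] depthFrame = new short[480 * 640];
    rgb.put(0, 0, rgbFrame);
    depth.put(0, 0, depthFrame);
  }
}
```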

How Motion Detection Works in Xbox Kinect

Camera

Kinect's camera is powered by both hardware and software, and it does two things: generate a three-dimensional (moving) image of the objects in its field of view, and recognize (moving) human beings among those objects.

Older software programs used differences in color and texture to distinguish objects from their backgrounds. PrimeSense, the company whose tech powers Kinect, and Canesta, a recent Microsoft acquisition, use a different model: the camera transmits invisible near-infrared light and measures what comes back after it reflects off the objects.

One way to do this, used by Canesta, is time of flight, which works like sonar: if you know how long the light takes to return, you know how far away an object is. Cast a big field, with lots of pings going back and forth at the speed of light, and you can know how far away a lot of objects are.

Using an infrared generator also partially solves the problem of ambient light. Since the sensor isn’t designed to register visible light, it doesn’t get quite as many false positives.

PrimeSense and Kinect take this a step further and rely on structured light rather than timing: they encode a known pattern in the near-IR light, and as that pattern is reflected back, parts of it are deformed by the objects it hits. Decoding that deformation yields a finer image of the objects' 3-D texture, not just their depth.
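
The decoded result is a raw 11-bit depth value per pixel rather than a physical distance. A commonly cited approximation from the OpenKinect community converts it to meters; it is an empirical fit, not an official calibration:

```java
// Convert a raw 11-bit Kinect depth reading to meters, using the widely
// circulated OpenKinect approximation (an empirical fit).
float rawDepthToMeters(int raw) {
  if (raw < 2047) {
    return 1.0f / (raw * -0.0030711016f + 3.3309495161f);
  }
  return 0;  // 2047 means "no reading" for this pixel
}
```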

OpenKinect in Processing

For those of you who simply want to get things going with Kinect and Processing, the following link is a good starting point:

http://www.shiffman.net/p5/kinect/
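
A minimal depth-image example with Dan Shiffman's library looks roughly like this; the method names follow the version of the library available at the time of writing, so check the page above if the API has since changed:

```java
// A minimal sketch against the openkinect Processing library. Method names
// are assumed from the library version current at the time of writing.
import org.openkinect.*;
import org.openkinect.processing.*;

Kinect kinect;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.start();
  kinect.enableDepth(true);    // turn on the depth stream
}

void draw() {
  image(kinect.getDepthImage(), 0, 0);
}

void stop() {
  kinect.quit();               // shut the device down cleanly
  super.stop();
}
```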

Also, some information about OS X Lion + Max 5 + OpenNI + OSCeleton.

The latest CMake update works perfectly with OS X Lion and Max 5. http://www.cmake.org/cmake/resources/software.html

The latest MacPorts update works perfectly with OS X Lion and Max 5. http://www.macports.org/

The latest OpenNI update works perfectly with OS X Lion and Max 5.

OpenNI Unstable Build for Mac OS X 10.6 Universal x86/x64 (32/64-bit) v1.3.2.3: http://www.openni.org/downloadfiles/opennimodules/openni-binaries/20-latest-unstable

PrimeSense NITE Unstable Build for Mac OS X 10.6 Universal x86/x64 (32/64-bit) v1.4.1.2: http://www.openni.org/downloadfiles/opennimodules/openni-compliant-middleware-binaries/33-latest-unstable

More information about an optional output, the joint rotation data, can be found here: https://github.com/rabidgremlin/OSCeleton-Puppet

OSCeleton uses 26% CPU on a Mac mini.

The updated avin2/SensorKinect driver:

https://github.com/avin2/SensorKinect