Reports.Kinect History

Changed lines 22-31 from:
to:
Synapse: works much better than OSCeleton, at least on OS X 10.6:

What is Synapse? http://synapsekinect.tumblr.com/post/6305020721/download

Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC.

What is OSCeleton? https://github.com/Sensebloom/OSCeleton

OSCeleton is a proxy that sends skeleton information collected from the Kinect sensor via OSC, making it easier to use input from the device in any language / framework that supports the OSC protocol.
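
Because both Synapse and OSCeleton simply stream OSC messages, any OSC-capable environment can consume the skeleton data. Here is a minimal C++ receiver sketch built on the oscpack library; it assumes OSCeleton's default /joint message layout (joint name, user id, then x/y/z floats) and its default port 7110, so check your OSCeleton flags if you have changed either:

(:source lang=cpp:)
#include <cstring>
#include <iostream>
#include "osc/OscPacketListener.h"
#include "ip/UdpSocket.h"

// Listens for OSCeleton-style messages: /joint <name> <userId> <x> <y> <z>
class JointListener : public osc::OscPacketListener {
protected:
    virtual void ProcessMessage(const osc::ReceivedMessage& m,
                                const IpEndpointName&) {
        try {
            if (std::strcmp(m.AddressPattern(), "/joint") != 0) return;
            osc::ReceivedMessage::const_iterator arg = m.ArgumentsBegin();
            const char* joint = (arg++)->AsString();
            osc::int32 user   = (arg++)->AsInt32();
            float x = (arg++)->AsFloat();
            float y = (arg++)->AsFloat();
            float z = (arg++)->AsFloat();
            std::cout << "user " << user << " " << joint << ": "
                      << x << ", " << y << ", " << z << "\n";
        } catch (osc::Exception& e) {
            std::cerr << "bad /joint message: " << e.what() << "\n";
        }
    }
};

int main() {
    JointListener listener;
    UdpListeningReceiveSocket sock(
        IpEndpointName(IpEndpointName::ANY_ADDRESS, 7110), &listener);
    sock.RunUntilSigInt();  // blocks until Ctrl-C
    return 0;
}
(:sourceend:)

Compile against oscpack, start Synapse or OSCeleton, and the joint positions print as they arrive.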
Deleted line 139:
Added lines 105-131:


Also, some information about OS X Lion + Max 5 + OpenNI + OSCeleton.

The latest CMake update works perfectly with OS X Lion and Max 5:
http://www.cmake.org/cmake/resources/software.html

The latest MacPorts update works perfectly with OS X Lion and Max 5:
http://www.macports.org/

The latest OpenNI unstable builds also work with OS X Lion and Max 5:

http://www.openni.org/downloadfiles/opennimodules/openni-binaries/20-latest-unstable
OpenNI Unstable Build for MacOSX 10.6 Universal x86/x64 (32/64-bit) v1.3.2.3

http://www.openni.org/downloadfiles/opennimodules/openni-compliant-middleware-binaries/33-latest-unstable
PrimeSense NITE Unstable Build for MacOSX 10.6 Universal x86/x64 (32/64-bit) v1.4.1.2

You will find more information about an optional output, the joint rotation data, here:
https://github.com/rabidgremlin/OSCeleton-Puppet

OSCeleton uses 26% CPU on a Mac mini.

The updated avin2/SensorKinect driver:

https://github.com/avin2/SensorKinect
Changed lines 95-105 from:
PrimeSense and Kinect go one step further and encode information in the near-IR light. As that information is returned, some of it is deformed — which in turn can help generate a finer image of those objects’ 3-D texture, not just their depth.
to:
PrimeSense and Kinect go one step further and encode information in the near-IR light. As that information is returned, some of it is deformed — which in turn can help generate a finer image of those objects’ 3-D texture, not just their depth.


'''OpenKinect in Processing'''

For those who simply want to get things going with Kinect and Processing, the following link is a good starting point:

http://www.shiffman.net/p5/kinect/
Changed line 28 from:
[[here:http://dl.dropbox.com/u/3084507/Sensor%20workshop/KinectXYZgraphTrigger.maxpat | Kinect Graph]]
to:
here: [[http://dl.dropbox.com/u/3084507/Sensor%20workshop/KinectXYZgraphTrigger.maxpat | Kinect Graph]]
Changed line 28 from:
here:http://dl.dropbox.com/u/3084507/Sensor%20workshop/KinectXYZgraphTrigger.maxpat
to:
[[here:http://dl.dropbox.com/u/3084507/Sensor%20workshop/KinectXYZgraphTrigger.maxpat | Kinect Graph]]
Changed line 9 from:
!! Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://shop.oreilly.com/product/0636920020684.do | Make Things See]]
to:
!! Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://shop.oreilly.com/product/0636920020684.do | Making Things See]]
Changed line 9 from:
!! Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://shop.oreilly.com/product/0636920020684.do]
to:
!! Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://shop.oreilly.com/product/0636920020684.do | Make Things See]]
Changed line 7 from:
(:end lang:)
to:
(:sourceend:)
Changed lines 4-7 from:
to:
(:source lang=arduino:)


(:end lang:)
Added line 31:
Changed lines 33-34 from:
'+in the patcher you can see whether current joint has moved UDLRFB(Up, Down, Left, Right, Front, Back) and based on the movement the yellow light
will brink on the assigned location+'
to:
'+in the patcher you can see whether the current joint has moved UDLRFB (Up, Down, Left, Right, Front, Back)+'
'+and based on the movement the yellow light will blink at the assigned location+'
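
For anyone who wants the same direction logic outside of Max, here is a hypothetical C++ sketch of it; the 5 cm threshold and the axis convention (+x right, +y up, +z away from the camera) are assumptions, not values taken from the patch:

(:source lang=cpp:)
#include <cmath>
#include <cstdio>

// Classify a joint's frame-to-frame movement as Up/Down/Left/Right/
// Front/Back by taking the dominant axis of the position delta.
const char* direction(float dx, float dy, float dz, float thresh = 0.05f) {
    float ax = std::fabs(dx), ay = std::fabs(dy), az = std::fabs(dz);
    if (ax < thresh && ay < thresh && az < thresh) return "still";
    if (ax >= ay && ax >= az) return dx > 0 ? "Right" : "Left";
    if (ay >= ax && ay >= az) return dy > 0 ? "Up" : "Down";
    return dz > 0 ? "Back" : "Front";
}

int main() {
    std::printf("%s\n", direction(0.01f, 0.12f, 0.02f));  // prints "Up"
}
(:sourceend:)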
Changed line 29 from:
inside of p joint.hits : simply double-click on the patcher
to:
'+inside of p body.joints : simply double-click on the patcher+'
Added lines 31-34:
'+inside of p joint.hits: simple double-click on the patcher+'
'+in the patcher you can see whether current joint has moved UDLRFB(Up, Down, Left, Right, Front, Back) and based on the movement the yellow light
will brink on the assigned location+'
Changed lines 7-19 from:
]


!!
There are 3 reasons:

!! 1. I would like to focus more on actual software library issues - comparing latency issues with various existing libraries + possible future updates(ex.skeleton without user recognition phase, mouth tracking and finger tracking) + wish lists.

!! 2. By the time the book was being written, the sound localization detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.

!! 3. A lot of working examples and codes optimized for ITPiers


'+1.project+'
to:
There are 3 reasons:

1. I would like to focus more on actual software library issues: comparing latency across the various existing libraries, possible future updates (e.g. skeleton tracking without a user-recognition phase, mouth tracking, and finger tracking), and wish lists.

2. When the book was being written, the sound-localization and voice-recognition libraries were not open to the public. It is only a matter of time before they are released, there is much research going on among Kinect hackers, and I would like to discuss these possibilities.

3. A lot of working examples and code optimized for ITPiers.


'+1st project: Skeleton Tracking analysis, XYZ joint-position graph+'
Changed lines 18-19 from:
!! So expect to see a lot of codes being updated very soon!!!!
to:
'+1.project+'
Changed lines 21-22 from:
'''My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking'''''
to:
'''My first attempt to draw XYZ of every joint part to see the accuracy and responsiveness of Kinect Skeleton Tracking''''
Changed line 20 from:
''
to:
Deleted line 11:
Changed lines 20-23 from:
!! '''My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking

ALl programmed in MAX6:
'''
to:
''
'''My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking'''''

All programmed in MAX6:
Changed lines 21-24 from:
'+My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking

ALl programmed in MAX6:
+'
to:
!! '''My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking

ALl programmed in MAX6:'''
Changed lines 21-22 from:
My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking
to:
'+My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking
Changed line 24 from:
to:
+'
Added line 30:
inside of p joint.hits : simply double-click on the patcher
Added lines 27-28:

http://dl.dropbox.com/u/3084507/Sensor%20workshop/programOverview.png
Added lines 24-26:

Download the actual patch
here:http://dl.dropbox.com/u/3084507/Sensor%20workshop/KinectXYZgraphTrigger.maxpat
Changed lines 26-29 from:

http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph2..png

http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph3.png
to:
http://dl.dropbox.com/u/3084507/Sensor%20workshop/jointTrigger.png
http://dl.dropbox.com/u/3084507/Sensor%20workshop/GraphDetail.png
http://dl.dropbox.com/u/3084507/Sensor%20workshop/KinectGraphAll.png
Changed lines 29-31 from:
http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph3..png
to:
http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph3.png
Added lines 26-29:

http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph2..png

http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph3..png
Changed line 25 from:
http://stu.itp.nyu.edu/home/bhl236/Sensor%20Workshop/kinect%20graph1.png
to:
http://dl.dropbox.com/u/3084507/Sensor%20workshop/kinect%20graph1.png
Changed line 25 from:
sftp://bhl236@stu.itp.nyu.edu/home/bhl236/Sensor%20Workshop/kinect%20graph1.png
to:
http://stu.itp.nyu.edu/home/bhl236/Sensor%20Workshop/kinect%20graph1.png
Changed line 25 from:
http://stu.itp.nyu.edu/home/bhl236/Sensor%20Workshop/kinect%20graph1.png
to:
sftp://bhl236@stu.itp.nyu.edu/home/bhl236/Sensor%20Workshop/kinect%20graph1.png
Changed line 25 from:
Attach:00.png
to:
http://stu.itp.nyu.edu/home/bhl236/Sensor%20Workshop/kinect%20graph1.png
Changed line 25 from:
Attach:kinect graph1.png
to:
Attach:00.png
Changed lines 25-31 from:
attach:kinect graph1.png

to:
Attach:kinect graph1.png
Changed line 76 from:
PrimeSense and Kinect go one step further and encode information in the near-IR light. As that information is returned, some of it is deformed — which in turn can help generate a finer image of those objects’ 3-D texture, not just their depth.
to:
PrimeSense and Kinect go one step further and encode information in the near-IR light. As that information is returned, some of it is deformed — which in turn can help generate a finer image of those objects’ 3-D texture, not just their depth.
Changed lines 23-26 from:
ALl programmed in MAX6
to:
ALl programmed in MAX6:

attach:kinect graph1.png
Added lines 20-25:

My first attempt to draw XYZ of every joint parts to see the accuracy and responsiveness of Kinect Skeleton Tracking

ALl programmed in MAX6
Changed lines 10-12 from:
!! There are 2 reasons:
to:
!! There are 3 reasons:
Added lines 18-19:

!! So expect to see a lot of codes being updated very soon!!!!
Changed line 17 from:
!! 3. A lot of working examples and codes suitable for ITPiers
to:
!! 3. A lot of working examples and codes optimized for ITPiers
Added line 17:
!! 3. A lot of working examples and codes suitable for ITPiers
Changed line 15 from:
!! 2. By the time the book was being written, the sound location detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
to:
!! 2. By the time the book was being written, the sound localization detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
Changed lines 13-14 from:
!! 1. I would like to focus more on actual software library issues - comparing latency issues with various existing libraries + possible future updates(ex.skeleton with out body recognition phase, mouth tracking and finger tracking) + wish lists.
to:
!! 1. I would like to focus more on actual software library issues - comparing latency issues with various existing libraries + possible future updates(ex.skeleton without user recognition phase, mouth tracking and finger tracking) + wish lists.
Changed line 14 from:
2. By the time the book was being written, the sound location detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
to:
!! 2. By the time the book was being written, the sound location detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
Changed lines 10-12 from:
!! there are 2 reasons:

1. I would like to focus more on actual software library issues - comparing latency issues with various existing libraries + possible future updates(ex.skeleton with out body recognition phase, mouth tracking and finger tracking) + wish lists.
to:
!! There are 2 reasons:


!!
1. I would like to focus more on actual software library issues - comparing latency issues with various existing libraries + possible future updates(ex.skeleton with out body recognition phase, mouth tracking and finger tracking) + wish lists.
Changed lines 15-16 from:
'''Strong'''
to:
Changed lines 9-10 from:
there are 2 reasons:
to:

!!
there are 2 reasons:
Changed lines 13-14 from:
2. By the time the book was being written, the sound location detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
to:
2. By the time the book was being written, the sound location detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
'''Strong'''
Changed lines 5-12 from:
Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://example.com/]]
to:

!!
Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://shop.oreilly.com/product/0636920020684.do]
]

there are 2 reasons:

1. I would like to focus more on actual software library issues - comparing latency issues with various existing libraries + possible future updates(ex.skeleton with out body recognition phase, mouth tracking and finger tracking) + wish lists.
2. By the time the book was being written, the sound location detection + voice recognition libraries were not open to public. It is a matter of time for it to be updated and there are much research going on among Kinect hackers and I would like to discuss about these possibilities.
Changed line 5 from:
Why another Kinect report after Greg Borenstein has recently published a book on Kinect, "[[Making Things See]]"
to:
Why another Kinect report after Greg Borenstein has recently published a book on Kinect, [[http://example.com/]]
Added lines 2-5:



Why another Kinect report after Greg Borenstein has recently published a book on Kinect, "[[Making Things See]]"
Changed lines 41-50 from:
http://www.wired.com/images_blogs/gadgetlab/2010/11/Canesta-01.png
to:
http://www.wired.com/images_blogs/gadgetlab/2010/11/Canesta-01.png


Older software programs used differences in color and texture to distinguish objects from their backgrounds. PrimeSense, the company whose tech powers Kinect, and recent Microsoft acquisition Canesta use a different model. The camera transmits invisible near-infrared light and measures its “time of flight” after it reflects off the objects.

Time-of-flight works like sonar: If you know how long the light takes to return, you know how far away an object is. Cast a big field, with lots of pings going back and forth at the speed of light, and you can know how far away a lot of objects are.

Using an infrared generator also partially solves the problem of ambient light. Since the sensor isn’t designed to register visible light, it doesn’t get quite as many false positives.

PrimeSense and Kinect go one step further and encode information in the near-IR light. As that information is returned, some of it is deformed — which in turn can help generate a finer image of those objects’ 3-D texture, not just their depth.
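
To make the sonar analogy concrete, the distance math is just the round-trip time of the light cut in half. The 13.3 ns figure below is an illustrative number, not a Kinect spec:

(:source lang=cpp:)
#include <cstdio>

int main() {
    // One-way distance = speed of light * round-trip time / 2.
    const double c = 299792458.0;   // speed of light, m/s
    double roundTrip = 13.3e-9;     // a 13.3 ns round trip...
    std::printf("distance: %.2f m\n", c * roundTrip / 2.0);  // ...is ~1.99 m
}
(:sourceend:)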
Changed lines 37-41 from:
http://www.wired.com/images_blogs/gadgetlab/2010/11/Canesta-howitworks1.jpg
to:
http://www.wired.com/images_blogs/gadgetlab/2010/11/Canesta-howitworks1.jpg

'''Camera'''
Kinect’s camera is powered by both hardware and software. And it does two things: generate a three-dimensional (moving) image of the objects in its field of view, and recognize (moving) human beings among those objects.
http://www.wired.com/images_blogs/gadgetlab/2010/11/Canesta-01.png
Changed lines 33-37 from:
http://itp.nyu.edu/physcomp/sensors/uploads/kinect%20C
to:


'''How Motion Detection Works in Xbox Kinect'''

http://www.wired.com/images_blogs/gadgetlab/2010/11/Canesta-howitworks1.jpg
Changed lines 33-34 from:

Attach
:kinect C
to:
http://itp.nyu.edu/physcomp/sensors/uploads/kinect%20C
Changed line 34 from:
Attach:sftp://bhl236@stu.itp.nyu.edu/home/bhl236/public_html/biomechanics/Screen%20Shot%202012-02-23%20at%206.10.48%20PM.png
to:
Attach:kinect C
Changed lines 34-35 from:
Using the C++ API from nuigroup in Windows, the data can be accessed using the following:
sftp://bhl236@stu.itp.nyu.edu/home/bhl236/public_html/biomechanics/Screen%20Shot%202012-02-23%20at%206.10.48%20PM.png
to:
Attach:sftp://bhl236@stu.itp.nyu.edu/home/bhl236/public_html/biomechanics/Screen%20Shot%202012-02-23%20at%206.10.48%20PM.png
Changed line 35 from:
http://stu.itp.nyu.edu/home/bhl236/public_html/biomechanics/Screen%20Shot%202012-02-23%20at%206.10.48%20PM.png
to:
sftp://bhl236@stu.itp.nyu.edu/home/bhl236/public_html/biomechanics/Screen%20Shot%202012-02-23%20at%206.10.48%20PM.png
Changed line 35 from:
to:
http://stu.itp.nyu.edu/home/bhl236/public_html/biomechanics/Screen%20Shot%202012-02-23%20at%206.10.48%20PM.png
Added lines 17-19:

'''OpenNI is an open source API that is publicly available at http://www.OpenNI.org.'''
Changed lines 24-31 from:
OpenNI supplies a set of APIs to be implemented by the sensor devices, and a set of APIs to be implemented by the middleware components. By breaking the dependency between the sensor and the middleware, OpenNI’s API enables applications to be written and ported with no additional effort to operate on top of different middleware modules (“write once, deploy everywhere”). OpenNI's API also enables middleware developers to write algorithms on top of raw data formats, regardless of which sensor device has produced them, and offers sensor manufacturers the capability to build sensors that power any OpenNI compliant application.
The OpenNI standard API enables natural-interaction application developers to track real-life (3D) scenes by utilizing data types that are calculated from the input of a sensor (for example, representation of a full body, representation of a hand location, an array of the pixels in a depth map and so on). Applications can be written regardless of the sensor or middleware providers.
OpenNI is an open source API that is publicly available at http://www.OpenNI.org.

'''

Using the Kinect Depth Camera with OpenCV'''
to:

'''Using the Kinect Depth Camera with OpenCV'''
Deleted lines 35-45:
// Raw Depth Data
PUSHORT rawData = (PUSHORT) malloc(640*480*3);
GetNUICameraDepthFrameRAW(KinectCamera, rawData)
In OpenCV you can then use the data this way:


kinectDepthImage = cvCreateImage( cvSize(640,480), 16, 1);
cvSetData(kinectDepthImage, rawData, kinectDepthImage->widthStep);

cvReleaseImageHeader(&kinectDepthImage);
You may find it useful to convert the raw depth data to something approximating physical distance, by using a formula published by Nicolas Burrus or Stéphane Magnenat.
Changed lines 15-16 from:
'''What is OpenNI?
'''
to:
''' What is OpenNI?'''
Added lines 24-47:

'''
Using the Kinect Depth Camera with OpenCV'''


Sam Muscroft has successfully incorporated the raw depth, RGB depth map, and RGB output from the Kinect sensor into an OpenCV project using the Windows CL NUI Platform.

RGB output, as you'd expect, is stored in an 8-bit, 3-channel matrix. Depth needs to be stored in a 16-bit, 1-channel matrix.

He found the easiest way to output the data (depth & RGB) was to create an image header of the appropriate bit depth and number of channels and populate it with the data returned from the open-source Kinect API.

Using the C++ API from nuigroup in Windows, the data can be accessed using the following:

// Raw depth data: one 16-bit value per pixel (the buffer is
// allocated generously here; 640*480*2 bytes are actually needed)
PUSHORT rawData = (PUSHORT) malloc(640*480*3);
GetNUICameraDepthFrameRAW(KinectCamera, rawData);
In OpenCV you can then use the data this way:


// Wrap the raw buffer in an image *header* (no pixel copy);
// cvCreateImageHeader matches the cvReleaseImageHeader below,
// whereas cvCreateImage would leak its own pixel allocation.
IplImage* kinectDepthImage = cvCreateImageHeader(cvSize(640,480), IPL_DEPTH_16U, 1);
cvSetData(kinectDepthImage, rawData, kinectDepthImage->widthStep);

cvReleaseImageHeader(&kinectDepthImage);
You may find it useful to convert the raw depth data to something approximating physical distance, by using a formula published by Nicolas Burrus or Stéphane Magnenat.
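
For reference, here is Stéphane Magnenat's widely circulated approximation for the 11-bit raw values of the 640x480 depth stream. The constants are his empirical fit, not an official calibration, so treat the result as approximate:

(:source lang=cpp:)
#include <cmath>

// Stephane Magnenat's fit from a raw 11-bit Kinect depth value to
// meters; the sentinel value 2047 means "no reading" for that pixel.
double rawDepthToMeters(unsigned short raw) {
    if (raw < 2047)
        return 0.1236 * std::tan(raw / 2842.5 + 1.1863);
    return 0.0;
}
(:sourceend:)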
Changed lines 15-23 from:
'''Open source drivers'''
to:
'''What is OpenNI?
'''
OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API that enables communication with both:
* Vision and audio sensors (the devices that 'see' and 'hear' the figures and their surroundings).
* Vision and audio perception middleware (the software components that analyze the audio and visual data that is recorded from the scene, and comprehend it). For example, software that receives visual data, such as an image, and returns the location of the palm of a hand detected within the image.

OpenNI supplies a set of APIs to be implemented by the sensor devices, and a set of APIs to be implemented by the middleware components. By breaking the dependency between the sensor and the middleware, OpenNI’s API enables applications to be written and ported with no additional effort to operate on top of different middleware modules (“write once, deploy everywhere”). OpenNI's API also enables middleware developers to write algorithms on top of raw data formats, regardless of which sensor device has produced them, and offers sensor manufacturers the capability to build sensors that power any OpenNI compliant application.
The OpenNI standard API enables natural-interaction application developers to track real-life (3D) scenes by utilizing data types that are calculated from the input of a sensor (for example, representation of a full body, representation of a hand location, an array of the pixels in a depth map and so on). Applications can be written regardless of the sensor or middleware providers.
OpenNI is an open source API that is publicly available at http://www.OpenNI.org.
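
As a minimal sketch of what this looks like in code (assuming OpenNI 1.x's C++ wrapper and an installed Kinect driver such as avin2/SensorKinect, with error handling trimmed to bare status checks):

(:source lang=cpp:)
#include <cstdio>
#include <XnCppWrapper.h>  // OpenNI 1.x C++ API

int main() {
    xn::Context context;
    if (context.Init() != XN_STATUS_OK) return 1;

    // A depth generator node produces the 640x480 depth stream.
    xn::DepthGenerator depth;
    if (depth.Create(context) != XN_STATUS_OK) return 1;

    context.StartGeneratingAll();
    context.WaitOneUpdateAll(depth);  // block until a fresh depth frame

    const XnDepthPixel* map = depth.GetDepthMap();  // depth in millimeters
    std::printf("depth at center: %u mm\n", (unsigned) map[240 * 640 + 320]);

    context.Release();
    return 0;
}
(:sourceend:)

The same context can also host user and skeleton generators, which is what middleware like NITE plugs into.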
Changed lines 1-3 from:
Kinect is a motion sensing input device by Microsoft for the Xbox 360 video game console and Windows PCs. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands. The project is aimed at broadening the Xbox 360's audience beyond its typical gamer base. Kinect competes with the Wii Remote Plus and PlayStation Move with PlayStation Eye motion controllers for the Wii and PlayStation 3 home consoles, respectively. A version for Windows was released on February 1, 2012
to:
'''Kinect is a motion sensing input device by Microsoft''' for the Xbox 360 video game console and Windows PCs. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands. The project is aimed at broadening the Xbox 360's audience beyond its typical gamer base. Kinect competes with the Wii Remote Plus and PlayStation Move with PlayStation Eye motion controllers for the Wii and PlayStation 3 home consoles, respectively. A version for Windows was released on February 1, 2012.
Changed lines 14-15 from:
[[<iframe src="http://player.vimeo.com/video/706938?title=0&amp;byline=0&amp;portrait=0" width="400" height="251" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe><p><a href="http://vimeo.com/706938">Reverse Shadow Theatre</a> from <a href="http://vimeo.com/gabor">gabor papp</a> on <a href="http://vimeo.com">Vimeo</a>.</p>]]
to:
'''Open source drivers'''
Changed lines 13-15 from:
<iframe src="http://player.vimeo.com/video/706938?title=0&amp;byline=0&amp;portrait=0" width="400" height="251" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe><p><a href="http://vimeo.com/706938">Reverse Shadow Theatre</a> from <a href="http://vimeo.com/gabor">gabor papp</a> on <a href="http://vimeo.com">Vimeo</a>.</p>

http://www.gamersmint.com/wp-content/uploads/2010/10/0908-kinect_full_600.jpg
to:
http://www.gamersmint.com/wp-content/uploads/2010/10/0908-kinect_full_600.jpg
[[<iframe src
="http://player.vimeo.com/video/706938?title=0&amp;byline=0&amp;portrait=0" width="400" height="251" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe><p><a href="http://vimeo.com/706938">Reverse Shadow Theatre</a> from <a href="http://vimeo.com/gabor">gabor papp</a> on <a href="http://vimeo.com">Vimeo</a>.</p>]]
Added lines 12-13:

<iframe src="http://player.vimeo.com/video/706938?title=0&amp;byline=0&amp;portrait=0" width="400" height="251" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe><p><a href="http://vimeo.com/706938">Reverse Shadow Theatre</a> from <a href="http://vimeo.com/gabor">gabor papp</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
Deleted line 9:
http://3.bp.blogspot.com/_lLRlbYm8ZEc/TSc5VjNrdjI/AAAAAAAAACw/CplHiAPyg0Y/s1600/xBox-360-kinect-indepth.jpg
Changed lines 1-3 from:
Kinect is a motion sensing input device by Microsoft for the Xbox 360 video game console and Windows PCs. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands.[10] The project is aimed at broadening the Xbox 360's audience beyond its typical gamer base.[11] Kinect competes with the Wii Remote Plus and PlayStation Move with PlayStation Eye motion controllers for the Wii and PlayStation 3 home consoles, respectively. A version for Windows was released on February 1, 2012
to:
Kinect is a motion sensing input device by Microsoft for the Xbox 360 video game console and Windows PCs. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands. The project is aimed at broadening the Xbox 360's audience beyond its typical gamer base. Kinect competes with the Wii Remote Plus and PlayStation Move with PlayStation Eye motion controllers for the Wii and PlayStation 3 home consoles, respectively. A version for Windows was released on February 1, 2012
Changed lines 10-11 from:
to:
http://3.bp.blogspot.com/_lLRlbYm8ZEc/TSc5VjNrdjI/AAAAAAAAACw/CplHiAPyg0Y/s1600/xBox-360-kinect-indepth.jpg
The Kinect sensor is a horizontal bar connected to a small base with a motorized pivot and is designed to be positioned lengthwise above or below the video display. The device features an "RGB camera, depth sensor and multi-array microphone running proprietary software", which provide full-body 3D motion capture, facial recognition and voice recognition capabilities.
Added lines 6-8:

This infrared image shows the laser grid Kinect uses to calculate depth
http://upload.wikimedia.org/wikipedia/commons/thumb/7/76/Kinect2-ir-image.png/220px-Kinect2-ir-image.png
Added lines 1-6:
Kinect is a motion sensing input device by Microsoft for the Xbox 360 video game console and Windows PCs. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands.[10] The project is aimed at broadening the Xbox 360's audience beyond its typical gamer base.[11] Kinect competes with the Wii Remote Plus and PlayStation Move with PlayStation Eye motion controllers for the Wii and PlayStation 3 home consoles, respectively. A version for Windows was released on February 1, 2012


The depth map is visualized here using color gradients from white (near) to blue (far)
http://upload.wikimedia.org/wikipedia/commons/thumb/9/90/Kinect2-deepmap.png/220px-Kinect2-deepmap.png
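
A hypothetical version of that white-to-blue mapping, for anyone rendering their own depth maps; the 0.5-4 m clipping range is an assumption, not something taken from the image above:

(:source lang=cpp:)
#include <algorithm>
#include <cstdint>

struct RGB { std::uint8_t r, g, b; };

// Map depth to a white (near) -> blue (far) gradient by fading the
// red and green channels out as distance grows.
RGB depthToColor(float meters) {
    float t = (meters - 0.5f) / 3.5f;        // 0.5 m..4 m -> 0..1
    t = std::max(0.0f, std::min(1.0f, t));
    std::uint8_t fade = static_cast<std::uint8_t>(255.0f * (1.0f - t));
    return RGB{fade, fade, 255};             // t=0 -> white, t=1 -> blue
}
(:sourceend:)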
Deleted line 0:
Added lines 1-5:



http://3.bp.blogspot.com/_lLRlbYm8ZEc/TSc5VjNrdjI/AAAAAAAAACw/CplHiAPyg0Y/s1600/xBox-360-kinect-indepth.jpg
Changed line 1 from:
http://www.xbox.com/Kinect/Entertainment
to:
http://www.gamersmint.com/wp-content/uploads/2010/10/0908-kinect_full_600.jpg
Changed line 1 from:
sdfsdf
to:
http://www.xbox.com/Kinect/Entertainment
Added line 1:
sdfsdf