Reports.MicrosoftKinect History


Changed lines 98-100 from:
[[http://jmpelletier.com/freenect/|""jit.freenect.grab""]] by Jean-Marc Pelletier
-depth sensor & RGB camera only
[[http://dtr.noisepages.com/2011/02/2-methods-for-undistorting-the-kinect-depth-map-in-maxjitter/|""2 methods for undistorting the Kinect depth map in Max/Jitter""]]
to:
[[http://jmpelletier.com/freenect/|"jit.freenect.grab"]] by Jean-Marc Pelletier\\
'''depth sensor & RGB camera only'''\\
[[http://dtr.noisepages.com/2011/02/2-methods-for-undistorting-the-kinect-depth-map-in-maxjitter/|"2 methods for undistorting the Kinect depth map in Max/Jitter"]]\\
Changed lines 3-4 from:
The Kinect is a depth (3D) camera. Regular cameras collect the light that bounces off the objects around them. In addition to having a regular camera, the Kinect also has an infrared (IR) projector and camera. The IR projector projects IR light so that the IR camera can then "see" how far objects are from it (see below).
to:
The Kinect is a motion sensing input device by Microsoft, originally designed for the Xbox 360 video game console. Its primary sensor is an infrared (IR) pair: an IR projector casts a pattern of IR light into the room, the light bounces back, and an IR camera reads the returned pattern to work out how far away each surface is (this is how distances are detected; see below). Additionally, the Kinect has an RGB camera (a webcam, basically) with a maximum resolution of 640 x 480.
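To make the distance measurement concrete, here is a minimal Processing sketch (a sketch only, assuming the Simple OpenNI library listed under Sources; the function names follow the 0.27-era releases and changed in later versions). It displays the depth image and prints the distance, in millimeters, of whatever is in front of the center of the sensor.

[@
import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();                      // turn on the IR depth stream
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);          // grayscale depth image

  int[] depthValues = kinect.depthMap();     // one distance per pixel, in millimeters
  int centerIndex = 320 + 240 * 640;         // the pixel at the center of the 640 x 480 frame
  println("distance at center: " + depthValues[centerIndex] + " mm");
}
@]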
Added lines 92-101:

!!![++Skeletal Tracking++]

[[http://synapsekinect.tumblr.com/|'''Synapse''']]

!!![++Max/MSP/Jitter++]
[[http://jmpelletier.com/freenect/|""jit.freenect.grab""]] by Jean-Marc Pelletier
-depth sensor & RGB camera only
[[http://dtr.noisepages.com/2011/02/2-methods-for-undistorting-the-kinect-depth-map-in-maxjitter/|""2 methods for undistorting the Kinect depth map in Max/Jitter""]]
Changed lines 12-13 from:
!!!Sources
to:
!!![++Sources++]
Changed lines 24-25 from:
!!!Accessories
to:
!!![++Accessories++]
Changed lines 30-31 from:
!!!Applications
to:
!!![++Applications++]
Changed lines 36-37 from:
!!!Technical Characteristics
to:
!!![++Technical Characteristics++]
Changed lines 52-53 from:
!!!Hardware Requirements
to:
!!![++Hardware Requirements++]
Changed lines 63-64 from:
!!!Software Requirements
to:
!!![++Software Requirements++]
Changed lines 75-76 from:
!!!Setting up your Kinect with Simple OpenNI and Processing
to:
!!![++Setting up your Kinect with Simple OpenNI and Processing++]
Changed lines 88-89 from:
!!!Code Samples
to:
!!![++Code Samples++]
Changed lines 94-95 from:
!!!Books
to:
!!![++Books++]
Changed lines 100-101 from:
!!![+Typical Behavior+]
to:
!!![++Typical Behavior++]
Changed lines 113-114 from:
!!!Application Notes
to:
!!![++Application Notes++]
Changed lines 89-92 from:
[[http://www.apress.com/9781430238881|'''Meet the Kinect: An Introduction to Programming Natural User Interfaces''']] by Sean Kean , Jonathan Hall , Phoenix Perry


!!!Typical Behavior
to:
[[http://www.apress.com/9781430238881|'''Meet the Kinect: An Introduction to Programming Natural User Interfaces''']] by Sean Kean, Jonathan Hall, Phoenix Perry


!!![+Typical Behavior+]
Changed line 88 from:
[[http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly\\
to:
[[http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly
Changed line 88 from:
[["http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly\\
to:
[[http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly\\
Changed line 88 from:
[["http://code.google.com/p/simple-openni/http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly\\
to:
[["http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly\\
Changed line 106 from:
[[http://lilyszajnberg.com/blog/?page_id=470|'''How to get set up with the Kinect''']\\
to:
[[http://lilyszajnberg.com/blog/?page_id=470|'''How to get set up with the Kinect''']]\\
Changed lines 83-85 from:
You can find all of Greg’s Making Things See examples on his [[https://github.com/atduskgreg/Making-Things-See-Examples|GitHub page]].
to:
You can find all of Greg’s Making Things See examples on his [[https://github.com/atduskgreg/Making-Things-See-Examples|'''GitHub page''']].
Changed line 108 from:
[[http://motionassessment.com/|'''Kinect Abnormal Motion Assessment System''']]\\
to:
[[http://motionassessment.com/|'''Kinect Abnormal Motion Assessment System''']]
Changed lines 25-27 from:
NYKO Zoom for Kinect | $16
to:
[[http://www.amazon.com/Zoom-Kinect-Xbox-360/dp/B0050SYS5A/ref=sr_1_3?ie=UTF8&qid=1329858800&sr=8-3|'''NYKO Zoom for Kinect''']] | $16
Changed lines 62-63 from:
*Visual Basic using Microsoft Visual Studio
to:
*[[http://en.wikipedia.org/wiki/Visual_Basic|'''Visual Basic''']] using [[http://en.wikipedia.org/wiki/Microsoft_Visual_Studio_2010#Visual_Studio_2010|'''Microsoft Visual Studio''']]
Changed lines 71-73 from:
The simple OpenNI for Processing project home, which includes all support files and download versions, can be found here:
http://code.google.com/p/simple-openni/
to:
The Simple OpenNI for Processing project home, which includes all support files and download versions, can be found [[http://code.google.com/p/simple-openni/|here]].
Changed lines 83-86 from:
You can find all of Greg’s Making Things See examples on his GitHub page:
https://github.com/atduskgreg/Making-Things-See-Examples
to:
You can find all of Greg’s Making Things See examples on his [[https://github.com/atduskgreg/Making-Things-See-Examples|GitHub page]].
Changed lines 88-91 from:
Making Things See by Greg Borenstein, published by O’Reilly\\
Meet the Kinect: An Introduction to Programming Natural User Interfaces by Sean Kean , Jonathan Hall , Phoenix Perry
to:
[["http://code.google.com/p/simple-openni/http://shop.oreilly.com/product/0636920020684.do|'''Making Things See''']] by Greg Borenstein, published by O’Reilly\\
[[http://www.apress.com/9781430238881|'''Meet the Kinect: An Introduction to Programming Natural User Interfaces''']] by Sean Kean , Jonathan Hall , Phoenix Perry
Changed lines 106-108 from:
How to get set up with the Kinect\\
NYKO Zoom specs\\
Kinect Abnormal Motion Assessment System\\
to:
[[http://lilyszajnberg.com/blog/?page_id=470|'''How to get set up with the Kinect''']\\
[[http://lilyszajnberg.com/blog/?page_id=470|'''NYKO Zoom specs''']]\\
[[http://motionassessment.com/|'''Kinect Abnormal Motion Assessment System''']]\\
Changed lines 14-20 from:
Microsoft Kinect Windows SDK (Windows compatible) | $249\\
Microsft Kinect for XBox (currently Mac, Windows, Linux compatible) | $149\\
Processing | Free - Open Source\\
Simple OpenNI for Processing | Free - Open Source\\
OpenNI NITE | Free\\
Microsoft Visual Studio 2010 | $799\\
Microsoft Visual Basic | Free (requires Visual Studio)
to:
[[http://www.microsoft.com/en-us/kinectforwindows/purchase/|'''Microsoft Kinect Windows SDK (Windows compatible)''']] | $249\\
[[http://www.bestbuy.com/site/Microsoft+-+Kinect+for+Xbox+360/1036858.p?id=1218212157998&skuId=1036858/|'''Microsoft Kinect for XBox (currently Mac, Windows, Linux compatible)''']] | $149\\
[[http://processing.org/download/|'''Processing''']] | Free - Open Source\\
[[http://code.google.com/p/simple-openni/|'''Simple OpenNI for Processing''']] | Free - Open Source\\
[[http://code.google.com/p/simple-openni/|'''OpenNI NITE''']] | Free\\
[[http://www.microsoftstore.com/store/msstore/en_US/pd/productID.216633300/parentCategoryID.50804600/categoryID.50804700/list.true|'''Microsoft Visual Studio 2010''']] | $799\\
[[http://www.microsoft.com/visualstudio/en-us/products/2010-editions/visual-basic-express|'''Microsoft Visual Basic''']] | Free (requires Visual Studio)\\
Changed line 90 from:
Making Things See by Greg Borenstein, published by O’Reilly
to:
Making Things See by Greg Borenstein, published by O’Reilly\\
Changed lines 108-110 from:
How to get set up with the Kinect
NYKO Zoom specs
Kinect Abnormal Motion Assessment System
to:
How to get set up with the Kinect\\
NYKO Zoom specs\\
Kinect Abnormal Motion Assessment System\\
Changed lines 14-19 from:
Microsoft Kinect Windows SDK (Windows compatible) | $249
Microsft Kinect for XBox (currently Mac, Windows, Linux compatible) | $149
Processing | Free - Open Source
Simple OpenNI for Processing | Free - Open Source
OpenNI NITE | Free
Microsoft Visual Studio 2010 | $799
to:
Microsoft Kinect Windows SDK (Windows compatible) | $249\\
Microsft Kinect for XBox (currently Mac, Windows, Linux compatible) | $149\\
Processing | Free - Open Source\\
Simple OpenNI for Processing | Free - Open Source\\
OpenNI NITE | Free\\
Microsoft Visual Studio 2010 | $799\\
Changed line 99 from:
Large reflective surfaces can interfere with the infrared sensor, particularly in regards to skeleton tracking
to:
*''Reflective surfaces'': Large reflective surfaces can interfere with the infrared sensor, particularly with skeleton tracking
Deleted line 14:
Deleted line 15:
Deleted line 16:
Deleted line 17:
Deleted line 18:
Deleted line 19:
Changed lines 36-43 from:
Communicates serially via USB

Standard frame rate: 30fps

Resolution: 640x480
to:
*Communicates serially via USB
*Standard frame rate: 30fps
*Resolution: 640x480
Changed lines 43-49 from:
Communicates serially via USB

Standard frame rate: 30fps

Resolution: 640x480
to:
*Communicates serially via USB
*Standard frame rate: 30fps
*Resolution: 640x480
Changed lines 51-62 from:
32-bit (x86) or 64-bit (x64) processor

Dual-core 2.66-GHz or faster processor

Dedicated USB 2.0 bus

2 GB RAM

A Microsoft Kinect for Windows sensor
to:
*32-bit (x86) or 64-bit (x64) processor
*Dual-core 2.66-GHz or faster processor
*Dedicated USB 2.0 bus
*2 GB RAM
*A Microsoft Kinect for Windows sensor
Changed lines 61-65 from:
Windows 7 and Windows 8 Developer Preview or Visual Basic

using Microsoft Visual Studio
to:
*Windows 7 and Windows 8 Developer Preview
*Visual Basic using Microsoft Visual Studio
Changed lines 65-70 from:
Processing

Windows, Mac, or Linux
to:
*Processing
*Windows, Mac, or Linux
Deleted line 90:
Changed line 98 from:
''Natural sunlight'': If you are having trouble calibrating, check the amount of sunlight in the room. There is infrared light in sunlight which throws off the ability of the Kinect to read the 3D space. Because it uses IR to read depth, it does not require indoor lighting to work
to:
*''Natural sunlight'': If you are having trouble calibrating, check the amount of sunlight in the room. Sunlight contains infrared light, which throws off the Kinect's ability to read the 3D space. Because it uses IR to read depth, the Kinect does not require indoor lighting to work
Changed lines 101-105 from:
''Calibration'': The Windows SDK version does not require calibration for skeleton tracking, but the XBox version does. This means to use skeleton tracking, you have to have the user stand in submissive pose to calibrate the Kinect before using it. This takes ~5-10 seconds

''Distance'': The Kinect can only sense where you are if you are in a certain area. If you are too close, or too far, it cannot establish a readable depth image. The Kinect for Windows has “Near Mode” which allows closer sensing capabilities. Using the NYKO Zoom also helps. This is a chart of distance limitations from Microsoft.
to:
*''Calibration'': The Windows SDK version does not require calibration for skeleton tracking, but the XBox version does. This means that to use skeleton tracking, you have to have the user stand in a submissive ("psi") pose to calibrate the Kinect before using it. This takes roughly 5-10 seconds (a sketch of this flow in code appears just after these notes)

*''Distance'': The Kinect can only sense where you are if you are in a certain area. If you are too close, or too far, it cannot establish a readable depth image. The Kinect for Windows has “Near Mode” which allows closer sensing capabilities. Using the NYKO Zoom also helps. This is a chart of distance limitations from Microsoft.
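A rough illustration of that calibration flow for the XBox version (a sketch only, using Simple OpenNI's 0.27-era callback names, which changed in later releases; the Windows SDK route does not need this): the sketch waits for a user, asks for the pose, calibrates, and then starts skeleton tracking.

[@
import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);   // ask OpenNI NITE for skeleton data
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);
}

// A new user walked into view: wait for the calibration pose
void onNewUser(int userId) {
  println("User " + userId + " detected, waiting for calibration pose...");
  kinect.startPoseDetection("Psi", userId);            // "Psi" = the arms-raised pose
}

// The user is holding the pose: start calibrating (this is the ~5-10 second wait)
void onStartPose(String pose, int userId) {
  kinect.stopPoseDetection(userId);
  kinect.requestCalibrationSkeleton(userId, true);
}

// Calibration finished: start skeleton tracking, or ask for the pose again
void onEndCalibration(int userId, boolean successful) {
  if (successful) {
    kinect.startTrackingSkeleton(userId);
  } else {
    kinect.startPoseDetection("Psi", userId);
  }
}
@]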
Deleted line 108:
Deleted line 109:
Added line 15:
Added line 17:
Added line 19:
Added line 21:
Added line 23:
Added line 25:
Changed lines 41-42 from:
''Microsoft Kinect for XBox''
to:
'''Microsoft Kinect for XBox'''
Added line 44:
Added line 46:
Changed lines 49-51 from:
''Microsoft Kinect for Windows SDK''
to:
'''Microsoft Kinect for Windows SDK'''
Added line 53:
Added line 55:
Changed lines 61-62 from:
Kinect for Windows SDK
to:
'''Kinect for Windows SDK'''
Added line 64:
Added line 66:
Added line 68:
Added line 70:
Changed lines 76-77 from:
''Kinect for Windows SDK''
to:
'''Kinect for Windows SDK'''
Added line 79:
Changed lines 82-83 from:
''Kinect for XBox''
to:
'''Kinect for XBox'''
Added line 85:
Added line 88:
Added line 95:
Added line 98:
Added line 111:
Added line 114:
Added line 118:
Added line 121:
Added line 123:
Added line 130:
Added line 132:
Changed lines 58-63 from:
Kinect for Windows SDK
Windows 7 and Windows 8 Developer Preview


!!!Setting up your Kinect and software
to:
''Kinect for Windows SDK''
Windows 7 and Windows 8 Developer Preview or Visual Basic
using Microsoft Visual Studio

''Kinect for XBox''
Processing
Windows, Mac, or Linux

!!!Setting up your Kinect with Simple OpenNI and Processing
Added line 70:
Changed lines 86-101 from:
Meet the Kinect: An Introduction to Programming Natural User Interfaces by Sean Kean , Jonathan Hall , Phoenix Perry
to:
Meet the Kinect: An Introduction to Programming Natural User Interfaces by Sean Kean , Jonathan Hall , Phoenix Perry

!!!Typical Behavior

'''Peculiarities'''
''Natural sunlight'': If you are having trouble calibrating, check the amount of sunlight in the room. There is infrared light in sunlight which throws off the ability of the Kinect to read the 3D space. Because it uses IR to read depth, it does not require indoor lighting to work
Large reflective surfaces can interfere with the infrared sensor, particularly in regards to skeleton tracking
''Calibration'': The Windows SDK version does not require calibration for skeleton tracking, but the XBox version does. This means to use skeleton tracking, you have to have the user stand in submissive pose to calibrate the Kinect before using it. This takes ~5-10 seconds
''Distance'': The Kinect can only sense where you are if you are in a certain area. If you are too close, or too far, it cannot establish a readable depth image. The Kinect for Windows has “Near Mode” which allows closer sensing capabilities. Using the NYKO Zoom also helps. This is a chart of distance limitations from Microsoft.


!!!Application Notes

How to get set up with the Kinect
NYKO Zoom specs
Kinect Abnormal Motion Assessment System
Changed line 35 from:
"'Microsoft Kinect for XBox"'
to:
''Microsoft Kinect for XBox''
Changed line 40 from:
'"Microsoft Kinect for Windows SDK"'
to:
''Microsoft Kinect for Windows SDK''
Changed line 35 from:
"Microsoft Kinect for XBox"
to:
"'Microsoft Kinect for XBox"'
Changed line 40 from:
"Microsoft Kinect for Windows SDK"
to:
'"Microsoft Kinect for Windows SDK"'
Changed line 35 from:
Microsoft Kinect for XBox
to:
"Microsoft Kinect for XBox"
Changed lines 39-40 from:
Microsoft Kinect for Windows SDK
to:

"
Microsoft Kinect for Windows SDK"
Changed lines 9-11 from:
to:
The Kinect uses proprietary middleware called OpenNI NITE, developed by PrimeSense, the company behind the Kinect's depth-sensing hardware. (PrimeSense's sensor technology is also inside a similar, recently released 3D camera, the ASUS Xtion Pro.) OpenNI NITE runs complex algorithms that extract skeleton data from the depth image so it can be used in software programs. The Kinect reads the 3D depth image, OpenNI NITE works out where each of the user's limbs is in space, and that data can then be used in a front-end platform like Processing to do any number of things that use your body as a controller.
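A rough sketch of that pipeline in Processing (assuming the Simple OpenNI library; function names are from the 0.27-era releases, and the XBox version also needs the calibration callbacks shown with the Calibration note above): the Kinect supplies the depth image, OpenNI NITE supplies the skeleton, and one joint, the right hand, drives an on-screen marker.

[@
import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);    // skeleton data from OpenNI NITE
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);

  int userId = 1;                                      // first tracked user
  if (kinect.isTrackingSkeleton(userId)) {
    // 3D position of the right hand, in millimeters, as reported by NITE
    PVector hand = new PVector();
    kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, hand);

    // project the 3D point onto the 2D depth image and draw a marker there
    PVector onScreen = new PVector();
    kinect.convertRealWorldToProjective(hand, onScreen);
    fill(255, 0, 0);
    ellipse(onScreen.x, onScreen.y, 25, 25);           // your hand is the "cursor"
  }
}
@]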
Changed lines 15-18 from:
Microsft Kinect for XBox (currently Mac, Windows, Linux compatible)
Simple OpenNI for Processing
to:
Microsft Kinect for XBox (currently Mac, Windows, Linux compatible) | $149
Processing | Free - Open Source
Simple OpenNI for Processing | Free - Open Source
OpenNI NITE | Free
Microsoft Visual Studio 2010 | $799
Microsoft Visual Basic | Free (requires Visual Studio)
Changed line 30 from:
Describe some typical applications of this sensor. You can often get this from the datasheet, but a few examples from companies or individuals who've used it would be useful as well.
to:
[[http://motionassessment.com/|'''Kinect Abnormal Motion Assessment System''']]
Changed line 5 from:
to:
Attach:kinect_view_hands.jpg
Changed line 1 from:
"Attach:kinect_specs.jpg"
to:
Attach:kinect_specs.jpg
Changed line 1 from:
"(Attach:)kinect_specs.jpg"
to:
"Attach:kinect_specs.jpg"
Changed line 1 from:
[[(Attach:)kinect_specs.jpg]]
to:
"(Attach:)kinect_specs.jpg"
Changed line 1 from:
"Attach: kinect_specs.jpg"
to:
[[(Attach:)kinect_specs.jpg]]
Changed line 1 from:
Attach: kinect_specs.jpg
to:
"Attach: kinect_specs.jpg"
Added lines 1-2:
Attach: kinect_specs.jpg
Changed lines 8-9 from:
Sources
to:
!!!Sources
Changed lines 15-16 from:
Accessories
to:
!!!Accessories
Changed lines 20-21 from:
Applications
to:
!!!Applications
Changed lines 25-26 from:
Technical Characteristics
to:
!!!Technical Characteristics
Changed lines 37-38 from:
Hardware Requirements
to:
!!!Hardware Requirements
Changed lines 47-48 from:
Software Requirements
to:
!!!Software Requirements
Changed lines 53-54 from:
Setting up your Kinect and software
to:
!!!Setting up your Kinect and software
Changed lines 63-64 from:
Code Samples
to:
!!!Code Samples
Changed line 69 from:
Books
to:
!!!Books
Added lines 1-72:
The Kinect is a depth (3D) camera. Regular cameras collect the light that bounces off the objects around them. In addition to having a regular camera, the Kinect also has an infrared (IR) projector and camera. The IR projector projects IR light so that the IR camera can then "see" how far objects are from it (see below).



This is how it "knows" where your hand is, and it is what gives you the capability to use your body as a controller. Obviously, 3D scanning is not new; governments, NASA, architects, and other high-end institutions that could afford the technology have been using it for years. What is ground-breaking about the Kinect is that it lets you do 3D sensing for $150. That is crazy cheap.


Sources

Microsoft Kinect Windows SDK (Windows compatible) | $249
Microsft Kinect for XBox (currently Mac, Windows, Linux compatible)
Simple OpenNI for Processing


Accessories

NYKO Zoom for Kinect | $16


Applications

Describe some typical applications of this sensor. You can often get this from the datasheet, but a few examples from companies or individuals who've used it would be useful as well.


Technical Characteristics

Microsoft Kinect for XBox
Communicates serially via USB
Standard frame rate: 30fps
Resolution: 640x480
Microsoft Kinect for Windows SDK
Communicates serially via USB
Standard frame rate: 30fps
Resolution: 640x480


Hardware Requirements

Kinect for Windows SDK
32-bit (x86) or 64-bit (x64) processor
Dual-core 2.66-GHz or faster processor
Dedicated USB 2.0 bus
2 GB RAM
A Microsoft Kinect for Windows sensor


Software Requirements

Kinect for Windows SDK
Windows 7 and Windows 8 Developer Preview


Setting up your Kinect and software

The simple OpenNI for Processing project home, which includes all support files and download versions, can be found here:
http://code.google.com/p/simple-openni/
1. Install OpenNI NITE. This is what allows us to access all of the 3D data from the Kinect for use in Processing. It is the software written by PrimeSense that allows us to communicate with the Kinect, and it is the only proprietary software used with the Kinect. To install, download the file, then launch the Terminal (Applications –> Utilities –> Terminal). Change the directory by typing "cd" and dragging the unzipped folder onto the Terminal, which fills in the correct file path. Hit return, then type sudo ./install.sh to run the installer.
2. Install Simple OpenNI. This is the library that passes the data from OpenNI to Processing. To install it, download and unzip the file linked above, then drag the unzipped folder into your Processing libraries folder. Once it is there, restart Processing.
**Note: In the Kinect's early days, it was necessary to use OSC to pass skeleton data between the Kinect and Processing on OS X. The skeleton data, or "skeletonization", is what tells you where your skeleton (joints/limbs) is in 3D space, and it is what allows you to designate various parts of your body to be a "controller". Fortunately, PrimeSense released its software (OpenNI NITE), which serves as middleware to perform the skeletonization. This makes things much easier.
3. Launch Processing, plug in your Kinect and get started coding!
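A first sketch to confirm the installation worked (a sketch only, assuming the Simple OpenNI library installed in step 2; function names follow the 0.27-era releases). It shows the RGB camera and the depth camera side by side.

[@
import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(1280, 480);                       // room for two 640 x 480 images
  kinect = new SimpleOpenNI(this);
  kinect.enableRGB();                    // the regular color camera
  kinect.enableDepth();                  // the IR depth camera
}

void draw() {
  kinect.update();
  image(kinect.rgbImage(), 0, 0);        // left: color image
  image(kinect.depthImage(), 640, 0);    // right: grayscale depth image
}
@]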


Code Samples

You can find all of Greg’s Making Things See examples on his GitHub page:
https://github.com/atduskgreg/Making-Things-See-Examples


Books

Making Things See by Greg Borenstein, published by O’Reilly
Meet the Kinect: An Introduction to Programming Natural User Interfaces by Sean Kean , Jonathan Hall , Phoenix Perry