Touch-Free Gesture Instrument: using Computer Vision Presence Detection to play Music I composed myself.
The Max patch opens the webcam feed at a low resolution. The jit.change object filters out unchanged frames, and jit.rgb2luma converts the ARGB color matrix into a single-plane grayscale (luminance) matrix. The jit.scissors object then divides each frame into six equal regions. For each region, a jit.3m object reports the minimum, mean, and maximum cell values, and a < 100 object compares the mean against a threshold: when a region's mean brightness drops below 100 (for example, when a hand covers it), it sends a bang that triggers a sound. I mapped several sounds to individual regions so the gesture can sculpt the music more intuitively.

To make the sounds visually expressive, I added a generative layer that emits particles each time a region is triggered, and layered the particles with further visuals built from FFT analysis and glowing violet hues. When you play the touch-less instrument, the visual feedback responds directly to the gesture in front of the webcam.
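The per-region trigger logic above can be sketched outside of Max. The following is a minimal NumPy illustration, not the patch itself: it splits an 8-bit grayscale frame into a grid of six regions and reports which regions have a mean brightness below 100, analogous to jit.scissors feeding jit.3m and the < 100 comparison. The 2×3 grid layout, region ordering, and function name are my assumptions for the example.

```python
import numpy as np

def detect_presence(gray_frame, threshold=100, rows=2, cols=3):
    """Split an 8-bit grayscale frame into rows x cols regions and
    return the indices of regions whose mean brightness falls below
    the threshold (a dark region suggests a hand covering it).
    Mirrors the jit.scissors -> jit.3m -> "< 100" chain in the patch;
    the 2x3 layout and index order are illustrative assumptions."""
    h, w = gray_frame.shape
    triggered = []
    for r in range(rows):
        for c in range(cols):
            region = gray_frame[r * h // rows:(r + 1) * h // rows,
                                c * w // cols:(c + 1) * w // cols]
            # jit.3m reports min/mean/max; only the mean is compared here
            if region.mean() < threshold:
                triggered.append(r * cols + c)  # this index would bang a sound
    return triggered

# Synthetic test frame: bright everywhere except the top-left region.
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[0:60, 0:53] = 30  # "hand" darkens region 0
print(detect_presence(frame))  # → [0]
```

In the patch each bang also drives the particle emitter, so the same region index would fan out to both the audio and the visual layer.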