Interactive Music – ITP Spring Show 2017 /shows/spring2017/ Wed, 24 May 2017 21:33:20 +0000

Animalia Concerto /shows/spring2017/animalia-concerto/ Tue, 16 May 2017 19:39:52 +0000 https://itp.nyu.edu/shows/spring2017/animalia-concerto/

Rushali Paratey

A gesture based performance piece by humans wearing animal masks generating music with computers.

https://rushali.space/#/animalia-concerto/

Description

The project uses the gest.js library to detect up, down, left, and right hand gestures, which trigger sound files to start, stop, or add an effect. In this version, three different computers played percussion, a string instrument named sarang, and a flute, but ideally more instruments, and therefore more musicians, could be added. The musicians are directed by a conductor.
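The gesture-to-action routing described above can be sketched as follows. This is an illustrative Python stand-in, not the project's actual gest.js handlers; the class name and the specific direction-to-action assignments are assumptions:

```python
class GesturePlayer:
    """Maps detected hand-gesture directions to commands for one instrument."""

    def __init__(self, instrument):
        self.instrument = instrument
        self.playing = False
        self.effect_on = False

    def handle_direction(self, direction):
        # One possible mapping: up starts, down stops, left/right toggle an effect.
        if direction == "up":
            self.playing = True
        elif direction == "down":
            self.playing = False
        elif direction in ("left", "right"):
            self.effect_on = not self.effect_on
        return (self.instrument, self.playing, self.effect_on)
```

In the actual piece, each of the three computers would run its own such mapping for its instrument, with the conductor cueing the gestures.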

Classes

Interactive Music

Browser LAB /shows/spring2017/browser-lab/ Tue, 16 May 2017 19:39:18 +0000 https://itp.nyu.edu/shows/spring2017/browser-lab/

Eric Wenqi Li

Browser LAB is a playground for experiencing novel capacities of the browser by playing with the virtual physical attributes of browser windows, such as size, position, color, layout, and sound.

https://www.wenqili.com/browser-lab

Description

We like web pages and web applications. Modern web browsing happens within a browser window or tab, which is a user-centered and practical service design.
Web pages are like paintings: the browser window is the canvas and the pigment. We focus on the content made out of canvas and pigment, but we rarely notice that the canvas and pigments themselves also have great stories to tell.
Browser LAB is an experimental project that treats browser windows as the content of expression, the material of design, and the storyteller of emotion, not only the canvas. Instead of interacting with HTML elements inside a single window, users connect directly with the browser windows themselves. It is a playground for learning more about the browser window by playing with its virtual physical attributes, such as size, position, color, layout, and sound.
Three scenes were built separately to show this novel experience within the Chrome browser.
The first scene is a color-grid layout for playing with multi-window interfaces, experimenting with the color and sound of windows and how they might be organized as a group.
The second scene is a touch interaction with a single window, experimenting with the emotional and physical connections between the browser and human beings.
The third scene is a browser-window overlay sculpture, experimenting with the storytelling capacity of browser windows. More scenes and experiments are being added.
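A multi-window color grid like the first scene needs each window's screen rectangle computed from the grid dimensions; in the browser those rectangles would feed calls such as `window.open`, `moveTo`, and `resizeTo`. A minimal sketch of that layout step, with the helper name being my own choice rather than the project's:

```python
def grid_layout(screen_w, screen_h, rows, cols):
    """Return an (x, y, width, height) rectangle for each window in a
    rows x cols grid covering the screen, scanning left to right, top to bottom."""
    w, h = screen_w // cols, screen_h // rows
    return [(c * w, r * h, w, h) for r in range(rows) for c in range(cols)]
```

For a 1200x800 screen split into a 2x3 grid, this yields six 400x400 window rectangles, one per grid cell.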

Classes

Hacking the Browser, Interactive Music

Somnolent Listener /shows/spring2017/somnolent-listener/ Tue, 16 May 2017 19:39:12 +0000 https://itp.nyu.edu/shows/spring2017/somnolent-listener/

Utsav Chadha

Making lectures interactive.

https://www.utsav-chadha.com/somnolent-listener

Description

Somnolent Listener is software that makes lectures interactive using speech recognition and synthesis.

The listener interprets speech, produces subtitles, captures keywords, records a transcript, creates visualizations, and also makes music.
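The keyword-capture step can be approximated with a simple frequency count over the recognized transcript. This is a hedged sketch of one plausible approach, not the project's actual method; the function name and stopword list are illustrative:

```python
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "is", "to", "in", "that", "it"}

def capture_keywords(transcript, top_n=3):
    """Naive keyword capture: the most frequent non-stopword terms."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]
```

A real system would likely weight terms against a background corpus (e.g. TF-IDF) rather than using raw counts.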

Classes

Interactive Music, Nature of Code (Animation and Physics)

Multisensory Music /shows/spring2017/multisensory-music/ Tue, 16 May 2017 19:39:04 +0000 https://itp.nyu.edu/shows/spring2017/multisensory-music/

Michael Simpson

A tool for creating audio/visual experiences that provides the composer with real-time multi-sensory feedback (audio/visual/haptic) tied to their physical interaction.

http://mgs.nyc/syntk/

Description

This project offers the viewer a series of different musical interfaces (software) and allows them to interact with both the audio and the visual elements, which are projected onto a screen. The idea is to provide the audience with a multi-sensory AV experience by tightly weaving together sound, graphics, and physical gestures (via mouse, Wacom tablet, drum pads, or faders).
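Tightly coupling the feedback channels means one physical control drives several parameters at once. A minimal sketch of that idea, assuming a normalized fader value and entirely illustrative parameter ranges (not the project's actual mappings):

```python
def map_control(value, lo=0.0, hi=1.0):
    """Map one normalized control value (e.g. a fader) to coupled feedback channels."""
    v = min(max(value, lo), hi)  # clamp to the valid range
    return {
        "audio_freq_hz": 110.0 + v * (880.0 - 110.0),  # pitch rises with the fader
        "visual_hue_deg": v * 360.0,                   # hue sweeps the color wheel
        "haptic_level": v,                             # vibration scales linearly
    }
```

Because all three channels derive from the same value, sound, image, and touch stay perceptually locked together as the performer moves the control.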

Classes

Data Art, Interactive Music, Nature of Code (Animation and Physics), Nature of Code (Intelligence and Learning), Shader Studio

Sound Objects /shows/spring2017/sound-objects/ Tue, 16 May 2017 19:38:25 +0000 https://itp.nyu.edu/shows/spring2017/sound-objects/

Or Fleisher, Scott Reitherman

A sonic VR experience allowing players to orchestrate a musical soundscape through interactions with dynamic objects and playful physics.

http://orfleisher.com/portfolio-item/sound-objects/

Description

My partner Or Fleisher and I have populated a desert landscape with variously shaped bounceable objects, where each object is tied to an array of ambient music sounds. Related clusters of sounds spanning the EQ spectrum are grouped and tied to objects, and a sound is triggered each time its corresponding object bounces on the ground. Bouncing an object harder makes it travel higher, effectively stretching the cycle length of each array's loop. And as each object pulls randomly from an array of note values, unpredictable melodies and harmonious combinations of sounds unfold as the user travels around the desert, tending to their dynamic soundscape in motion.
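The loop-stretching behavior follows from simple projectile physics: an object launched upward at speed v stays airborne for 2v/g, so a harder bounce lengthens the interval between ground hits and thus the loop cycle. A sketch under those assumptions (function names and the reference speed are mine, not the project's):

```python
G = 9.81  # gravitational acceleration, m/s^2

def bounce_airtime(launch_speed):
    """Time (s) an object spends airborne after bouncing up at launch_speed (m/s)."""
    return 2.0 * launch_speed / G

def stretched_loop_length(base_loop_s, launch_speed, min_speed=1.0):
    """Scale the loop cycle by how much longer this bounce hangs in the air
    than a gentle reference bounce; never shrink below the base length."""
    ratio = bounce_airtime(launch_speed) / bounce_airtime(min_speed)
    return base_loop_s * max(ratio, 1.0)
```

Doubling the launch speed doubles the airtime, so a 4-second loop stretches to 8 seconds.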

Classes

Project Development Studio, Interactive Music

Touch Free Music /shows/spring2017/touch-free-music/ Tue, 16 May 2017 19:37:03 +0000 https://itp.nyu.edu/shows/spring2017/touch-free-music/

Brandon Kader

A touch-free gesture instrument that uses computer-vision presence detection to play music I composed.

https://www.brandonkader.com/interactive-music/

Description

The Max patch opens the webcam feed at a low resolution. The jit.change object filters out repeating frames, and jit.rgb2luma converts the ARGB color image to grayscale luminance. I used the jit.scissors object to divide the frame into six equal regions. The jit.3m object reports the minimum, mean, and maximum values for each region, and a < 100 object compares the mean: if it falls below 100, a bang is sent to trigger a sound. I mapped several sounds to certain regions to give the gesture more intuitive control in sculpting the music. To make the sounds visually expressive, I integrated a generative element that emits particles each time a region of the frame is triggered. I layered the particles with additional visuals, composed of FFTs and glowing violet hues, that further enhance the interaction of gesture and sound. When you play the touch-less instrument, the visual feedback responds directly to the gesture in front of the webcam.
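The region-splitting and thresholding logic can be expressed outside Max. This Python sketch is a pure-Python stand-in for the jit.scissors/jit.3m/< 100 chain, operating on a grayscale frame given as a list of pixel rows; the function names and the 3x2 region arrangement are my assumptions:

```python
def region_means(frame, cols=3, rows=2):
    """Split a grayscale frame (rows of 0-255 ints) into rows x cols regions and
    return each region's mean luminance, scanning left to right, top to bottom."""
    h, w = len(frame), len(frame[0])
    rh, rw = h // rows, w // cols
    means = []
    for r in range(rows):
        for c in range(cols):
            vals = [frame[y][x]
                    for y in range(r * rh, (r + 1) * rh)
                    for x in range(c * rw, (c + 1) * rw)]
            means.append(sum(vals) / len(vals))
    return means

def triggers(frame, threshold=100):
    """A region fires (like the patch's bang) when its mean drops below threshold,
    i.e. when a hand darkens that part of the frame."""
    return [m < threshold for m in region_means(frame)]
```

Each `True` in the result would start the sound (and particle burst) mapped to that region.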

Classes

Interactive Music
