airchestra

Aaron Montoya-Moraga, Philip J Donaldson

We have created a way to allow users to manipulate music with simple gestures

http://pancake.wtf/2015/12/02/icm-final-update-12115/ // http://www.aaronmontoyamoraga.com/?p=636

Description

Music has been a large motivating factor for both Aaron’s and my work thus far at ITP. For this project we sought to explore how people might interact with sound in a gestural environment, and hopefully to make music a little more fun and approachable for those who haven’t played musical instruments. Our project gives users two new instruments to experiment with: the Kinect and Leap Motion sensors. By simply gesturing over the instruments, users will begin to uncover all the music they can make with just a twist, flick or roll of the wrist.

At its core, the project works by sending raw distance data from the sensors into our programming environment, where we parse it and send it to Max for synthesis. The Kinect and Leap Motion are powerful pieces of hardware, which makes them ideal choices for supporting sound manipulation. Each device can read so much about your body or hands that it was great to experiment with gestures and create instruments that make sense to the user. This work has opened the door to experimentation with interfaces not traditionally created for musical expression. We are looking forward to continuing this work to create unique instruments that sound great and are fun to play. Enjoy!
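The write-up doesn’t say exactly how the parsed sensor data travels into Max, so the sketch below shows just one plausible wiring: OSC messages over UDP into a [udpreceive] object in the Max patch, sent with the python-osc package. The address names, port, and scaling ranges are illustrative assumptions, not the project’s actual values.

```python
# Minimal sketch: forward one parsed hand reading to Max over OSC.
# Assumes Max is listening with [udpreceive 8000] and that python-osc
# is installed (pip install python-osc). Addresses and ranges are made up.
from pythonosc.udp_client import SimpleUDPClient

MAX_HOST = "127.0.0.1"   # Max running on the same machine
MAX_PORT = 8000          # must match the [udpreceive] port in the patch

client = SimpleUDPClient(MAX_HOST, MAX_PORT)

def send_hand_reading(distance_mm, roll_deg):
    """Scale raw sensor values to 0..1 and send them to Max for synthesis."""
    distance = max(0.0, min(1.0, distance_mm / 600.0))    # assume ~60 cm playable range
    roll = max(0.0, min(1.0, (roll_deg + 90.0) / 180.0))  # map -90..90 degrees to 0..1
    client.send_message("/airchestra/distance", distance)
    client.send_message("/airchestra/roll", roll)

# Example: one reading pulled from the sensor-parsing loop
send_hand_reading(distance_mm=240.0, roll_deg=15.0)
```

On the Max side, a [udpreceive 8000] feeding [route /airchestra/distance /airchestra/roll] would split the two streams before they reach the synthesis patch.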

Classes

Introduction to Computational Media

Rube Telephone

Chino Kim, Aaron Montoya-Moraga

A Rube Goldberg telephone that tangibly processes and translates conversation between two people, creating a fun audiovisual experience using various old and new communications technologies and producing unexpected outcomes.

Description

Using Morse code, language translation and optical character recognition, our contraption processes and manipulates conversation between two hacked rotary phones. Each step in the chain reaction is a tangible experience and occasionally distorts the message as it passes it along.

A one-way transmission may look like this:

Someone speaks into phone 1 > their sentence gets tapped out on a Morse key by a solenoid > the resulting Morse code gets translated back to English and then to Spanish > the Spanish string is spoken by the machine and then gets translated back to English > the resulting sentence gets printed out by a thermal printer > a camera takes a photo of the printed text and the text gets pulled out of the image using OCR (both the camera view and the OCR output are shown on a display) > the resulting sentence is then spoken through the earpiece of phone 2.
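The write-up doesn’t name the OCR tooling, so as one possible sketch of that station, the snippet below assumes pytesseract (a Python wrapper around Tesseract) plus Pillow to load and crop the camera photo; the crop box and file name are placeholders.

```python
# Minimal sketch of the OCR station: load the camera photo of the thermal
# printout, crop to the paper, and pull the printed text back out.
from PIL import Image
import pytesseract

def read_printout(photo_path, crop_box=None):
    """Return the text recognized in a photo of the printed message."""
    image = Image.open(photo_path)
    if crop_box:                   # (left, top, right, bottom) in pixels
        image = image.crop(crop_box)
    image = image.convert("L")     # grayscale tends to help on thermal paper
    return pytesseract.image_to_string(image).strip()

# Example: crop to roughly where the receipt sits in the camera frame
print(read_printout("printout.jpg", crop_box=(200, 100, 1000, 700)))
```

Cropping before recognition is also the kind of fix listed under “Improve OCR” in the to-do list below.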

To do:

– Reverse the signal flow when going from phone 2 to phone 1.

– Use audio thresholds to control the direction of transmission and properly route audio signals.

– Play hold music in phone earpieces while message is being processed.

– Improve OCR (crop photo, error handling).

– Implement autocorrect so that words mangled by Morse or OCR get converted to actual words before they’re passed off to the next step (Google search suggestions API).

– Code ringer behavior – if one phone is picked up, the other rings (we are already able to make the phones ring; we just need to program the Arduino to control them).

– Build/buy stands and mounts for each “station” (two phones, Morse, translation, OCR).

– Write an Arduino program that will control a light bulb progress bar (we already have the relays and wiring in place for this).

– Write a Morse code program that will convert the signals tapped out on the Morse key back to English (this isn’t a priority since we can fake it and use the English string from the previous step); a minimal decoding sketch follows this list.
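As a starting point for the Morse decoding item above, here is a minimal sketch that maps dot/dash groups back to English. It assumes the key taps have already been classified into dots, dashes, letter gaps, and word gaps (for example by the Arduino timing the key presses), which is the harder part in practice.

```python
# Minimal sketch: decode a Morse string back to English. Letters are
# separated by single spaces and words by " / "; the input format and the
# upstream timing classification are assumptions, not project code.
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E", "..-.": "F",
    "--.": "G", "....": "H", "..": "I", ".---": "J", "-.-": "K", ".-..": "L",
    "--": "M", "-.": "N", "---": "O", ".--.": "P", "--.-": "Q", ".-.": "R",
    "...": "S", "-": "T", "..-": "U", "...-": "V", ".--": "W", "-..-": "X",
    "-.--": "Y", "--..": "Z", "-----": "0", ".----": "1", "..---": "2",
    "...--": "3", "....-": "4", ".....": "5", "-....": "6", "--...": "7",
    "---..": "8", "----.": "9",
}

def decode_morse(message):
    """Decode a Morse string into English, marking unknown groups with '?'."""
    words = []
    for word in message.strip().split(" / "):
        letters = [MORSE_TO_CHAR.get(symbol, "?") for symbol in word.split()]
        words.append("".join(letters))
    return " ".join(words)

print(decode_morse(".... . .-.. .-.. --- / .-- --- .-. .-.. -.."))  # HELLO WORLD
```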

Classes

Introduction to Computational Media, Introduction to Physical Computing