Project Description: A podium to be used as a musical instrument. I created a surface onto which users could place various found objects. The identity, position and orientation of the objects, as sensed by a video camera, changed the parameters of a physical model in the VL1 MIDI synthesizer. The notes played by the synth were determined by another musical instrument. I worked on this with Danny Rozin, Geoff Smith and Dominick in Interval Research's Expression Group.
Technical Notes: I used Macromedia Director/Lingo with Danny Rozin’s Video XTRA to track the objects. I used HyperMidi to communicate with the MIDI synthesizer.
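The object-to-synth mapping can be sketched in modern Python as a hypothetical stand-in for the original Lingo/HyperMidi code; the controller numbers and normalization are invented for illustration:

```python
# Hypothetical sketch only: the original ran in Lingo with the HyperMidi
# XTRA. Each tracked object's state is turned into MIDI control-change
# messages; CC numbers here are made up.

def object_to_midi_ccs(obj_id, x, y, angle, channel=0):
    """Map one tracked object's state to MIDI control-change triples.

    x and y are normalized positions in [0, 1]; angle is in degrees
    [0, 360). Returns (status, controller, value) triples for the synth.
    """
    base_cc = 20 + obj_id * 3           # give each object its own CC block
    status = 0xB0 | (channel & 0x0F)    # control change on this channel
    return [
        (status, base_cc,     int(x * 127)),            # x position -> CC
        (status, base_cc + 1, int(y * 127)),            # y position -> CC
        (status, base_cc + 2, int(angle / 360 * 127)),  # orientation -> CC
    ]
```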
Project Description: Participated in a team “Pressure Project” to create networked nodes for musical sensors. The idea was to build black boxes where many different types of sensors can be easily plugged and played without much technical knowledge. My task was to develop the software to configure the nodes.
I worked with Geoff Smith, Bob Adams, Chris Sung and Perry Cook.
Technical Notes: The boxes were built using PIC microcontrollers with a proprietary network protocol. The configuring software was written in Macromedia Director/Lingo.
Project Description: I adapted the web musical instrument from the Interval Expressions group into a kiosk for the Experience Music Project museum (which was then under construction) in Seattle. It was displayed at the Tacoma Art Museum and the Los Angeles County Museum of Art.
I worked with Amee Evans, Geoff Smith and Tom Bellman under the direction of Joy Mountford at Interval Research.
Technical Notes: I used Macromedia Director/Lingo and Hypermidi for connecting to hardware synths and the QuickTime Musical Instruments.
Project Description: I was a team member in the development of a complex MIDI musical instrument in the form of a simple broomstick. I developed the sensors and software of the first prototype. Artist Laurie Anderson later used the device, in its fourth iteration, in her Moby Dick performance.
Bob Adams led a great team including Geoff Smith, Michael Brook and John Eichenseer in Interval Research's Expression Group.
Technical Notes: I built a MAX patch and a connection to the National Instruments MacAdios board that was connected to various pressure sensors on the stick.
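The sensing logic can be sketched as follows; this is a hypothetical Python stand-in for the original MAX patch, and the threshold, note number and event format are all invented:

```python
# Hypothetical sketch: the original was a MAX patch reading pressure
# sensors through the MacAdios board. A note-on fires when pressure
# rises through a threshold; a zero-velocity note-on releases it.

NOTE_ON = 0x90  # MIDI note-on status byte, channel 1

def pressure_to_events(samples, note=60, threshold=0.2):
    """Turn normalized pressure readings in [0, 1] into MIDI events.

    Returns (status, note, velocity) triples; velocity comes from the
    reading at the moment the threshold is crossed.
    """
    events, pressed = [], False
    for p in samples:
        if not pressed and p >= threshold:
            events.append((NOTE_ON, note, max(1, int(p * 127))))
            pressed = True
        elif pressed and p < threshold:
            events.append((NOTE_ON, note, 0))  # note-off via zero velocity
            pressed = False
    return events
```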
Project Description: I headed a project team looking at building musical instruments within the constraints of desktop PCs. The team developed numerous iterations of many instruments.
I used a kaleidoscope application written by Kate Swann, which took input from the computer's microphone and gave the desktop different abstract patterns depending on the type of noise in the environment. I then wrote software for a 3-D dancing fork, which generated its own appropriate accompaniment. The most successful piece was called the Web (this was before the WWW was very popular). A beam traced radially around the web, like an air traffic controller's screen. Users could add eggs into the web, which would create a sound as the beam ran across them. Users had controls for changing all aspects of the music, including pitch, volume, duration, tempo and instrument. Users could also save out their compositions. There was also a special interface for more accomplished musicians to create sound palettes for the Web.
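The Web's trigger logic (a beam sweeping the circle and sounding any egg it passes over) can be sketched as below; this is a hypothetical Python reconstruction, not the original Lingo, and the angle-based representation is an assumption:

```python
# Hypothetical sketch of the Web's radial sequencer: each egg sits at a
# fixed angle on the web, and the beam sweeps clockwise. Any egg whose
# angle falls inside the arc swept this frame is triggered.

def sweep_triggers(egg_angles, beam_from, beam_to):
    """Return the egg angles the beam passed over moving beam_from -> beam_to.

    Angles are in degrees; the sweep may wrap past 360 back to 0, which
    the modular arithmetic below handles.
    """
    span = (beam_to - beam_from) % 360
    return [a for a in egg_angles if (a - beam_from) % 360 <= span]
```

Each triggered egg would then play a note whose pitch, volume, duration and instrument came from the user's current control settings.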
I collaborated with Joy Mountford and Geoff Smith, Andrew Hirniak, James Tobias, Amee Evans and Tom Bellman at the Expressions Group, Interval Research.
Technical Notes: Most of these instruments were created using Macromedia Director/Lingo and HyperMidi.
Project Description: I adapted the Web Musical Instrument from the Interval Expressions group for a live performance. A control box with a physical interface allowed a performer to make rapid, simultaneous changes that were not possible with a single mouse. I also slaved the system to a central sound server so it could play with other instruments from the Expressions group.
I worked with Geoff Smith, Bob Adams, Joy Mountford, Michael Brook and John Eichenseer at the Royal College of Art for Interval Research.
Technical Notes: I used a Basic Stamp for the physical interface, Macromedia Director/Lingo for the interface and connected to a Sample Cell card for the sound generation.
Project Description: This was a screen-based musical instrument. It used pre-rendered 3D graphics to depict a human hand twisting and turning in space and striking notes in that space. When the hand intersected one of the growing bubbles, the instrument and pitch sets of the notes changed.
Technical Notes: This was created in Macromedia Director/Lingo using a HyperMidi XCMD. The graphics were pre-rendered in Swivel 3D. The sound was created on a Roland U220 synth.
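The hand/bubble intersection could be sketched like this; it is a hypothetical Python stand-in for the original Lingo, treating the hand's striking point as a point and each bubble as a circle growing linearly from zero:

```python
# Hypothetical sketch only: names, the linear growth model and the
# point-vs-circle test are invented for illustration.

import math

def hit_bubble(hand_x, hand_y, bubbles, t):
    """Return the index of the first bubble containing the hand point at
    time t, or None. Each bubble is (center_x, center_y, growth_rate)."""
    for i, (cx, cy, rate) in enumerate(bubbles):
        radius = rate * t  # bubble radius grows linearly over time
        if math.hypot(hand_x - cx, hand_y - cy) <= radius:
            return i  # this hit would switch the instrument and pitch set
    return None
```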