Performance systems that enable performers to use previously recorded sound as raw sonic material are another popular theme of NIME. Some homegrown examples from previous NIME classes at ITP include Mark Argo’s Slidepipe, Taku Lippit’s Lupa, and Michael Schneider’s AM Synthesis.
At times, a computer-controlled machine becomes the object of the performance. Human performers necessarily intervene through control or programming of the bots, but at the moment of performance, all eyes are on the mechanics of the artificial musician.
Eric Singer directs LEMUR (League of Electronic Musical Urban Robots) in Pittsburgh, which has produced many performative bots, including the Tibet Bot, the Mod Bot, the Forest Bot, and the Guitar Bot (pictured). The Yellow Drum Machine is another example of a different sort: it is literally a little robot that seeks out things to drum on.
Haptic technology interfaces with the user via the sense of touch, applying forces, vibrations, and/or motions to the user. Haptics in an instrument gives the performer physical feedback about what is happening in the performance. Haptic musical interfaces are an area of increasing research interest, with labs like ACROE-ICA doing in-depth research on haptic musical instruments.
It is now possible to create haptic instruments without a big budget or a research lab. Some examples of projects built by people without access to high-tech research labs include Collin Oldham's Cellomobo, Edgar Berdahl's Haptic Drum, and Hans-Christoph Steiner's StickMusic.
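As a rough illustration at the hobbyist scale, here is a minimal Arduino-style sketch of the basic haptic loop. It is not taken from any of the projects above, and the pin numbers and scaling are assumptions: a force-sensing resistor under the performer's finger drives a small vibration motor, so the harder the press, the stronger the vibration felt back, while the same pressure reading is sent to a laptop to make sound.

// Minimal haptic-feedback sketch (assumed wiring: FSR on A0, vibration motor on PWM pin 9).
const int SENSOR_PIN = A0;   // force-sensing resistor under the performer's finger
const int MOTOR_PIN  = 9;    // small vibration motor driven through a transistor

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  Serial.begin(9600);        // also report pressure so sound software on a laptop can use it
}

void loop() {
  int pressure = analogRead(SENSOR_PIN);          // 0-1023
  int strength = map(pressure, 0, 1023, 0, 255);  // scale to PWM range
  analogWrite(MOTOR_PIN, strength);               // vibrate harder as the press gets harder
  Serial.println(pressure);                       // sound-generating software reads this value
  delay(10);
}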
There are many examples of musical applications of ‘reactive’ table systems that are intended to be played by multiple visitors and players. The reacTable by Sergi Jordà, Martin Kaltenbrunner, Günter Geiger, and Marcos Alonso is one such interface, where players move tracked objects on a shape-sensitive table surface. These objects have additive, subtractive, and manipulative effects on the sound produced, based on the color and shape of the physical object.
The Tooka requires two players - it cannot be played alone. The two players work together, developing a special coordination and intimacy. This instrument was developed by Sidney Fels and his colleagues at the University of British Columbia.
“We describe a new music controller, designed to be played by two players. As the intimacy between two people increases so does their ability to anticipate and predict the other’s actions. We hypothesize that this intimacy between two people can be used as a basis for new controllers for musical expression. Looking at ways people communicate non-verbally, we are developing three new instruments based on different communication channels. The Tooka is a hollow tube with a pressure sensor and buttons for each player. Players place opposite ends in their mouths and modulate the pressure in the tube with their tongues and lungs, controlling sound. Coordinated button presses control the music as well.”
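The general sensor-to-sound pattern described here can be sketched in a few lines of Arduino-style code. The example below is not the actual Tooka implementation; pin assignments, the MIDI channel, and the scaling are all assumptions. A pressure sensor bends the pitch of a note continuously, while a button starts and stops the note.

// Sketch of a Tooka-like mapping (assumed wiring: pressure sensor on A0, button on pin 2).
// Sends MIDI over the serial port; not the actual Tooka code.
const int PRESSURE_PIN = A0;
const int BUTTON_PIN   = 2;
const byte NOTE        = 60;   // middle C

bool playing = false;

void midi(byte status, byte d1, byte d2) {
  Serial.write(status);
  Serial.write(d1);
  Serial.write(d2);
}

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(31250);                       // standard MIDI baud rate
}

void loop() {
  // Discrete control: a button press starts or stops the note.
  bool pressed = (digitalRead(BUTTON_PIN) == LOW);
  if (pressed && !playing)  { midi(0x90, NOTE, 100); playing = true;  }   // note on
  if (!pressed && playing)  { midi(0x80, NOTE, 0);   playing = false; }   // note off

  // Continuous control: tube pressure becomes 14-bit pitch bend.
  int p = analogRead(PRESSURE_PIN);          // 0-1023
  int bend = map(p, 0, 1023, 0, 16383);
  midi(0xE0, bend & 0x7F, (bend >> 7) & 0x7F);
  delay(10);
}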
As a musical interactivist at Interval Research in Palo Alto, Tina Blaine led a team of developers in the creation of a collaborative audiovisual instrument known as the Jam-O-Drum. By combining velocity-sensitive input devices and computer graphics imagery in an integrated tabletop surface, six to twelve simultaneous players can participate in a collaborative approach to musical improvisation. The Jam-O-Drum was designed to support face-to-face audio and visual collaboration: players drum on pads embedded in the surface to create rhythmic music and effect visual changes together, with the community drum circle serving as a metaphor to guide the form and content of the interaction design. The Jam-O-Drum is a permanent exhibit in the heart of the Sound Lab at the EMP.
Body As Instrument
The idea of the human body, freely moving through space, now enabled to make the music play rather than dance to it, is a strong theme in computer-enabled performance tools. Many early controllers have taken the form of a glove, as in Laetitia Sonami’s “Lady’s Glove”, or of hand-mounted hardware, as in Michel Waisvisz’s “The Hands”, which allow the performer, though wired, to move freely in 3D space and control the sonic output.
Tomie Hahn’s “Pikapika” is a special take on this: the speakers that create all the sound output are attached to the dancer’s body, rather than the sound being routed through the house sound system in performance.
Augmenting an existing instrument with extra sensors is a common model of “new interface for musical expression”: it builds on the extra bandwidth that some musicians may have, allowing more controls to be added. Using an existing instrument also means building on the expert technique that traditional instrumentalists have already developed; this is seen in the many violin-technique-based controllers, for example Dan Overholt’s Overtone Violin, which he presented at NIME05 in Vancouver.
For his piece “Rouse”, Neal Farwell developed a series of ‘interventions’ for the trombone. One of these was a speaker-mute he calls the “Emute”, which allows for added variation in mute effects on that instrument.
Matthew Burtner’s “Metasaxophone” is also a good example of this - a saxophone retrofitted with an on-board signal processor and an array of sensors.
A primary motivation behind the Metasaxophone is to put signal processing under direct expressive control of the performer. Through the combination of gestural and audio performance tracking, employing both discrete and continuous multilayered mapping strategies, the Metasaxophone can be adapted for a wide range of musical purposes.
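The distinction between discrete and continuous mapping can be sketched in a few lines of Arduino-style code. The example below is not Burtner's code; it simply shows the two strategies side by side, with assumed pins and arbitrary controller numbers: a thumb switch toggles an effect on or off (discrete), while a key-pressure sensor sweeps an effect parameter (continuous).

// Discrete vs. continuous mapping sketch (assumed wiring: switch on pin 2, pressure sensor on A0).
const int SWITCH_PIN   = 2;    // discrete: toggles a processing effect on/off
const int PRESSURE_PIN = A0;   // continuous: sweeps an effect parameter

bool effectOn = false;
bool lastSwitch = false;

void controlChange(byte cc, byte value) {
  Serial.write(0xB0);          // MIDI control change, channel 1
  Serial.write(cc);
  Serial.write(value);
}

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  Serial.begin(31250);
}

void loop() {
  // Discrete layer: one gesture produces one state change.
  bool sw = (digitalRead(SWITCH_PIN) == LOW);
  if (sw && !lastSwitch) {
    effectOn = !effectOn;
    controlChange(80, effectOn ? 127 : 0);   // CC 80 chosen arbitrarily
  }
  lastSwitch = sw;

  // Continuous layer: the parameter tracks the sensor on every cycle.
  int level = map(analogRead(PRESSURE_PIN), 0, 1023, 0, 127);
  controlChange(81, level);                  // CC 81 chosen arbitrarily
  delay(10);
}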
Ajay Kapur’s eDholak is an augmentation of a traditional Indian drum. This is his description of it:
The purpose of this project is to use technology to create a real-time instrument that models the Dholak. This Electronic Dholak (known as the EDholak) has digitizing sensors, custom positioned for traditional Dholak technique, which convert finger strikes into binary code that computers can recognize. These signals are then used to trigger real-time sound and graphics. We also wish to take advantage of the collaborative nature of the traditional drum and explore building a multiplayer networked controller. See an example performance.
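The core sensing step described here, turning a finger strike into a digital trigger, can be sketched roughly as below. This is not Kapur's implementation; the pin, threshold, and note number are assumptions. A piezo element under the drum head is thresholded, and each strike above the threshold sends a note whose velocity follows the strike strength.

// Strike-to-trigger sketch (assumed wiring: piezo element on A0 with a load resistor).
const int PIEZO_PIN  = A0;
const int THRESHOLD  = 100;   // ignore readings below this noise floor (assumed value)
const byte DRUM_NOTE = 38;    // arbitrary drum note number

void setup() {
  Serial.begin(31250);        // standard MIDI baud rate
}

void loop() {
  int hit = analogRead(PIEZO_PIN);
  if (hit > THRESHOLD) {
    byte velocity = map(hit, THRESHOLD, 1023, 1, 127);  // harder strike, louder note
    Serial.write(0x99);       // note on, channel 10 (percussion)
    Serial.write(DRUM_NOTE);
    Serial.write(velocity);
    delay(50);                // crude debounce so one strike sends one note
  }
}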