
amy rose khoshbin
Michael Craig Clemow

Semiotech

To share our exploration of the nature of perception and its relationship to interactive performance technology, Semiotech has created a series of participatory demos and put some of the narrative-based musical technologies we've developed on display.

http://semiotech.org

Classes
Networked Objects, New Interfaces for Musical Expression, Rest of You, The Softness of Things: Technology in Space and Form


Through two simple demonstrations built around a physical (if metaphorical) model of the "smallest unit of perception," we will first show how perception relates to interactivity. We will then show visitors to our booth that many of the pieces in the show operate on this same principle.

Technology mediates perception in powerful and often abstract ways. A thorough exploration of the process of perception and its role in making meaning can be a valuable experience for designers of performance tools intended to enhance the communicative capacity of performers.

As an example, we are displaying some of the performance tools we have created as well. The Shadow Puppet Sequencer and Bird Mask Vocal Controller demonstrate Semiotech's varied approach to performance technology design.

The Shadow Puppet Sequencer Table demonstrates an approach to narrative construction that is intimately coupled with sound design. The table provides a translucent surface upon which cutouts can be placed and moved, creating shadows that double as musical notes. The relationships between the shadow objects are thus spatial as well as musical. These relationships can be manipulated by moving the objects on the surface of the table. The narrative is propelled by changing the spatial--and thus musical--relationships of the objects over time.

The Bird Mask Vocal Controller illustrates an approach to design wherein the activation of the technology is built directly into the costume the performer wears. The performer's mask contains a microphone and controls for voice effects and sampling. The mask maximizes performer mobility and marries robust controls to movements natural to the character, without compromising aesthetic decisions or modularity.


Background
In breaking experience down into the smallest perceptible unit, we are trying to create a framework for understanding interactivity and perception. This unit is important because it defines a limit: if it were any shorter in time or any smaller in space, you would not perceive it at all. At this limit, the meaningful content of our experience is at a minimum, the physical capabilities of our senses are tested, and internal states, such as whether or not we are paying attention, have their biggest impact on our experience. This is the realm of perceptual acuity--how our senses register input. We measure things in this realm in order to study the mechanisms of perception.

But the smallest perceptible unit also has correlates in technology--the pixel, the sound grain, the tiny steps a motor makes. The smallest movements and tiniest events--marks on paper, notes in music, lumps of clay, strips of newsprint--are related at this level as well.

As we ascend the scale of complexity, we amass more and more of these small units over time and across space. We begin to associate them with each other and parse them into symbols. Through this process context emerges. Context is dependent on our ability (and choice) to associate and group symbols. At the level of the smallest perceptible unit of experience, this act of association may be a reflex. As we ascend in complexity, however, the act becomes increasingly intentional.


Audience
Anyone interested in interactive technology, perception, and/or performance.

User Scenario
For the first of the demonstrations, we take a physical model of the smallest perceptible unit of experience (an experiential element, or exel) and connect it to a visually and aurally amplified version of the user. When the user approaches the booth, they see a live video feed and hear an amplified version of what they are experiencing around them. Upon picking up the model of the exel, the video feed zooms into one pixel of the image and the audio isolates one frequency of the sound to mirror the smallest unit of that user's audiovisual experience. The model changes color with the pixel on the screen, and what seemed like a random mapping becomes clear: the pixel on the screen is the pixel in the user's hands.
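The zoom-to-one-exel step can be sketched in a few lines. This is a minimal illustration, not our exhibition code: the frame layout, the pixel coordinates, and the choice of frequency are all stand-ins, and the real demo works on a live feed rather than static buffers.

```python
import numpy as np

def isolate_exel(frame, audio, sample_rate, px, py, freq_hz):
    """Reduce an audiovisual moment to a single 'exel':
    one pixel of the video frame and one frequency of the sound."""
    # Visual exel: the color of a single pixel (the model lights up this color).
    pixel_rgb = frame[py, px]

    # Aural exel: keep only the FFT bin nearest the target frequency.
    spectrum = np.fft.rfft(audio)
    bin_hz = sample_rate / len(audio)
    keep = int(round(freq_hz / bin_hz))
    isolated = np.zeros_like(spectrum)
    isolated[keep] = spectrum[keep]
    tone = np.fft.irfft(isolated, n=len(audio))
    return pixel_rgb, tone
```

For a pure tone at the kept frequency, the isolated signal is the input itself; for a complex sound, everything but that one frequency falls away.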

For the second demo, we deal with the concept of transference between senses. Can we hear images or see sound, for example? To demonstrate this transference, we again use the model exel to represent the smallest unit of an interactive experience. The user wears a sensor that tracks the expansion of their chest as they breathe; the breath is reflected in a light growing brighter and dimmer, a sound rising and falling in pitch, and a vibration motor turning on and off. This represents the transference of bodily input into the visual, aural, and tactile realms at the most basic level.
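The heart of this demo is a one-to-three mapping from a single sensor reading onto the three output modalities. As a hedged sketch (the specific ranges, the pitch bounds, and the motor threshold are illustrative choices, not our calibrated values):

```python
def map_breath(expansion, pitch_lo=220.0, pitch_hi=440.0, motor_threshold=0.5):
    """Map one normalized chest-expansion reading (0.0-1.0) onto the
    three output modalities of the exel model."""
    x = min(max(expansion, 0.0), 1.0)                 # clamp sensor noise
    brightness = int(round(x * 255))                  # LED: dim -> bright
    pitch_hz = pitch_lo + x * (pitch_hi - pitch_lo)   # tone: low -> high
    motor_on = x >= motor_threshold                   # vibration: off -> on
    return brightness, pitch_hz, motor_on
```

A full exhale maps to (0, 220.0, False) and a full inhale to (255, 440.0, True), so all three senses track the same bodily input.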

Two performance tools that demonstrate these principles will also be presented to allow users to play with and explore the modularity of Semiotech’s instruments. The Bird Mask Vocal Controller will be set on a pedestal, available for users to wear and play with, hearing their vocal manipulations in a set of headphones. The Shadow Puppet Sequencer will be set up as its own table with headphones next to the Bird Mask Vocal Controller, to be played by users as they construct their own mini-narratives with different shapes, or simply create beats by moving objects over the table.

Implementation
The exel model consists of a 4"x4" translucent plexiglass box that houses the circuit inside it. It communicates over XBee radio with a host program running on a nearby laptop. The host program for the show consists of the two demos described in the User Scenario section. The model is designed to demonstrate simple output types (sound, light, vibration) as determined by the host program. Different situations may require different input/output mappings, so the Firmata protocol (slightly modified to operate over the XBee) serves as both the communication protocol and the firmware for the device. This lets the host program determine the behavior of the model and avoids reprogramming the device for each situation.
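The framing that standard Firmata uses is simple, which is part of why it suits this kind of remote control. As an illustrative sketch (the pin number is hypothetical and our XBee-specific modifications are not shown), a host could build the standard analog-write message like this:

```python
ANALOG_MESSAGE = 0xE0  # standard Firmata command for analog/PWM writes

def firmata_analog_write(pin, value):
    """Build the 3-byte standard-Firmata message that sets a PWM pin.
    Values are split into two 7-bit bytes because Firmata (like MIDI)
    reserves the high bit to mark command bytes."""
    return bytes([
        ANALOG_MESSAGE | (pin & 0x0F),  # command nibble + pin number
        value & 0x7F,                   # least-significant 7 bits
        (value >> 7) & 0x7F,            # most-significant 7 bits
    ])
```

Writing these three bytes to the radio link (instead of a USB serial port) is essentially all the "slight modification" amounts to from the host's point of view.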

The Shadow Puppet Sequencer Table is essentially a table with a translucent plexiglass top, a light above it, and a camera below. The camera's image is scanned from left to right using custom software (written in C++/openFrameworks). The shadows are detected and this information is sent to a custom software synthesizer (written in ChucK).
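The scanning logic can be sketched independently of openFrameworks. This is a simplified stand-in for the actual C++ software: the threshold, the note scale, and the shadow-position-to-pitch rule are assumptions made for illustration.

```python
import numpy as np

def scan_shadows(frame, threshold=64, row_notes=(60, 62, 64, 65, 67, 69, 71, 72)):
    """Scan a grayscale camera frame left to right; wherever a shadow
    (dark region) falls in a column, emit a (column, midi_note) event,
    with the shadow's vertical position choosing the pitch."""
    h, w = frame.shape
    events = []
    for col in range(w):  # the moving playhead
        dark = np.where(frame[:, col] < threshold)[0]
        if dark.size:
            # Map the shadow's mean height onto a scale degree.
            band = int(dark.mean() * len(row_notes) / h)
            events.append((col, row_notes[band]))
    return events
```

Moving a cutout left or right changes when its note sounds; moving it up or down changes which note sounds, which is the spatial-as-musical coupling described above.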

The Bird Mask Vocal Controller is implemented using a wireless lav microphone, a custom wireless circuit built into the beak of the mask (using potentiometers and switches) with a separate module worn on the head, and custom digital signal processing software (written in Pd). The modular circuitry can connect to a variety of costumes and/or components; it includes a LilyPad XBee, a LilyPad Arduino, a lithium-ion battery, and an IDC connector that attaches to the bird beak. The mask itself is handmade using colored felt fabric, interfacing, and a variety of components.
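To give a feel for the kind of processing the beak controls can govern, here is one classic voice effect, a ring modulator, sketched in Python rather than Pd. This is an illustrative example, not a transcription of our patch; the idea is that a beak potentiometer could sweep a parameter like the carrier frequency or the wet/dry mix.

```python
import numpy as np

def ring_mod(voice, carrier_hz, sample_rate, mix=1.0):
    """Classic ring-modulator voice effect: multiply the mic signal by a
    sine carrier, then blend with the dry signal according to `mix`."""
    t = np.arange(len(voice)) / sample_rate
    wet = voice * np.sin(2 * np.pi * carrier_hz * t)  # amplitude modulation
    return (1.0 - mix) * voice + mix * wet            # dry/wet crossfade
```

With `mix=0.0` the voice passes through untouched; turning the control toward `1.0` replaces it entirely with the metallic modulated version.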

Conclusion
Not to put too fine a point on it, this process has been a struggle. Conveying the concepts we are exploring has been difficult in the past, which is what drove us to the project's current manifestation. It is very important for us to be able to demonstrate our ideas to a wider audience in a clear, direct, and concise way. One lesson learned was that jargon is a barrier to understanding: the simplest way to say something is often the best. Another is that experiencing a concept physically, seeing it in action via the demonstrations, is crucial to understanding it.

We also learned how to develop performance tools that embody our specific paradigm while remaining modular enough to use in a variety of settings. Through the creation of these tools, we've realized the power of building custom instruments, and we will continue this pursuit into the future.