Headphones are objects that mediate the boundary between public and private space. To explore the mediated relationship between these two spaces, I modified an existing pair of headphones to allow a user to record external sounds while simultaneously listening to internal music.
When I listened to my first headphone recording, I was largely uninterested. The external auditory information I missed while listening to music on my headphones was the predictable sound of cars, phones, people talking, wind, church bells, etc. This is not information I am unconscious of. Quite the opposite: I am very aware of the general city soundscape and of my conscious choice to block it out when I put on my headphones. But I gave myself time to really study the recorded information and dig further -- data mining. The interesting information is not the content of the sound, but its rhythms. When we move through public space, we receive a wash of sound at varying frequencies. Each frequency range is dotted with peak points, mapping out a rhythm/tempo. I found that wind primarily occupies an upper frequency range, people and cars dominate a mid frequency range, and my walking pace (heard in the sound of air passing up and down next to my headphones) remained consistently in a lower frequency range. When I analyzed these frequencies more closely, I was able to roughly map three tempos. Granted, these tempos do not always follow an obvious pattern, but there is certainly a rhythmic sensibility. Conscious or not, we move in, around, and with these natural rhythms. When we listen to our headphones, we are doing more than just blocking out noise; we are modifying natural rhythms: the rhythm of the city, the rhythm of our body.
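The band-by-band tempo mapping described above can be sketched offline in code. The following is a hypothetical Python sketch (not the actual analysis used in the project), assuming SciPy is available: it bandpass-filters a signal into one frequency range, takes the amplitude envelope, finds its peaks, and converts the median spacing between peaks into a rough tempo. The synthetic "footstep" signal at the bottom is made up purely for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

def band_tempo(signal, sr, lo_hz, hi_hz):
    """Estimate a rough tempo (BPM) for one frequency band:
    bandpass, take the amplitude envelope, find its peaks,
    and convert the median peak spacing to beats per minute."""
    sos = butter(4, [lo_hz, hi_hz], btype="bandpass", fs=sr, output="sos")
    band = sosfiltfilt(sos, signal)
    envelope = np.abs(band)
    # smooth the envelope with a 50 ms moving average
    win = int(0.05 * sr)
    envelope = np.convolve(envelope, np.ones(win) / win, mode="same")
    # peaks must be at least 250 ms apart (i.e. under 240 BPM)
    peaks, _ = find_peaks(envelope, distance=int(0.25 * sr),
                          height=0.5 * envelope.max())
    if len(peaks) < 2:
        return None
    period_s = np.median(np.diff(peaks)) / sr
    return 60.0 / period_s

# Synthetic stand-in for a recording: a decaying 100 Hz "thump"
# repeating every 0.5 s (i.e. a 120 BPM walking pulse).
sr = 8000
n = 10 * sr
impulses = np.zeros(n)
impulses[::sr // 2] = 1.0
k = np.arange(int(0.1 * sr))
burst = np.sin(2 * np.pi * 100 * k / sr) * np.exp(-k / (0.02 * sr))
sig = np.convolve(impulses, burst, mode="full")[:n]

print(band_tempo(sig, sr, 50, 200))  # ~120 BPM, the synthetic pulse rate
```

The same routine run with different band edges (e.g. a mid range for voices and cars, an upper range for wind) would yield one tempo estimate per range, which is the kind of three-tempo map described above.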
One detail I found particularly curious was that regardless of what songs (and their varying tempos) I listened to, the pace of my walking never really varied. This observation led me to research the concept of internal rhythm. When we move in, around, and with the rhythms of the external world, is our walking pace a metronome of our own internal rhythm? Could it then be said that the rhythm of music, a human art, is based on a shared human clock? These questions can be explored through the concept of tactus, the steady pulse created by Renaissance composers to achieve an orderly, rhythmic music-system. Roughly the rate of the human heartbeat, and approximately equal to the pulse of a man breathing normally (between 60 and 70 times per minute), tactus is a term that has been used to denote a person's expressive internal timekeeper. In The Cognition of Basic Musical Structures, David Temperley states, "A metrical structure does not just consist of several levels of equal strength and importance...there is generally one level, the tactus, which corresponds to the main "beat" of the music. There is psychological evidence that the tactus level serves a special cognitive function. In performance, there is less variance of beat intervals in the tactus level than in other levels, suggesting that the tactus serves as the internal "timekeeper" from which other levels are generated. For a metrical model, then, it is important not only to identify all the levels of the metrical structure, but to identify the correct level as the tactus." (52) If we believe that the tempo of all songs is fundamentally based on an internal human clock, then listening to any music through our headphones is choosing to amplify our natural, internal rhythm within the context of the external world. Indeed, more than blocking out noise or modifying the natural rhythms of the external world, we are literally amplifying our private space -- our internal soundscape.
Musical rhythm usually operates within a recursive temporal framework, such as a (periodic) beat or a (metered) measure. It therefore makes sense to visualize tactus-based rhythm as a cyclical concept. Using Max/MSP Jitter, I developed a patch that graphs peak values of specified frequency ranges onto the animated visual of a hand-drawn circle. The current graph consists of three circles: the outer circle displays the low frequency range, the center circle the mid frequency range, and the inner circle the high frequency range.
drawing circles.mov
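The geometry behind the three-circle graph can be sketched outside Max/MSP. Below is a hypothetical Python sketch (not the Jitter patch itself), assuming per-band amplitude envelopes have already been extracted: each band is wrapped around one revolution per tactus cycle, with the ring's radius wobbling away from a base radius in proportion to the band's level. The example envelopes at the bottom are made up for illustration.

```python
import numpy as np

def band_to_ring(envelope, base_radius, deviation=0.2, cycle_s=1.0, sr=8000):
    """Wrap one band's amplitude envelope around a circle:
    the angle advances one full revolution per tactus cycle,
    and the radius deviates from base_radius in proportion to
    the (normalized) envelope. Returns x, y drawing coordinates."""
    env = np.asarray(envelope, dtype=float)
    if env.max() > 0:
        env = env / env.max()
    theta = 2 * np.pi * np.arange(len(env)) / (cycle_s * sr)
    r = base_radius * (1.0 + deviation * env)
    return r * np.cos(theta), r * np.sin(theta)

# Three concentric rings: outer = low band, center = mid, inner = high,
# fed here with stand-in envelopes rather than real recordings.
sr = 8000
t = np.arange(sr) / sr
low = np.abs(np.sin(2 * np.pi * 1 * t))
mid = np.abs(np.sin(2 * np.pi * 3 * t))
high = np.abs(np.sin(2 * np.pi * 7 * t))
rings = [band_to_ring(env, base)
         for env, base in [(low, 3.0), (mid, 2.0), (high, 1.0)]]
```

Feeding each ring's x, y arrays to any 2D drawing routine produces three concentric, hand-drawn-looking circles whose wobble traces each band's peaks, which is the same mapping idea the Jitter patch animates in real time.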
As I work on this project further, more attention will be placed on the following areas:
1. The headphones. Currently, only a recorder with an internal preamp can be used to record the audio signal from the condenser mics in the headphones. I would like to build a preamp for the mics so that any recorder can be used. I would also like to build a small mixer so the user of the headphones can monitor and adjust the internal and external sounds being received.
2. The research. There are many different variations (locations, levels, activities, etc.) that can be recorded and analyzed to explore conceptual ideas of headphone mediated space.
3. The mapping. The basic structure of the patch is fairly well defined, but I need to continue working on the frequency analysis in Max/MSP Jitter. I also need to input higher-quality visuals for a better graphic result.