The above screenshot (click for a larger version) shows the primary components of my piece as they currently stand. The ChucK patch is open on the left; my hacked version of Frotz is in the center; and on the right you can see the ChucK status window, which is showing some of the data being passed from the interpreter to the ChucK patch.
Here is what the Frotzophone currently sounds like (3'31" 192kbps MP3). The game (and gameplay) that generated this sound comes from the very beginning section of Zork I. Appreciate this noise, for it is dearly won: I had to learn how to handle strings in C. Not fun.
How it works
Last week I explained a method of tracking objects in the Z-Machine—specifically, keeping track of when one object is "attached" to another. This happens whenever the player object is attached to a room (i.e., when the player moves to a room) or when an object corresponding to an in-game item is attached to the player (i.e., when the player picks up an item) or when a non-player character does either of these things.
The Z-Machine interpreter sends an OSC message whenever such an attachment takes place, reporting both the parent object and the child object. The ChucK patch checks to see if the parent object has been seen before; if not, it associates that parent object with a note from a pre-determined scale. That note is then (and on subsequent appearances of that parent object) added to a list of notes for that child object. The ChucK patch plays the list of notes for each child object simultaneously and in a loop; the effect is sort of an aural history of how objects have been related to one another.
I feel confident that I'm closing in on both the technical and aesthetic aspects of the piece. The mapping of notes to relationships between in-game objects feels right, and I like the way it sounds. I still have a long way to go, though. Here are some pretty significant challenges and issues (just off the top of my head):
(1) I need a more effective way of explaining the piece and how it works. I feel like the underlying mapping is actually kind of intuitive (in a strange way), so I should be able to explain it without resorting to a full-blown exposition of the history of interactive fiction. I like talking about the technical details, but I need to be better prepared to explain it succinctly to less technical audiences.
(2) I think I also need some kind of visual representation of what's going on inside the ChucK patch—for the audience's benefit and my own. Fortunately, the data I'm getting out of the interpreter lends itself to this kind of display. I might be able to work up something real quick in Processing.
(3) The current mapping doesn't leave much room for the piece to grow and breathe. The sound gets almost overwhelmingly intricate after only a dozen or so moves; the different threads are hard to pick apart. I have a few ideas for getting around this—stereo separation (or multichannel audio!) is a good start. I'm also considering having each object's loop gradually grow louder or quieter in response to how often the object is seen. Ideally, though, I'd like to map the timbre of the sound to the data in some way.
(4) The piece isn't as expressive as I'd like it to be. When I imagine my performance, I feel a little bit hemmed in. I'm considering adding some extra-diegetic commands to the interpreter that would allow me to make changes to the sound of the piece—something to change the tempo, maybe, or to solo/mute loops for particular objects.
Hopefully I'll have made some headway on one or more of these issues by Wednesday.