Archive for November, 2010
Last spring I worked on a play, an adaptation of Samuel Delany’s epic science fiction novel Dhalgren. The book is crazy, and the play was crazy; I did the sound design. This is the kind of book that sticks with you (the wild, dense prose, the imagery, the… extremely detailed pornographic sex), and there were many things from the book that weren’t realized in the play that I’m holding onto. The book is set here – well, in a city somewhere in America – after an unnamed disaster has taken place. The city is a wasteland, but people are still living there. They live for free in parks, or squat in apartments where nothing works.
Gangs, known as the Scorpions, run the streets. This is actually the element I’m thinking about. Members of the Scorpions wear a projector necklace. When you press the button on the projector, a holographic animal surrounds your body. It’s kind of like digital warpaint. One of the characters is known as Dragon Lady, because her projection is a dragon. One of them is a baby dinosaur – which I love. One of them doesn’t work correctly and looks like an amorphous blob. I find it weird that I can’t find an image of this anywhere. I feel like it’s one of the most memorable images from the book – gangs of fierce, oversized, holographic animals walking through the streets.
It would be pretty easy to have a static image follow a person with an LED using brightness tracking, but I wonder how far I can take it using what we’ve done in ICM. I’m imagining fixing LEDs to different joints and body parts (wrists, elbows, forehead…), using a camera to track their positions, and then mapping projections of the animal’s body parts onto the movement of the performer’s body parts. This video inspired me:
They’re using individual projectors – and are doing something totally weird, but it’s the same idea.
If I could get my hands on an Xbox and a Kinect, I’d love to do something like this:
I’m sure that motion tracking with a Kinect would work wayyy better than LED tracking, but it’s cost-prohibitive at the moment.
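At its simplest, the LED idea boils down to a brightest-pixel search over each camera frame. Here's a minimal sketch of that step in Python – illustrative only, since the real version would live in a Processing sketch reading live video, and the frame here is just a nested list standing in for camera pixels:

```python
def brightest_pixel(frame):
    """Return (x, y) of the brightest pixel in a grayscale frame.

    frame is a list of rows; each row is a list of 0-255 brightness
    values. With an LED in shot, the brightest pixel is a cheap
    stand-in for the LED's position.
    """
    best_xy, best_val = (0, 0), -1
    for y, row in enumerate(frame):
        for x, val in enumerate(row):
            if val > best_val:
                best_xy, best_val = (x, y), val
    return best_xy

# Tiny 4x3 "frame" with a bright spot at column 2, row 1.
frame = [
    [10, 12, 11, 10],
    [10, 13, 250, 12],
    [11, 10, 12, 10],
]
print(brightest_pixel(frame))  # (2, 1)
```

Tracking several LEDs at once (one per joint) would mean either searching separate regions of the frame or picking the N brightest blobs, which is where this gets harder than the single-point case.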
As for the projection, I’d love to experiment with projecting on fog. I also have a lot of reflective film, so I might look into the 3D effect you can get using a two-way mirror (this would probably look awesome, but I’d need a pretty big piece of glass). Projecting directly onto the body is possible, though I’m not in love with the shadow you make on the screen. Rear projection might be the easiest thing – it would look just like the Helicopter Boyz video.
This will be a performance piece/video sculpture. Hopefully it will look badass and not totally lame.
For our media controller project, Becky, Roy and I created a beautiful purple chair that transports the sitter, and anyone in the room with them, to a magical land.
A pressure sensor embedded in the seat of the chair controls the playback speed of an animation in Processing. When you sit, it plays forward – the chair grows into a tree. When you stand, it plays backwards, returning to its original appearance. Becky made the animation in Flash.
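The sit/stand logic is simple enough to sketch. This is a hypothetical Python version rather than our actual Processing code; the 10-bit reading range and the threshold value are assumptions:

```python
def step_frame(current_frame, sensor_value, total_frames, threshold=512):
    """Advance or rewind one animation frame per tick.

    sensor_value is a hypothetical 10-bit analog reading (0-1023) from
    the seat's pressure sensor; above the threshold means someone is
    sitting, so the animation plays forward, otherwise it plays back.
    The frame index is clamped to [0, total_frames - 1].
    """
    direction = 1 if sensor_value > threshold else -1
    return max(0, min(total_frames - 1, current_frame + direction))

frame = 0
for reading in [800, 800, 800, 100, 100]:  # sit, sit, sit, stand, stand
    frame = step_frame(frame, reading, total_frames=60)
print(frame)  # 1
```

Clamping at both ends means the tree neither "overgrows" past the last frame while you stay seated nor rewinds past the empty chair once you've left.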
We made capacitive touch sensors from aluminum foil and attached them to the arms of the chair. The sensors use the capacitance of the human body in a circuit. When you place your hands on the arms of the chair, the value returned by the capacitive sensors is higher than when untouched. While skin is in contact with the sensors, a counter variable increments in Processing. This counter controls the volume of an audio file – the longer you sit in the chair, the louder the audio environment becomes. When you remove your hands, the sound slowly fades. I made this soundscape in Pro Tools and used SoundHack for some signal processing. I used field recordings, some convolved with chords from the Ravel Piano Trio to make the tonal elements, and mixed these with a recording of a pine forest filled with birds.
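The counter-to-volume behavior can be sketched like this (hedged Python, not our Processing sketch; in particular, deciding "touched" by comparing the capacitive reading against a calibrated baseline is an assumption):

```python
def update_volume(counter, touched, max_count=100):
    """Increment while touched, decrement while not; volume in [0, 1].

    'touched' would come from comparing the capacitive reading against
    a calibrated untouched baseline. The counter ramps the audio volume
    up slowly while hands rest on the arms, and fades it back down on
    release, clamped to [0, max_count].
    """
    counter = counter + 1 if touched else counter - 1
    counter = max(0, min(max_count, counter))
    return counter, counter / max_count

counter, volume = 0, 0.0
for t in [True] * 30 + [False] * 10:  # 30 ticks touching, 10 ticks off
    counter, volume = update_volume(counter, t)
print(counter, volume)  # 20 0.2
```

Because the fade runs one step per tick in both directions, letting go never cuts the sound off abruptly – it unwinds at the same rate it built up.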
Roy made a housing that attaches to the bottom of the chair where the Arduino lives, with wires neatly packaged.
It was important to us that the physical interaction be organic. That is, this event happens when you use the chair as a chair should be used. You don’t have to press buttons, or turn knobs, you just have to sit and wait.
The animation is played over a live capture of the chair and the sitter – both feeds partially transparent. In our first staging, the video and animation were projected on a screen next to the actual chair. In the future, we want to experiment with projecting directly onto the chair and sitter. The documentation shown here is a movie captured from the Processing sketch. The lighting in the room we were working in made it difficult to capture the effect of the projection.
I’m a little late documenting this, but I’m thinking about it again because I would like to fully execute the idea for my final project in Physical Computing. The original idea was to make a better vocal harmonizer that you can play with your fingers the way you play any other instrument. You can control the interval between the note you are singing and the shifted note by bending your fingers. I made a glove and intended to use flex sensors to do this, though I didn’t get to fully realize the physical aspect of the project because the sensors were ordered too late.
I did get far enough to control the interval of the shift with a potentiometer communicating through my Arduino to Max/MSP. I used the Gabor library for real time frequency analysis and synthesis. Here is my Max patch and all its subpatches:
The main patch takes an audio signal from a microphone, and serial data from the potentiometer. The serial data is sorted to determine how to shift the pitch of the audio signal. The shifted and original audio signals are played back in real time.
The serialread subpatch takes data from the Arduino – the analog values determined by the potentiometer (0–255) – and passes it to sortit.
The sortit subpatch takes the analog values from the potentiometer and sorts them into 10 ranges. Each range corresponds to a specific interval. The Gabor synthesis function takes intervals in cents (1 semitone = 100 cents). The ranges I specified shift the incoming signal up or down a minor third (300 cents), major third (400 cents), perfect fourth (500 cents), perfect fifth (700 cents), or major sixth (900 cents).
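The sorting step works out to a ten-way binning of the potentiometer range. Since the exact bin boundaries in the Max patch aren't listed here, this hypothetical Python version just splits 0–255 evenly, ordered from a major sixth down to a major sixth up:

```python
# Ten intervals, in cents, from major sixth down to major sixth up.
# The even split of 0-255 into ten bins is an assumption; the actual
# boundaries live in the sortit subpatch.
INTERVALS_CENTS = [-900, -700, -500, -400, -300, 300, 400, 500, 700, 900]

def pot_to_cents(value):
    """Map a 0-255 potentiometer reading to a pitch shift in cents."""
    bin_index = min(value * 10 // 256, 9)  # 0..9
    return INTERVALS_CENTS[bin_index]

print(pot_to_cents(0))    # -900 (major sixth down)
print(pot_to_cents(255))  # 900 (major sixth up)
```

One consequence of this layout is that there's no unison bin – every knob position produces some shift, which is fine for a harmonizer that always sounds alongside the dry signal.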
The analysis and synthesis functions are included in the Gabor library. They perform a frequency analysis on the incoming signal and synthesize a new signal with the attributes of the original, but with a shifted frequency spectrum.
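The relationship between a shift in cents and the resulting frequency ratio is the standard one, 2^(cents/1200) – for example, a perfect fifth up (700 cents) multiplies every frequency by about 1.498. A quick check in Python:

```python
def cents_to_ratio(cents):
    """Convert a pitch shift in cents to a frequency multiplier.

    100 cents = 1 equal-tempered semitone; 1200 cents = 1 octave,
    so the ratio is 2 ** (cents / 1200).
    """
    return 2 ** (cents / 1200)

print(round(cents_to_ratio(700), 3))   # 1.498 (perfect fifth up)
print(round(cents_to_ratio(-300), 3))  # 0.841 (minor third down)
print(cents_to_ratio(1200))            # 2.0   (one octave up)
```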
It’s not perfect, but here’s what it sounds like now: