May 03, 2007
I gave my final thesis presentation on Tuesday at noon at the Interactive Telecommunications Program at NYU. The whole thing came very much down to the wire. I ended up building the whole presentation in Max/MSP and Isadora. The idea was to have it look exactly like a Keynote or PowerPoint deck, but with the ability to build interaction and live video processing into the mix. Overall, I think it went well. The archive is here, but I've edited off the first few minutes of the screensaver here as a standalone:
April 12, 2007
I ran into Ilan on the train back to Astoria a couple nights ago and we started talking about various things: about how people in Chicago are too nice. About his inflatable for toddlers. About my thesis. As I was describing the project to him, I nervously realized that, in fact, I had gotten it wrong. I would need two cameras per mask. I couldn't just use two wireless cams unless I wanted to split the signal pre-wireless and preemptively route it to its own mask. I don't want to degrade the signal on the wireless cams any more than I have to. So. Two cams it is. Here's a test with two. For some reason the IR LEDs are still giving me some trouble. Apparently IR light is hard to diffuse. Try it sometime. Let me know.
Posted by andrew schneider at 09:09 AM
April 09, 2007
I've recently been working with a couple Triple Axis Accelerometer Breakout Boards - MMA7260Q - available from Sparkfun. The process of getting these TriAxs up and running on a breadboard was a cinch thanks to Tom Igoe's tutorial and Rob Faludi pointing out that I needed to take the sleep pin high in order to get the thing functional. From there I moved quickly to getting the TriAxs onto perf boards with an Xbee each. These will soon be incorporated into Experimental Devices for Performance, so the implementation needs to be small and robust. They'll most likely get a good smashing around.
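For reference, the TriAx's analog output maps back to acceleration in a pretty straightforward way. Here's a quick sketch in Python, assuming the MMA7260Q's 1.5g range (800 mV/g per the datasheet, with zero-g sitting at about half the 3.3V supply) and a 10-bit ADC referenced to 3.3V - the exact numbers depend on your range setting and reference, so treat these as placeholders:

```python
def adc_to_g(raw, vref=3.3, zero_g=1.65, mv_per_g=800):
    """Convert a 10-bit ADC reading of the MMA7260Q's output to g.

    Assumes the 1.5g range (800 mV/g) and a 3.3V ADC reference,
    with the zero-g output at vref / 2. Adjust for your setup.
    """
    volts = raw / 1023.0 * vref
    return (volts - zero_g) / (mv_per_g / 1000.0)

# At rest, the Z axis should read about +1 g:
# zero-g (1.65 V) + 0.8 V = 2.45 V, which is raw ~= 760
```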
Working with the Xbee API has proven to be extremely reliable and low maintenance for my needs. It certainly cuts down on hardware. No external microcontrollers are used in this setup. The TriAx gives out a range of 0 to 3.3VDC (the input voltage) on each of its X, Y, and Z axis pins. I'm sending that along via the first three ADC pins on each Xbee. The example below shows a quick test with a Max patch that parses the Xbee packet and maps the values to a gridshape's rotation (X and Y) and position (Z).
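The Max patch does the actual parsing, but the same logic is easy to sketch outside of Max. Here's a Python version, assuming a Series 1 Xbee 16-bit-address I/O sample frame (API ID 0x83) with only analog channels enabled and a single sample set, so the sample data is just two big-endian bytes per enabled ADC - double-check the frame layout against the Xbee manual for your firmware:

```python
def parse_io_frame(frame):
    """Parse an Xbee 16-bit-address I/O sample frame (API ID 0x83).

    A sketch assuming no digital channels are enabled, so the sample
    data is just two big-endian bytes per enabled ADC channel.
    Returns the 16-bit source address, RSSI, and the ADC readings.
    """
    assert frame[0] == 0x7E, "missing start delimiter"
    length = (frame[1] << 8) | frame[2]
    data = frame[3:3 + length]
    checksum = frame[3 + length]
    assert (sum(data) + checksum) & 0xFF == 0xFF, "bad checksum"
    assert data[0] == 0x83, "not an I/O sample frame"

    source = (data[1] << 8) | data[2]
    rssi = data[3]
    # data[4] = options byte, data[5] = number of sample sets
    mask = (data[6] << 8) | data[7]
    # ADC0..ADC5 enable bits live in bits 9..14 of the channel mask
    adc_enabled = [bit for bit in range(6) if mask & (1 << (9 + bit))]

    readings, pos = [], 8
    for _ in adc_enabled:
        readings.append((data[pos] << 8) | data[pos + 1])
        pos += 2
    return {"source": source, "rssi": rssi, "adc": readings}

# A made-up frame: address 0x0001, RSSI 0x28, ADC0-2 enabled,
# readings 512, 300, 759 (X, Y, Z from the TriAx)
payload = [0x83, 0x00, 0x01, 0x28, 0x00, 0x01, 0x0E, 0x00,
           0x02, 0x00, 0x01, 0x2C, 0x02, 0xF7]
frame = [0x7E, 0x00, len(payload)] + payload + [0xFF - (sum(payload) & 0xFF)]
```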
I hesitate to post the video because there will most likely be better and more applicable documentation soon, and because...well...the original background sound was completely uninteresting, so I had to liven it up. I know there are better uses out there for Herbie Hancock's Rockit, especially since this demo is so very dry. I hope to do this song justice one day, but until then, here it is:
Posted by andrew schneider at 07:58 PM
April 08, 2007
And on it goes. I've finally integrated, albeit with gaffer's tape, all of the components of the FaceMask onto the bike helmet. A screen covering the mouth, a small wired camera, a video iPod, two wireless custom perf boards, and a lithium-polymer battery to power the whole thing are now held in place with gaff tape and rubber bands.
As briefly explained in the video, the idea is to have two of these devices, with one of the perf boards on the opposite helmet, and vice versa. This way you can "slap" your mouth onto someone else's face and they can do the same to you. The interaction is not limited to slamming your head against a hard object; it's just what is in my thoughts at present.
This new perf board is fairly simple. I've built a standard 5-volt circuit for the ATmega8 chip, then regulated the voltage further down to 3.3V for the Xbee. There's a debugging LED on pin 13 of the ATmega. The only other big thing on the board is a 3VDC DPDT relay. DPDT means double pole, double throw. Basically, I'm using its six pins as a manual video A/B switcher. Aside from the power terminal block, the three blue terminal blocks are wired to the relay. I've wired up two sets of two pins with signal and ground from two video sources: one from the video iPod and one from the camera behind the screen, which is pointing at the mouth. The remaining set of two pins carries the signal and ground to the screen. I take the coil pin high to switch the screen between the two video signals.
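The routing itself is dead simple; here's a little Python model of the relay as an A/B switch, just to make the pole/throw logic explicit. Which source sits on the normally-closed contacts is an assumption on my part:

```python
def route_video(coil_energized):
    """Model the 3VDC DPDT relay as a manual video A/B switch.

    Both poles throw together: one pole carries the video signal, the
    other carries ground. Which source sits on the normally-closed
    contacts (here, the mouth camera) is an assumption.
    """
    sources = {"NC": "mouth camera", "NO": "video iPod"}
    throw = "NO" if coil_energized else "NC"
    # The screen's signal and ground both follow the same throw.
    return {"signal": sources[throw], "ground": sources[throw]}
```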
I had a bunch of trouble after I got everything on the perf board: the chip was freaking out when the coil demagnetized, or was switched too rapidly on and off. I thought perhaps there was some back voltage causing the problem, so I threw a 3.3V Zener diode before and after the coil. No dice. Next I noticed my lack of any decoupling capacitors. I threw a 10uF cap before the 5V regulator, a 1uF cap after, and some before and after the 3.3V regulator for good measure. This seems to have done the trick. I think the voltage was dropping when the relay fired.
The perf boards were a bitch to put together. In the course of prototyping on the breadboards I also inadvertently fried three, count them, three Xbees simultaneously by putting an unregulated 9 volts through their cute little blue bodies. All dogs go to heaven. I've heard the same is true for honey bees. We can only hope this extends to Xbees. I'll order $60 worth of new bee tomorrow.
April 06, 2007
The face mask has been going through different iterations lately. I've repositioned the camera on the back, added new IR LEDs and diffusers, and am now using a lithium-polymer battery instead of a DC converter for all the power. The contrast of the image is helped by the IR LEDs, which makes the viewing angle a bit better as well. I've also got my Xbee circuit set up on a perf board, as explained in the video and as seen in the pictures below.
Last but certainly not least is the Max patch that I've been working on (screen shot / .sit). Compare this new patch with what I was doing in the old version. Many thanks to Luke DuBois for the help with efficiency of parsing in Max and to Rob Faludi for help with general Xbee packet information and for letting me borrow his dongle.
Next up is getting more screen for the eyes, then duplicating everything and patching them together wirelessly. Three weeks to go!
March 26, 2007
...and seven weeks later we have a first post. I've tried my hand at meditation6. From Luke's description:
"this patch texture maps a cube drawn not as a simple primitive (e.g. using [jit.gl.gridshape]) but as a platonic solid. as a result, the texture is mapped across the entire surface of the object, not tiled on each face. The nice thing about platonic solids is that you can independently texture each face of the object by creating a jitter matrix for the texture that contains six tiled images." And our mission: "change this patch so that you composite individual images for each face of the cube (see the help patch for [jit.gl.plato] for how it works out.) you can use still images, movies, different effects on a similar movie, etc. you'll need a [jit.matrix] object at some point to do the composition, using the dstdim attributes to place images in different parts of the texture."
I think I've got a variation of a possible solution working (download the .sitx here), however I still have some questions regarding the size of the platonic solid I am drawing to. How big is it? My source movies, which are 320x240, don't look too pretty. Is this due to running five movies and a live cam all at once? Is it just interpolation, or the lack of it?
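For what it's worth, the dstdim-style compositing is easy to reason about outside of Jitter. Here's a numpy sketch of tiling six face images into one texture matrix; the 3x2 layout and ARGB planes are assumptions on my part - check [jit.gl.plato]'s help patch for the layout it actually expects:

```python
import numpy as np

def composite_faces(faces, tile_w=320, tile_h=240):
    """Tile six face images into one texture matrix, the way the
    assignment suggests doing with [jit.matrix] and dstdim.

    `faces` is a list of six (tile_h, tile_w, 4) ARGB uint8 arrays;
    the 3x2 tile layout here is an assumption.
    """
    texture = np.zeros((2 * tile_h, 3 * tile_w, 4), dtype=np.uint8)
    for i, face in enumerate(faces):
        row, col = divmod(i, 3)
        # dstdim-style placement: copy the face into its tile's rectangle
        texture[row * tile_h:(row + 1) * tile_h,
                col * tile_w:(col + 1) * tile_w] = face
    return texture
```

One thing this makes obvious: each 320x240 source gets stretched across a whole face of the solid, so any softness in the source shows up at full size, interpolation or not.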
Posted by andrew schneider at 05:43 PM
January 22, 2007
Posted by andrew schneider at 01:54 AM