If you have a web cam you can go to http://itp.nyu.edu/dano/face to hover over a web page with other people looking at that same page. Your video image follows your cursor around the page, and other people's video images come into sharper focus when they are close to yours on the page. This requires the Flash 10 plugin.
Breathing is surprisingly underrated. I made this example for students in my Rest of You class to be able to see their breathing on their cellphones and to log it over the course of a day. Along the way I lived the dream of wearing a bra on the outside of my clothes during class.
It is kind of a kick to see your heart beat. Polar is a company that makes lots of gear for athletes to watch their heart rate. The data usually goes to something like a watch, and I think it is pretty hard to get your hands on the data yourself. I made this example for students in my Rest of You class to be able to use their heart rate to control something, or just to see the same data in their own visualization.
Project Description: This was an experiment to bring the video tracking and video recognition that is very popular at ITP into the realm of very small and inexpensive microcontrollers. It is a continuation of a project that made use of a PIC microcontroller and a Quickcam. With the introduction of very cheap CMOS cameras and very fast SX microcontrollers, this tool could have greater speed and resolution. At the same time a group at CMU developed a commercial kit which uses the same components (better engineering).
Technical Notes: I wrote the software in C for an SX chip and built the circuitry to connect it to a CMOS camera from Omnivision.
Project Description: Merpy.com is a children’s website which features the animated, musical, and interactive stories of several characters by Marianne Petit. This project set out to build life-size puppets of the web characters. The user would face the puppet theater and advance the story by manipulating a traditional pop-up book.
Technical Notes: I worked with Marianne to create the life-size motorized puppets. I built a microcontroller circuit to control them and wrote an authoring tool in Macromedia Director/Lingo for creating sequences of animations for the puppets.
Project Description: I programmed an application for testing response latencies as a measure of accuracy. I worked with Dr. Brian Corby of Fordham University.
Technical Notes: The application was written in Director with Xtras added to achieve millisecond timing and to port logged data into standard statistical analysis software.
3D sensors to measure and display an individual’s ability to maintain balance, at the Hippocrates Project, NYU Medical School.
An application for analysis of home finances for the Wall Street Journal, Interactive Edge.
This was a collaboration between ECHO (the virtual salon/online service) and the SCI-FI Channel. During the weekly airing of the sci-fi cult classic series “The Prisoner,” an ECHO chat session scrolled in a small window at the bottom of the screen. People from around the country telneted in to become part of a national virtual peanut gallery.
This was a collaboration with Jamie Biggar and Sharleen Smith at the SCI-FI Channel and USA Networks Online, and Stacey Horn at Echo Communications.
Technical Notes: This was written in Macromedia Director, using a Serial Xtra to connect directly to a modem and emulate Telnet.
An interactive fiction where the dramatic arc was customized to uniquely frustrate the user based upon their previous actions; done at Apple Computer, Human Interface Group.
Project Description: This is an experiment with interactive narrative. Many conceptions of interactive narrative ask the audience to direct the course of the story or change its elements. These approaches run the risk of breaking the audience’s spell as the action pauses, turning off writers who don’t want to learn how to make decision trees, and bankrupting producers who have to shoot many alternate versions of each scene.
Using panoramic narratives allows the author to construct a single linear flow while letting each viewer assemble a different story depending on where they focus. Writing for this would be more like writing for the theater, but possibly with a greater feeling of immersion because the fourth wall stays intact.
Technical Notes: This project was shot with an array of digital cameras. Synchronous frames from each camera were then stitched together. I did the automation of this process in AppleScript and the stitching in QuickTime VR Authoring Studio. The playback was programmed in Macromedia Director/Lingo and was originally an installation using a panning television set constructed for my earlier LampPost piece.
Project Description: I worked with a team to build a sculpture that would also serve as an evolving hypertext document about Italian Futurist theory. The sculpture was built from wooden balls for nodes and metal rods for connections. Each node had a bar code which, when scanned, would trigger the appropriate text on a computer screen. In addition there was a bin of nodes and rods for a user to add to the sculpture. They could write a comment, print a bar code and add it to the sculpture.
I collaborated with Kate Swann (future wife) and Steve Marino.
Technical Notes: I wrote this in HyperCard with a serial XCMD to connect to the bar code scanner.
Project Description: I made a robot that was controlled by the direction in which a person would lean. A camera was attached to the robot and the video was fed back into a monitor in front of the user. A video camera was mounted on the ceiling above the user’s head. If the user leaned forward, the robot would move forward. By leaning to the right, the robot would move to the right.
Technical Notes: The video tracking was accomplished using the Mandala system on the Amiga. The robot control was accomplished using AmigaVision and a custom box full of relays. The robot was constructed from a radio-controlled car.