Category Archives: Panoramic

Panoramic Narrative

Camera Array

SEE VIDEO

Project Description: This is an experiment with interactive narrative. Many conceptions of interactive narrative ask the audience to direct the course of the story or change its elements. These run the risk of breaking the audience’s spell as the action pauses, turning off writers who don’t want to learn how to make decision trees, and bankrupting producers who have to shoot many alternate versions of each scene.

A panoramic narrative lets the author construct a single linear flow while each viewer assembles a different story depending on where they focus. Writing for this would be more like writing for the theater, but possibly with a greater feeling of immersion because the fourth wall stays intact.

Technical Notes: This project was shot with an array of digital cameras. Synchronous frames from each camera were then stitched together. I automated this process in AppleScript and did the stitching in QuickTimeVR Authoring Studio. The playback was programmed in Macromedia Director/Lingo and was originally an installation using a panning television set constructed for my earlier Lamppost piece.
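The playback idea can be sketched roughly in modern terms: each stitched frame is a full 360-degree strip, and the viewer’s pan angle selects which window of it is shown. This is only an illustrative reconstruction in Python (the original was Director/Lingo); the frame format and field of view here are assumptions.

```python
# Sketch: selecting a viewer-controlled window from stitched panoramic frames.
# Assumes each stitched frame is a 360-degree strip stored as a 2D pixel array;
# the original playback was written in Macromedia Director/Lingo.
import numpy as np

def viewport(pano_frame: np.ndarray, pan_degrees: float, fov_degrees: float = 60.0) -> np.ndarray:
    """Return the slice of a 360-degree frame centered on the viewer's pan angle."""
    height, width = pano_frame.shape[:2]
    view_width = int(width * fov_degrees / 360.0)
    center = int((pan_degrees % 360.0) / 360.0 * width)
    # Wrap around the seam of the panorama using modular column indices.
    cols = (np.arange(view_width) + center - view_width // 2) % width
    return pano_frame[:, cols]

# Playback loop (conceptually): for each synchronized frame, show viewport(frame, current_pan).
```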

 

Space Shots

 

SEE VIDEO

 

Space Shots is an application for navigating virtual tours of real estate. It featured a novel interface that synchronized a floorplan view with a first-person perspective. Selecting and rotating nodes in the floorplan view changed the view in the QuickTimeVR file; conversely, navigation in the QuickTimeVR was reflected in the floorplan view. I also developed an authoring environment for the quick creation of these CD-ROMs. This was a collaboration with Scott Whitney at Today’s Feature Production Company.
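The coupling between the two views boils down to a single shared state (current node and heading) that both views read and write. The following is a minimal sketch of that idea; the view objects and their methods are hypothetical stand-ins for the original Director and QuickTimeVR calls.

```python
# Sketch: one shared state drives both the floorplan view and the QTVR view.
# The update callbacks are hypothetical stand-ins for the original Director/QTVR calls.
class TourState:
    def __init__(self, floorplan_view, qtvr_view):
        self.node = None          # current room/node id
        self.heading = 0.0        # viewing direction in degrees
        self.floorplan_view = floorplan_view
        self.qtvr_view = qtvr_view

    def select_node(self, node, heading=0.0):
        """Called when the user clicks or rotates a node in the floorplan."""
        self.node, self.heading = node, heading
        self.qtvr_view.show(node, heading)                     # panorama follows the floorplan

    def pan_panorama(self, heading):
        """Called when the user drags inside the QTVR view."""
        self.heading = heading
        self.floorplan_view.point_marker(self.node, heading)   # floorplan follows the panorama
```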


Lamppost

SEE VIDEO

I constructed a camera rig for panoramic time-lapse photography to capture a place over a period of time. I stood on the corner of St. Marks Place and Bowery for 24 hours while the camera rig continually grabbed QuickTimeVR scenes. I then built a television set that could do two things: 1. adjust the pan of the QTVR scene by rotating the set on its base; and 2. adjust the time of day through the channel selector. The user therefore controlled the view in both time and space. This piece was included in the “Elsewhere” exhibit at The Threadwaxing Gallery, 1997.
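The interaction amounts to mapping the set’s two physical controls onto the grid of captured scenes. Below is a small illustrative sketch, assuming one scene per hour (the actual capture interval may have differed) and hypothetical sensor readings for the base rotation and channel dial.

```python
# Sketch: mapping the television set's two physical controls onto the grid of
# captured QTVR scenes. The sensor-reading side is hypothetical.
SCENES_PER_DAY = 24   # assumption: one panorama per hour of the 24-hour capture

def select_view(rotation_degrees: float, channel: int):
    """Rotating the set picks the pan angle; the channel dial picks the hour."""
    pan = rotation_degrees % 360.0
    hour = channel % SCENES_PER_DAY
    return hour, pan   # -> load scene `hour`, display it panned to `pan`
```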

Macromedia Showcase: Designers Studios QTVR

SEE VIDEO: RODNEY, OTHER

A tour of prominent design houses and studios across the country. It used a custom QuickTimeVR display that was included on the Macromedia Showcase CD-ROM. Among the featured studios, Rodney Greenblat’s was the densest multi-nodal QTVR done up to that point. I worked with Cathy Clarke and Lee Swearingen at Macromedia Creative Services for the Macromedia Showcase.

Technical Notes: The photographs were shot on film and digitized onto Kodak’s Photo-CD. They were stitched using MPW and displayed using Macromedia Director/Lingo.

Aeron Chair

SEE VIDEO

This is a machine for taking a high-resolution interactive photograph of the soon-to-be-released Aeron Chair by Herman Miller. I built it for Clement Mok Design and collaborated with a mechanic, a photographer, and a graphic designer to take the thousands of still photographs necessary for creating an object movie. The software for displaying the resulting QTVR movie was written in Director. The display software popped up details of the chair’s features as the user panned and tilted the chair to examine it; conversely, selecting a listed feature spun the chair to display it. This was the most elaborate and highest-resolution object movie of its time. I spoke about it at Viscomm, and it was widely written about and received an ID Magazine award for best presentation design.

Technical Notes: The capture rig used a Director interface, which controlled two large motors through Alpha Products cards and fired the shutter of what was then a very unusual and sophisticated digital camera from Nikon.
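A rough sketch of what such a capture sequence looks like is below. The motor and shutter calls are hypothetical stand-ins for the Alpha Products cards and the Nikon camera control, and the grid resolution is assumed rather than taken from the original rig.

```python
# Sketch: stepping the rig through a pan/tilt grid and firing the shutter at each cell.
# move_pan, move_tilt, and fire_shutter are hypothetical stand-ins for the
# Alpha Products motor cards and the Nikon camera driver used in the original rig.
import time

PAN_STEPS, TILT_STEPS = 36, 10    # assumed grid: 10-degree pan steps, 10 tilt rows

def capture_object(move_pan, move_tilt, fire_shutter, settle_seconds=0.5):
    frames = []
    for t in range(TILT_STEPS):
        move_tilt(t * (90.0 / TILT_STEPS))           # raise the camera one row
        for p in range(PAN_STEPS):
            move_pan(p * (360.0 / PAN_STEPS))         # rotate one pan step
            time.sleep(settle_seconds)                # let vibration die down
            frames.append(((t, p), fire_shutter()))   # store frame keyed by grid cell
    return frames
```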

Object Maker


At Apple I also experimented with object movies. Capturing an object rather than a scene in an interactive photograph requires a more elaborate mechanism for moving the camera. I built a couple of rigs on my desk, but eventually I was put in touch with John Borden of Peace River Films, who had already designed such a mechanism for photographing Mayan artifacts in connection with his important work on the Palenque project with Bank Street College. Joy Mountford and Mike Mills, my bosses at Apple, agreed to commission John Borden to build a large-scale object maker. I participated in the early design of this machine and wrote the software for displaying the resulting movie, but the bulk of the design and all of the construction were done through Peace River Films. The schematic above was created by John Borden.

A person could sit in this rig and a few moments later the machine would create a virtual picture of their entire head. Essentially, this is still the same method used for QTVR objects. It was made into a booth and featured in an exhibit and paper at SIGCHI ’92. Several companies, including John Borden’s Peace River Studios and Kaidan, turned this concept into a product. I worked with John Borden of Peace River Films, Mike Mills, Ian Small, Michael Chen, Louis Knapp and Eric Hulteen.

Technical Notes: The control and display software were both written in HyperCard. John Borden (now president of Peace River Studios) designed and constructed the rig.

Apple Computer, Human Interface Group 1992
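For illustration, the display side of an object movie reduces to quantizing the user’s pan and tilt to the nearest captured frame. This sketch is not the original HyperCard code, and the grid dimensions are assumed.

```python
# Sketch: displaying an object movie by mapping the user's drag (pan/tilt)
# back to the nearest captured frame. Grid dimensions match the assumed capture grid.
PAN_STEPS, TILT_STEPS = 36, 10

def frame_for_view(pan_degrees: float, tilt_degrees: float) -> int:
    """Quantize a requested viewing angle to the closest captured frame index."""
    p = int(round((pan_degrees % 360.0) / (360.0 / PAN_STEPS))) % PAN_STEPS
    t = max(0, min(TILT_STEPS - 1, int(round(tilt_degrees / (90.0 / TILT_STEPS)))))
    return t * PAN_STEPS + p   # row-major index into the strip of captured frames
```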

Paris QTVR

SEE VIDEO

The first series of panoramic images utilizing Eric Chen’s new QTVR stitching technique. All twenty scenes were also captured with sound. I collaborated with Lili Cheng and Michael Chen at Apple Computer Advanced Graphics Group in developing software and interfaces for controlling these pictures and synchronizing them with audio.

Technical Notes: The images were shot on film and then digitized onto Kodak Photo-CD. The panoramas were stitched using MPW. The display software was written in HyperCard.


Dan’s Apartment

SEE VIDEO

An interactive television show that allowed viewers to virtually navigate through my apartment by speaking commands into their phone. This began as an experiment looking for better interface metaphors than the desktop. By putting it on television, it took on a strange voyeuristic quality. It developed a large cable following and provided me with my 15 minutes of worldwide attention. I did this work as a student at ITP, NYU.

Technical Notes: I shot all pathways through my apartment onto video and pressed them to a laserdisc. I built a box using a Voice Recognition chip from Radio Shack and connected it via the parallel port of an Amiga to AmigaVision software that controlled the graphics and the laserdisc.
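The navigation logic amounts to a small graph: each location lists, for each recognized command, which laserdisc segment to play and where the viewer ends up. The sketch below is a hedged reconstruction of that structure; the room names, segment frame numbers, and play_segment call are all hypothetical.

```python
# Sketch: navigating pre-recorded pathway clips by spoken command.
# Each node maps a recognized command word to (destination node, laserdisc segment).
# All segment numbers and the play_segment callable are hypothetical.
APARTMENT = {
    "hallway": {"forward": ("kitchen",     (100, 220)),
                "left":    ("living_room", (230, 360))},
    "kitchen": {"back":    ("hallway",     (370, 480))},
}

def handle_command(node: str, word: str, play_segment) -> str:
    """Play the clip for a recognized word and return the new location."""
    if word not in APARTMENT.get(node, {}):
        return node                   # unrecognized command: stay put
    next_node, (start, end) = APARTMENT[node][word]
    play_segment(start, end)          # laserdisc player seeks and plays frames start..end
    return next_node
```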

Golden Gate Navigable Scene

As an intern for Mike Mills, the creator of QuickTime, I wanted to use QuickTime to depict space as well as motion. We went to the top of one span of the Golden Gate Bridge and shot an interactive panoramic photograph. I demonstrated this picture during John Sculley’s keynote address at MacWorld, and it went on to be very widely demonstrated at Apple and beyond. This was the beginning of a project called Navigable Movies that eventually became the product QTVR.

This was a collaboration with Michael Chen, Mike Mills, Jonathan Cohen and Ian Small at Apple’s Human Interface Group under the direction of Joy Mountford.

Technical Notes: I built a rig using HyperCard and some relays to control a camera and take pictures.

Pavlovsk Palace

I traveled to Russia with a team from Apple Computer and The National Art Gallery to systematically record all the rooms of a famous summer palace outside St. Petersburg. I customized the rig that I had made at Apple Computer for panoramic photography. The scenes were compiled and published on the QuickTime 1.6 CD as the first ever navigable scenes, soon to be called QTVR.

I worked with Mike Mills, Mitch Yawitz, and Eric Chen of Apple, and with Susan Massie and Kim Nielsen of the National Gallery.

Technical Notes: This used a microcontroller from Radiant Systems, a pan-and-tilt head from Pelco, and HyperCard software for control and playback. This configuration was documented, built into a toolkit by Apple, and sold by Radiant.