Category Archives: Television

First Web Browsings Turn Up YORB

This historical document shows that our YORB figured prominently in the very earliest Web browsing:
Screen shot of Tim Berners-Lee’s browser-editor as developed in 1991-92. This was a true browser-editor for the first version of HTML and ran on a NeXT workstation. Implemented in Objective-C, it made it easy to create, view and edit web documents. Adding a new hypertext link was a breeze! At the bottom center of this screen shot, see the “Welcome to YORB” entry and the write-up about the program.

Xena Footnotes

This is a chat environment that allows fans to congregate around their favorite part of the Xena television show. This interface differs from the movie SpaceTime Chat because the clip is broken down into segments, which are depicted by the bars of a bar chart. The height of each bar maps to the number of comments for that segment. Comments are distributed synchronously to other people logged on concurrently and also stored for future users. The software works for all types of streaming media. Special moderators can log on and edit the chat, and chat is automatically censored for profanity. There is also a sniffer that detects the appropriate version of the clip to play for the user’s connection speed. I worked with Sharleen Smith and Yaron Ben-Zvi at Oxygen.
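
The bar chart reduces to a simple mapping: bucket each comment by its timecode into a segment, then scale each bar to that segment’s comment count. A minimal Python sketch of that logic (the original was not built this way; segment length, bar height, and names are invented for illustration):

```python
# Hypothetical sketch: bucket comments by clip segment and size the bars.
from collections import Counter

SEGMENT_LENGTH = 10.0   # seconds per segment (assumed)
MAX_BAR_HEIGHT = 100    # pixels for the tallest bar (assumed)

def segment_of(timecode: float) -> int:
    """Map a comment's timecode (seconds) to its segment index."""
    return int(timecode // SEGMENT_LENGTH)

def bar_heights(comment_timecodes: list[float]) -> dict[int, int]:
    """Scale each segment's bar to its share of the busiest segment."""
    counts = Counter(segment_of(t) for t in comment_timecodes)
    busiest = max(counts.values())
    return {seg: round(MAX_BAR_HEIGHT * n / busiest)
            for seg, n in sorted(counts.items())}

print(bar_heights([3.2, 7.9, 12.4, 14.0, 15.5, 31.8]))
# {0: 67, 1: 100, 3: 33}
```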


Internet Television Station

A series of experiments in combining television and Internet technologies.

1) Roaming Close Up

This is a client for viewing streaming video. First, a high-resolution still image of the studio background is displayed. A small rectangle of full-motion streaming video, a zoomed-in shot from the studio camera, is then superimposed on the wide shot. The rectangle of video moves around on the client’s machine depending on where the camera in the studio is aimed. Another version had the video leave a trail so that the background would always be updated by the video passing over it.
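
Placing the moving rectangle follows from simple proportions between the close-up camera’s pan/tilt and the wide still’s field of view. A rough Python sketch of that mapping, with all dimensions and angles assumed for illustration:

```python
# Hypothetical sketch: place the close-up video rectangle over the wide still.
# Assumes the wide still spans a known field of view and the studio reports
# the close-up camera's pan/tilt relative to the center of the wide shot.
STILL_W, STILL_H = 1600, 1200   # wide still, pixels (assumed)
FOV_H, FOV_V = 60.0, 45.0       # wide shot field of view, degrees (assumed)
RECT_W, RECT_H = 160, 120       # streaming video rectangle, pixels (assumed)

def rect_position(pan_deg: float, tilt_deg: float) -> tuple[int, int]:
    """Top-left corner of the video rectangle on the still, given camera aim."""
    cx = STILL_W / 2 + (pan_deg / FOV_H) * STILL_W
    cy = STILL_H / 2 - (tilt_deg / FOV_V) * STILL_H   # screen y grows downward
    return int(cx - RECT_W / 2), int(cy - RECT_H / 2)

print(rect_position(0.0, 0.0))    # (720, 540): rectangle centered on the still
print(rect_position(15.0, -5.0))  # camera panned right and tilted down
```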

2) Video Cursor

This is a software system for broadcasting video commentary about a very precise part of a web page. The software allows one person to stream an image of themselves talking in a small window that floats above the web pages of a large audience. The speaker can move this small video window to a particular area of the web page and have it move over that same part of the page for everyone in the audience. It is a little bit like a video cursor.
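
At its core this is a fan-out of normalized window coordinates from one speaker to many viewers. A toy Python sketch of that relay; the callbacks stand in for real network connections, and the message format is invented:

```python
# Hypothetical sketch: relay the speaker's floating-window position to viewers.
# Coordinates are fractions of page width/height so the window lands on the
# same part of the page regardless of each viewer's window size (assumption).
import json

class CursorRelay:
    def __init__(self):
        self.viewers = []   # callbacks standing in for open connections

    def subscribe(self, send):
        self.viewers.append(send)

    def speaker_moved(self, x_frac: float, y_frac: float):
        """Fan the speaker's normalized window position out to every viewer."""
        msg = json.dumps({"type": "move", "x": x_frac, "y": y_frac})
        for send in self.viewers:
            send(msg)

relay = CursorRelay()
relay.subscribe(lambda m: print("viewer A got", m))
relay.subscribe(lambda m: print("viewer B got", m))
relay.speaker_moved(0.25, 0.60)   # window over the upper-left area of the page
```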

3) Real Person, Virtual Background

This software connects the actual camera in the studio to the virtual camera in the 3D environment. This allows chroma-keyed backgrounds to move realistically as the camera pans in front of them. As the camera pans right to follow the character, the background pans left.
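
The camera link amounts to one small rule: the background slides opposite to the pan, scaled to match the key. A tiny Python sketch with an assumed scale factor:

```python
# Hypothetical sketch: slave the rendered background to the studio camera so
# the chroma-keyed scene counter-pans. The scale factor is an assumption.
PIXELS_PER_DEGREE = 20.0   # how far the background slides per degree of pan

def background_offset(camera_pan_deg: float) -> float:
    """Camera pans right -> background slides left (negative offset)."""
    return -camera_pan_deg * PIXELS_PER_DEGREE

for pan in (0.0, 5.0, 12.5):
    print(f"pan {pan:>5} deg -> background offset {background_offset(pan)} px")
```
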
4) Real Head, Virtual Body

This software streams video of a person in the studio into a virtual world displayed on the audience’s machines. The video is typically texture-mapped onto the head of a character in the virtual world. Movements of the person’s head in the studio are tracked using a bicycle helmet fitted with a camera and an orientation sensor, and are passed over the network to control the movements of the character in the virtual world. A 3D chat interface was also part of this project.
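
Conceptually, the helmet’s orientation readings are forwarded over the network and applied to the avatar’s head, ideally with some smoothing against jitter. A small Python sketch of the receiving end; field names, units, and the smoothing constant are assumptions:

```python
# Hypothetical sketch: apply helmet orientation readings to the avatar's head.
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float
    pitch: float
    roll: float

class Avatar:
    def __init__(self):
        self.head = HeadPose(0.0, 0.0, 0.0)

    def apply(self, pose: HeadPose, smoothing: float = 0.3):
        """Low-pass the incoming pose so network jitter doesn't twitch the head."""
        self.head.yaw += smoothing * (pose.yaw - self.head.yaw)
        self.head.pitch += smoothing * (pose.pitch - self.head.pitch)
        self.head.roll += smoothing * (pose.roll - self.head.roll)

avatar = Avatar()
for reading in [HeadPose(10, 0, 0), HeadPose(12, -2, 0), HeadPose(11, -3, 1)]:
    avatar.apply(reading)   # would arrive over the network in practice
print(avatar.head)
```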

Technical Notes: These were done using Shockwave and JavaScript on the client side, Director applications for studio control, Java for relaying commands, and either QuickTime or Real for streaming video.


Puppet Long Underwear

This is a glove that senses hand movement and controls a puppet over a live Internet broadcast. A microphone input is also used to control the puppet. The puppet control information is delayed and synchronized with the live video stream.
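
The synchronization trick is to hold the glove and microphone readings in a buffer and release each one only after the video’s transit delay has elapsed. A small Python sketch of such a delay buffer; the lag value and record format are assumptions:

```python
# Hypothetical sketch: delay puppet-control packets so they line up with the
# live video stream, which reaches the viewer several seconds late.
from collections import deque

STREAM_LAG = 4.0   # seconds the live video trails the sensors (assumed)

class SyncBuffer:
    def __init__(self, lag: float = STREAM_LAG):
        self.lag = lag
        self.queue = deque()   # (timestamp, glove_reading) pairs

    def push(self, timestamp: float, reading: dict):
        self.queue.append((timestamp, reading))

    def pop_due(self, now: float) -> list[dict]:
        """Release only readings old enough to match the delayed video."""
        due = []
        while self.queue and now - self.queue[0][0] >= self.lag:
            due.append(self.queue.popleft()[1])
        return due

buf = SyncBuffer()
buf.push(0.0, {"flex": 0.8, "mic": 0.1})
buf.push(1.0, {"flex": 0.2, "mic": 0.7})
print(buf.pop_due(now=2.0))   # [] -- too early, video hasn't caught up yet
print(buf.pop_due(now=4.5))   # first reading is now in sync with the video
```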

This was inspired by the work of Sarah Teitler & Jade Jossen.

Technical Notes: I used flex sensors connected to a Basic stamp to sense the hand movements and Geoff Smith’s Director Xtra for sensing the microphone volume. I programmed both the broadcasting module and the client module in Macromedia Director/Lingo outputting to Shockwave. I used the Multi-User Server for synchronous networking and the Sorenson Broadcaster for the live streaming video.

BubbleVision

Description: This is a system for television producers to synchronize events on Web pages with events on television. Tests of the system were very successful, and it is being developed to accompany Double Dare, a game show on Nickelodeon.
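
One way to picture such a system is a cue sheet keyed to the show’s broadcast clock, with each due cue pushed to every connected viewer. A toy Python sketch; cue names and times are invented, and the original Shockwave implementation surely differed:

```python
# Hypothetical sketch: fire Web-page events on a schedule keyed to the show's
# broadcast clock, so the page and the television stay in step.
cue_sheet = [
    (12.0, "show_bubble"),      # seconds into the show -> page event (assumed)
    (47.5, "pop_bubble"),
    (90.0, "show_scoreboard"),
]

def due_cues(show_clock: float, already_sent: set[str]) -> list[str]:
    """Return every cue whose broadcast time has passed and isn't sent yet."""
    return [name for t, name in cue_sheet
            if t <= show_clock and name not in already_sent]

sent: set[str] = set()
for clock in (10.0, 50.0, 95.0):   # the producer's clock ticks during air
    for cue in due_cues(clock, sent):
        sent.add(cue)
        print(f"t={clock}: push '{cue}' to all connected viewers")
```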

This was a collaboration with Shannon Rednour, Ray Cha, Evan Baily and Jason Root at Nickelodeon, Viacom Interactive Services.

Technical Notes: I wrote the software in Macromedia Director/Lingo, outputting to Shockwave. I used the Multi-User Server for the synchronous networking.

Trivia/Polling Web Game Engine


Project Description: This system allows television producers to easily create games for the Web that complement their on-air programming. The system comprises several modules. The first is a Web interface for writers to create questions for the Web audience. The second is a Web interface for producers to arrange questions into a timeline for a show and to schedule the question playback on the Web. There is also a centralized SQL database for storing all the questions and shows. The third module is a Web interface for the producer to play back the show on the Web and keep score. A bot in the playback engine can automatically choose the best question within a range set by the producer, so the system can run shows continuously. Finally, the last module is the end-user interface, which displays the questions, the scores, and the chat of other contestants, and relays the users’ answers to the playback engine.
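
The question-picking bot can be imagined as a filter plus a ranking rule. A minimal Python sketch; the scoring heuristic and record fields here are invented for illustration, not the system’s actual logic:

```python
# Hypothetical sketch: pick the "best" unused question inside the producer's
# allowed difficulty range.
questions = [
    {"id": 1, "difficulty": 2, "times_used": 5, "text": "..."},
    {"id": 2, "difficulty": 3, "times_used": 0, "text": "..."},
    {"id": 3, "difficulty": 4, "times_used": 1, "text": "..."},
]

def pick_question(pool, min_diff, max_diff):
    """Prefer the least-used question in range; break ties on difficulty."""
    in_range = [q for q in pool if min_diff <= q["difficulty"] <= max_diff]
    if not in_range:
        return None
    return min(in_range, key=lambda q: (q["times_used"], q["difficulty"]))

print(pick_question(questions, 2, 4)["id"])   # 2: in range and never used
```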

I worked with Ethan Adelman at Oxygen Media.

Technical Notes: I wrote all the interfaces in Macromedia Director/Lingo, outputting to Shockwave; the middleware in Perl; and the database in MS-SQL.

Dunk Yaron Catapult

This project allows Web users to control elements in a live television broadcast. In this case, the Web audience can activate a catapult that hurls actual objects onto the television studio stage.

On the home television we see a young man sitting in a dunk tank reading poetry. When a Web user successfully aims the catapult at the dunk lever, he is dunked. This is a segment in a late-night circus called UTV.
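
The Web-to-catapult path boils down to relaying a viewer’s click to the microcontroller while enforcing a reload cooldown. A toy Python sketch with the serial port stubbed out; the command byte and cooldown are assumptions:

```python
# Hypothetical sketch: relay a Web viewer's "fire" click to the studio catapult.
# A real build would write a byte to the microcontroller over a serial port;
# here the port is stubbed so the sketch runs anywhere.
FIRE_COMMAND = b"F"   # assumed one-byte protocol
COOLDOWN = 10.0       # assumed seconds between launches so the stage can reset

class StubPort:
    def write(self, data: bytes):
        print("-> microcontroller:", data)

class CatapultRelay:
    def __init__(self, port):
        self.port = port
        self.last_fire = -COOLDOWN

    def viewer_clicked(self, now: float) -> bool:
        """Fire only if the catapult has had time to be reloaded."""
        if now - self.last_fire < COOLDOWN:
            return False
        self.port.write(FIRE_COMMAND)
        self.last_fire = now
        return True

relay = CatapultRelay(StubPort())
print(relay.viewer_clicked(now=0.0))   # True: fires
print(relay.viewer_clicked(now=3.0))   # False: still reloading
```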

This was a collaboration with Sharleen Smith, Yaron Ben-Zvi, Jed Ela, Ruel Espejo and Wayne Chang of the Oxygen Media Convergence Lab.

Technical Notes: I wrote both the Web interface and the producers’ interface for picking contestants in Macromedia Director/Lingo, outputting to Shockwave. I used the Multi-User Server for the networking. I built the electronics and programmed, in BASIC, the microcontroller that senses the motors in the catapult.

Big Mosh

SEE VIDEO

This is a prototype for an interactive television show where the interaction is on the scale of a building or an entire neighborhood. For instance, the Upper West Side could play the Upper East Side in a game of volleyball: a camera placed on Central Park South is aimed northward to include a view of Central Park West on the left of the screen and Fifth Avenue on the right. People living within view of the camera can turn their lights on and off to register with the software. The image from the camera is overlaid with a virtual object and shown live on television. People in their homes can push the object back and forth across the park simply by turning their light switches on and off.
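
The software’s job reduces to reading brightness in each registered window region and converting the tally of lit windows into a push on the virtual object. A toy Python sketch, with invented thresholds and frame data:

```python
# Hypothetical sketch: push the virtual ball toward whichever side of the park
# has more lights switched on, judged by per-window region brightness.
THRESHOLD = 0.5   # region brightness above this counts as "light on" (assumed)

def push_force(west_regions: list[float], east_regions: list[float]) -> int:
    """Positive pushes the ball east, negative pushes it west."""
    west_on = sum(b > THRESHOLD for b in west_regions)
    east_on = sum(b > THRESHOLD for b in east_regions)
    return west_on - east_on

# Three Upper West Side windows lit vs. one on the Upper East Side:
print(push_force([0.9, 0.8, 0.7, 0.1], [0.6, 0.2, 0.3, 0.1]))   # 2: ball moves east
```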

Technical Notes: For the installation I used a poster of some buildings and a wall full of light switches. I wrote this software using Macromedia Director/Lingo and Danny Rozin’s Track Them Colors Xtra.

Mind Probe On-Air

SEE VIDEO

Project Description: I served as the chief technical architect in the design and implementation of a simultaneous live television and Web broadcast of a game-show-format program using telephony as a controller. I developed a real-time synthetic virtual studio that was controlled by various motion sensors attached to actors. I participated in the concept development and collaborated with 3D designers, actors, writers, and television producers to create this prototype.

This was a collaboration with Der Hong Yang, Matt Ledderman, Sharleen Smith, Jamie Biggar, Leo Villareal and Tracy White.

Technical Notes: The main game control was written in Macromedia Director. The face tracking was done using Danny Rozin’s Xtra for Macromedia Director and custom headgear. The DTMF decoding and call progress were handled by PCI cards. The virtual world was created and rendered in Cosmo as VRML. The control of the VRML was done through a Java applet. All networking used TCP/IP socket connections.

Red Booth


Project Description: We constructed a booth that was transported to high schools all over the country. In the booth, students viewed rehearsals of a television episode. The video paused occasionally to ask the students questions and to record video of their replies. Eventually the replies were incorporated into a fully produced show that aired on NBC.

I collaborated with Karen Cooper and Tracy Johnson.

Technical Notes: This was written in Macromedia Director/Lingo, connected to a video capture card and controlling a Pioneer laserdisc player through a serial Xtra, as well as an 8mm video camera through a Control-L Xtra.

Dan’s View

I made a panning, time-lapse photograph out of a window over a period of 24 hours. The television audience could adjust, with their touch-tone telephones, the angle of pan or the time displayed. People at home could then replace any particular time or pan angle by sending in a computer graphic over a modem.
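
The archive can be thought of as a two-dimensional grid of frames indexed by pan angle and time of day, with touch-tones stepping along either axis. A small Python sketch; grid dimensions and the keypad mapping are assumptions:

```python
# Hypothetical sketch: index the 24-hour pan/time-lapse as a 2-D frame grid
# and let touch-tones step along either axis.
N_PAN = 36    # pan positions captured per sweep (assumed)
N_TIME = 96   # one frame every 15 minutes over 24 hours (assumed)

def frame_index(pan_step: int, time_step: int) -> int:
    """Frames stored pan-major: all pans for one time, then the next time."""
    return (time_step % N_TIME) * N_PAN + (pan_step % N_PAN)

# Keypad 4/6 nudges pan, 2/8 nudges time:
pan, t = 0, 0
for key in "6668":
    if key == "6":
        pan += 1
    elif key == "4":
        pan -= 1
    elif key == "8":
        t += 1
    elif key == "2":
        t -= 1
print(frame_index(pan, t))   # 39: third pan position, second time slice
```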

Technical Notes: The television interface was programmed using CanDo software on an Amiga, attached to an Alpha Products box that decoded the touch-tones. An A-Talk 3 script accepted the transmitted graphics files.

Yorb

SEE VIDEO

Project Description: This was an investigation of interactive television in which the viewers create the programming content instead of choosing among more professionally produced alternatives. Urban design, architecture and interior design became interface tools as Manhattan cable viewers were invited to navigate around a virtual world using the buttons on their telephones. Within the world, viewers encountered pictures, sounds and video that had been sent in by other viewers using modem, fax, telephone and Ethernet. The messages seen on TV were also available for distribution over those networks. I conceived, designed and programmed this automated television program. The work was widely written up and presented at numerous conferences, such as Imagina in Monte Carlo and the New York Interactive Association. It aired three nights a week on Manhattan Cable Television and was sponsored by NYNEX. It became a showpiece for the department.

I worked with Red Burns, Lili Cheng, Nai Wai Hsu, Eric Fixler and about a million others.

Technical Notes: VPL’s virtual reality software on an SGI machine was used to render the imagery in real time. The SGI machine was located at the Medical School, and the video output had to travel across Manhattan using several different technologies, with an Ethernet connection going back to control the SGI. The system also made use of a Video Toaster on an Amiga for mixing video, and various boxes for telephone voice and touch-tone input. A Macintosh running HyperCard was the main program for serving up media.

Window Shows

SEE INSTRUCTIONS, RUBBERHEAD OR HEALING HAND

These were a series of shows that aired on Manhattan Cable Television, using touch-tones to direct the video. The first, The Healing Hand, asked people to hold hands via the television. People could call in and have their voices passed on air to give testimony. The second, Marianne Rubberhead, was another healing experiment, but viewers also had an opportunity to rotate the orb and find out more about the oracle. Both of these were done in collaboration with Marianne Petit. The Interactive Television Tutorial walked viewers through the steps necessary to turn off their television sets. Viewers could use their touch-tones to pace the lessons.

Technical Notes: The programming was done in CanDo on an Amiga. A Video Toaster was used for the video switching and effects. The 3D graphics were pre-rendered in Swivel 3D. The video was pressed onto a laserdisc and controlled by the Amiga. A box that I built decoded the touch-tones.

Being There with the Melons

See Video

This was a drama of an American couple metamorphosed into melons by their television set. Just as they become comfortable in their new form, a coffee maker left on threatens their existence. The viewer could look about the room using their touch-tone phone as the drama unfolded. It was a linear narrative with an enhanced viewpoint.

Technical Notes: This was HyperCard overlaid on a videodisc. A Black Box DTMF-to-ASCII converter decoded the touch-tones. I used the camera rig that I put together for Apple to capture the scene.

Mayor For A Minute

 SEE VIDEO

This was a pie chart where each viewer could shift the consensus on how conflicting priorities should be resolved. A computer-graphic pie chart filled the screen. Each slice represented a portion of the city. A viewer could pick a slice and reallocate resources from one slice to another. While the viewer worked on a slice, video bites advocating spending in that area showed through the slice. The idea was to have this video sent in by community groups on a daily basis, forming a computer-based television network. After a while, a video face would come out of the pie and editorialize on the reallocation. NYU and Apple Computer cosponsored the project. A prototype was demonstrated at SIGCHI ’92.
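
The reallocation itself is simple arithmetic: move an amount between slices, then recompute each wedge’s angle from its share of the total. A small Python sketch with invented categories and amounts:

```python
# Hypothetical sketch: shift budget between pie slices and recompute wedges.
budget = {"schools": 30.0, "police": 40.0, "parks": 10.0, "transit": 20.0}

def reallocate(frm: str, to: str, amount: float):
    """Shift spending between slices without letting a slice go negative."""
    amount = min(amount, budget[frm])
    budget[frm] -= amount
    budget[to] += amount

def wedge_degrees() -> dict[str, float]:
    total = sum(budget.values())
    return {k: 360.0 * v / total for k, v in budget.items()}

reallocate("police", "schools", 15.0)
print(wedge_degrees())
# schools widens to 162 degrees, police shrinks to 90
```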

Technical Notes: This used a videodisc, and later QuickTime, controlled by HyperCard. The touch-tones were decoded by a Black Box DTMF-to-ASCII converter.

Dan’s Apartment

SEE VIDEO

An interactive television show that allowed viewers to virtually navigate through my apartment by speaking commands into their phone. This began as an experiment looking for better interface metaphors than the desktop. By putting it on television, it took on a strange voyeuristic quality. It developed a large cable following and provided me with my 15 minutes of worldwide attention. I did this work as a student at ITP, NYU.
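
The navigation can be modeled as a graph whose edges are video segments on the laserdisc, with a recognized spoken command selecting which edge to play. A toy Python sketch with invented rooms and frame ranges:

```python
# Hypothetical sketch: walk an apartment graph where each edge is a laserdisc
# segment, driven by recognized voice commands.
paths = {
    # (from_room, command) -> (to_room, laserdisc segment to play)
    ("hall", "kitchen"): ("kitchen", (100, 450)),
    ("hall", "bedroom"): ("bedroom", (451, 800)),
    ("kitchen", "hall"): ("hall", (801, 1150)),
}

def navigate(room: str, command: str):
    """Return the new room and the disc segment, or stay put if no such path."""
    return paths.get((room, command), (room, None))

room = "hall"
for spoken in ("kitchen", "fridge", "hall"):
    room, segment = navigate(room, spoken)
    print(spoken, "->", room, segment)
# "kitchen" plays frames 100-450; "fridge" is unrecognized, so we stay put
```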

Technical Notes: I shot all pathways through my apartment onto video and pressed them to a laserdisc. I built a box using a Voice Recognition chip from Radio Shack and connected it via the parallel port of an Amiga to AmigaVision software that controlled the graphics and the laserdisc.