Emulating Television - Streaming Video

Topics:
Overview
Streaming Overview
Network Architectures
Streaming Protocols
QuickTime Streaming
Flash Streaming
Codecs
Other Players
Other Servers

Overview

As mentioned previously, putting live video on the internet became somewhat easy with CU-SeeMe in the mid-1990s. Shortly thereafter, RealNetworks, Microsoft and Apple hopped on the bandwagon and created technology that allowed it to take place in the browser (as a plugin).

During this period (the mid-to-late 1990s) many attempts were made to popularize the practice, but one of the major limiting factors was the penetration of broadband. Video streaming typically needs more bandwidth than a modem can handle to present some semblance of a satisfying experience (unlike audio streaming, which can work well over a modem).

One such experiment was a company founded by Josh Harris (founder of Jupiter Research) called Pseudo.com. A recent documentary about Josh, Pseudo.com, and his subsequent project, We Live in Public, gives a good history of the time and the people involved.
On the Media Segment

Examples

Television: wwiTV, TVChannelsFree, Bloomberg
User Generated: UStream, Justin.tv, Livestream

Streaming Overview


Streaming generally consists of an audio and/or video source fed into an encoder, which transmits to a server that in turn feeds the stream out to clients. There are of course variations on this, such as streaming text (where the source is not audio or video) and multicast (where a server may not be needed), but this is the general model.
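
To make this model concrete, here is a minimal sketch in Python of the source-to-encoder-to-server-to-clients chain. Every name in it is a hypothetical stand-in (the "clients" are just in-memory lists), not a real encoder or server.

import itertools

def capture():
    # Hypothetical audio/video source: yields raw "frames" forever.
    n = 0
    while True:
        yield ("raw frame %d" % n).encode()
        n += 1

def encode(frames):
    # Stand-in for the encoder: "compresses" each raw frame.
    for frame in frames:
        yield b"compressed:" + frame

def serve(packets, clients):
    # Stand-in for the streaming server: fans one stream out to every client.
    for packet in packets:
        for client in clients:
            client.append(packet)  # a real server would write to a network socket

clients = [[], [], []]  # three pretend clients
serve(itertools.islice(encode(capture()), 5), clients)
print(len(clients[0]), "packets delivered to each client")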


For more information: Streaming media - Wikipedia

Streaming Network Architectures


Multicast
Multicast is great in theory but isn't very practical today. It allows a single stream to be served once and replicated throughout the network, and any client that chooses may "tune in". The best part is that the server does not need a large amount of bandwidth to support a large number of clients (as opposed to the unicast model), and multicast is also great for keeping clients in sync with each other. Multicast is meant for live media and is not suited for content on demand. The biggest problems are that the network cannot support a great number of multicast sources, and network administrators often will not enable multicast on their networks due to bandwidth and security concerns.
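
As a concrete illustration of "tuning in", here is a minimal Python sketch of a multicast client. The group address and port are made-up examples; it will only receive data on a multicast-enabled network where something is actually sending to that group.

import socket

GROUP = "239.1.2.3"  # example (administratively scoped) multicast group, not a real stream
PORT = 5004          # example port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the group: the OS (via IGMP) asks the network to replicate this group's packets here.
membership = socket.inet_aton(GROUP) + socket.inet_aton("0.0.0.0")
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

data, sender = sock.recvfrom(2048)
print("received", len(data), "bytes from", sender)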

For more information visit: Multicast - Wikipedia
MBone (the Multicast Backbone), now considered obsolete, was a network that connected multicast-enabled networks on the internet.
Internet2 is a network open to research institutions that allows for much higher bandwidth and often has generally available multicast streams.


Unicast
Unicast offers a one-to-one relationship between the server and each client receiving the stream. This is the model most in use today, but it causes problems when content is popular and the media server does not have enough bandwidth to support all of the client connections. Unicast is well suited for delivery of media on demand but can have issues when synchronization of live content is a concern.

For more information visit:
Unicast Streaming


Content Delivery Networks (CDN)
Content Delivery Networks grew out of the need for better end-to-end quality of service than could be achieved with unicast streaming alone. CDNs are generally commercial companies that have developed a network of proxy servers (for live streams) and caching servers (for on-demand media). If you utilize a CDN for live streaming, you send your live stream to a server at the CDN, which then intelligently routes your stream to its proxy servers based on demand. Utilizing a CDN for on-demand content requires some means to flag content to be cached on the CDN network; based upon popularity, the CDN will replicate your content on its caching servers, which then serve the content to clients. CDNs generally place proxy/caching servers at major ISPs to be as close as possible to individual clients, bypassing the major points of congestion on the internet. In a sense, CDNs are a hybrid of the multicast and unicast models.

For more information visit:
Akamai
Content Delivery Network - Wikipedia
Coral Content Delivery Network

Streaming Protocols


TCP - Transmission Control Protocol
Not great for streaming audio and video, as it does not accept packet loss or delay and forces the server to resend packets. It is more important for live streaming audio and video to be in "real time" than to continuously fall behind trying to get each and every bit through the network. Using TCP is sometimes unavoidable, such as when firewalls are involved or other transmission methods are considered insecure.

HTTP - Hypertext Transfer Protocol
Meant for delivery of web pages and associated data (images, text and so on). HTTP utilizes TCP for delivery of information and adds a set of request methods specific to web requests (such as when a form is posted or submitted). Nevertheless, some streaming servers (particularly for MP3 audio streams) use HTTP to deliver their media. The biggest benefit is that it works well even with the most aggressive firewall setups.
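
As a sketch of how HTTP "streaming" works in practice, the client simply keeps reading an open response body and hands the bytes to a decoder. The stream URL below is hypothetical.

import urllib.request

STREAM_URL = "http://example.com:8000/stream.mp3"  # hypothetical MP3 stream address

with urllib.request.urlopen(STREAM_URL) as response:
    # The server never really "finishes" the response; we just keep reading chunks
    # (and would normally hand them to an MP3 decoder instead of counting bytes).
    received = 0
    while received < 64000:  # stop after ~64 KB for this sketch
        chunk = response.read(4096)
        if not chunk:
            break
        received += len(chunk)

print("received", received, "bytes of audio data")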

UDP - User Datagram Protocol
Much better for delivery of live streaming audio and video. It does not have the delivery guarantees of TCP, so packets can be delayed or damaged without causing the server to resend. This allows for more efficient transfer of data over unstable networks and allows clients to stay in sync with each other while watching live streams. The biggest drawback to using UDP is that many network administrators feel it is insecure and often block UDP traffic on their networks.
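
A minimal Python sketch of a UDP listener makes the trade-off visible: whatever arrives is used, and lost packets are simply never seen. The port number is an arbitrary example.

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5004))  # arbitrary example port
sock.settimeout(5.0)

try:
    while True:
        data, sender = sock.recvfrom(2048)  # take whatever arrives, in whatever order
        print(len(data), "bytes from", sender)
except socket.timeout:
    print("no more packets -- with UDP, silence is all you get for lost data")
finally:
    sock.close()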

RTP - RealTime Transport Protocol
Utilizes UDP as the transport mechanism but adds sequence numbers and timestamps to each packet, plus a control connection (RTCP) between client and server, primarily for quality-of-service information.
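
To show what RTP adds on top of UDP, here is a sketch that unpacks the fixed 12-byte RTP header (sequence number, timestamp and source identifier) from a packet; the example packet at the bottom is fabricated.

import struct

def parse_rtp_header(packet):
    # Fixed RTP header: 1 byte version/flags, 1 byte marker/payload type,
    # 2 bytes sequence number, 4 bytes timestamp, 4 bytes SSRC.
    if len(packet) < 12:
        raise ValueError("too short to be an RTP packet")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,
        "payload_type": b1 & 0x7F,
        "sequence": seq,         # lets the receiver detect loss and reordering
        "timestamp": timestamp,  # lets the receiver reconstruct playout timing
        "ssrc": ssrc,            # identifies the stream source
    }

# Fabricated example: version 2, payload type 96, sequence 1, timestamp 90000, SSRC 0x1234.
example = struct.pack("!BBHII", 0x80, 96, 1, 90000, 0x1234) + b"payload"
print(parse_rtp_header(example))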

RTSP - RealTime Streaming Protocol
A control protocol for streaming: it doesn't actually deliver the data but acts as the means to set up the transport and control the flow. It utilizes the application- or user-selected data transport (RTP, UDP, TCP, HTTP) and offers "VCR" capabilities for streaming media (pause, seek, etc.).
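
The beginning of an RTSP session is just plain text over TCP. Here is a sketch that sends a DESCRIBE request and prints the reply; the hostname and stream name are hypothetical, and a real player would follow up with SETUP and PLAY.

import socket

HOST, PORT = "streaming.example.com", 554  # hypothetical server, standard RTSP port

request = (
    "DESCRIBE rtsp://streaming.example.com/stream.sdp RTSP/1.0\r\n"
    "CSeq: 1\r\n"
    "Accept: application/sdp\r\n"
    "\r\n"
)

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.sendall(request.encode("ascii"))
    print(sock.recv(4096).decode("ascii", errors="replace"))

# SETUP (choosing RTP over UDP, or TCP interleaving) and PLAY would follow;
# PAUSE is what gives the "VCR"-style control mentioned above.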

RTMP - Real Time Messaging Protocol
RTMP is Adobe's protocol for streaming audio, video and other data over the internet between a Flash player and a Flash Media Server. It utilizes TCP to transmit the data.

For more information visit:
Wikipedia - TCP/IP

Streaming with QuickTime

Serving streaming QuickTime can be done with either the QuickTime Streaming Server (commercial, Mac only) or the Darwin Streaming Server (open source; Mac, Linux and Windows). They are very similar products, varying only in some extra options and user interface components.

We have a QuickTime Streaming Server installed and information about it can be found on our wiki.

Encoding/Broadcasting
There are myriad encoders that can be used for both streaming and compression with QuickTime. For now we will focus on QuickTime Broadcaster. Unfortunately, QuickTime Broadcaster, while free, is a Mac-only program. We have it installed on the streaming kit available in the ER if you want to check it out.

Using QuickTime Broadcaster
Once you have gone through the audio, video and network settings and broadcasting seems to be working, you then have to go through the following steps to share the live video.

First you have to create a reference movie. Reference movies are generally movie files that contain only a link to the live stream. The easiest way to create one is to open the live stream URL in QuickTime Pro (the URL should take the form rtsp://hostname:portnumber/stream.sdp) and Save As a reference movie. This reference movie can then be uploaded to a standard web server and used in an embedded player or as a direct link.
For a live stream, the server sends an SDP (Session Description Protocol) description to the player, which then starts requesting the live stream. This is a standard way of doing things in the MPEG-4/RTSP world.
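
For a sense of what that SDP description contains, here is an illustrative example (every value is made up, not taken from a real broadcast), along with the lines a player cares about most:

sdp = """v=0
o=- 0 0 IN IP4 192.0.2.10
s=Example Live Broadcast
c=IN IP4 192.0.2.10
t=0 0
m=audio 5004 RTP/AVP 96
a=rtpmap:96 mpeg4-generic/44100/2
m=video 5006 RTP/AVP 97
a=rtpmap:97 H264/90000
"""

# The player reads the m= and a=rtpmap lines to learn which RTP ports,
# payload types and codecs to request from the server.
for line in sdp.splitlines():
    if line.startswith(("m=", "a=rtpmap")):
        print(line)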

For more information:
QuickTime Broadcaster
Session Description Protocol RFC
QuickTime - Delivering Live Streaming Media


QuickTime Pro
The QuickTime Swiss Army knife. Great for compressing on-demand files and converting files for use on or off a QuickTime Streaming Server. For use on a streaming server, you need to export the file as a hinted movie. For use on a web page, it is a good idea to use Fast Start so the file can be played while it downloads.


Embedding
Embedding a live stream on a page once you have the reference movie is the same as embedding any other QuickTime content. Use the following tags, changing the "SRC" values to point at the reference movie:
  <OBJECT CLASSID="clsid:02BF25D5-8C17-4B23-BC80-D3488ABDDC6B" WIDTH="320" HEIGHT="240" CODEBASE="http://www.apple.com/qtactivex/qtplugin.cab">
    <PARAM NAME="SRC" VALUE="http://server.name/reference_movie.mov" />
    <PARAM NAME="AUTOPLAY" VALUE="true" />
    <PARAM NAME="CONTROLLER" VALUE="true" />
    <EMBED SRC="http://server.name/reference_movie.mov" WIDTH="320" HEIGHT="240" AUTOPLAY="true" CONTROLLER="true" PLUGINSPAGE="http://www.apple.com/quicktime/download/">
    </EMBED>
  </OBJECT>
(Be careful of line breaks; only new tags should be on new lines.)

Another way of embedding or Apple's way.

Streaming with Flash

Flash streaming has come a long way recently. It supports the standard H.264 codec, which, combined with the Flash player's near-ubiquitous installed base, makes it a great choice for live streaming. Unfortunately, Flash doesn't do everything in a standard way: it has introduced its own transport protocol, RTMP (instead of using RTSP), and therefore requires its own Flash Media Server to do live streams.

We have a Flash Streaming Server setup and information about it can be found on our wiki.

Encoding/Broadcasting
One of the great things about streaming with Flash is that you don't necessarily need a dedicated piece of software to encode/broadcast (although to get the highest quality with the latest codecs you do need one: Flash Media Live Encoder, which is Windows only).

Here is a very straightforward encoder/broadcaster built in Flash that works through a browser: Flash Broadcaster. It is based on this example: Building a Broadcaster and a Receiver in Flash (with the Flash Communication Server). The settings you need can be found on our wiki.

Player
To receive a live stream you need a Flash video player (built in Flash). Here is a quick example into which you can pass the address and name of the stream: Flash Video Player

Download the Flash Project for the Player



Flash Media Encoder
Adobe also has a piece of software called Flash Media Encoder, which can stream live to the Flash Media Server. Unfortunately, it is Windows only, so we won't be going over it in class, but if you have a Windows machine, give it a shot (you will get the highest quality).

Further Reference:
Building a Broadcaster and a Receiver in Flash (with the Flash Communication Server).
Encoding Best Practices for Live Video
Flash Video | Optimizations and Tools
The Leonard Lopate Show: Why Videocommunication Didn't Catch On - Jaron Lanier - July 25, 2006

Codecs


Codec stands for coder/decoder (or compressor/decompressor). It is the means to deliver or store high quality audio and video in less space and using less bandwidth than would otherwise be required.

Video Capture and Compression Theory
A frame of video, essentially an image, at a resolution of 640 by 480 pixels with 24 bits per pixel (RGB: 256 values each of red, green and blue) occupies 7,372,800 bits. At 15 frames per second we are looking at 110,592,000 bits per second (13,824,000 bytes per second, or 13,824 kilobytes per second, or roughly 13.8 megabytes per second). This is far more than the average internet connection can handle. Broadband connections (DSL and cable modems) typically allow for about 1 Mbps (megabit per second), which is more than 100 times fewer bits per second than uncompressed video requires. Hence the need for compression.
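
The arithmetic above, spelled out (the 1 Mbps broadband figure is the rough assumption from the text):

width, height = 640, 480
bits_per_pixel = 24
fps = 15

bits_per_frame = width * height * bits_per_pixel        # 7,372,800 bits
bits_per_second = bits_per_frame * fps                  # 110,592,000 bits/s
megabytes_per_second = bits_per_second / 8 / 1_000_000  # ~13.8 MB/s

broadband_bps = 1_000_000  # assumed ~1 Mbps broadband connection
print(bits_per_frame, bits_per_second, round(megabytes_per_second, 1))
print("compression factor needed:", round(bits_per_second / broadband_bps))  # roughly 110x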

Video codecs typically employ some combination of still-frame encoding for key frames (each representing a full frame) and delta frames (encoding only the difference from the previous frame). Specific mathematical techniques that video codecs employ include Discrete Cosine Transforms (DCT), Discrete Wavelet Transforms (DWT), fractal compression and various combinations of them all.
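
A toy illustration of the key frame / delta frame idea, using one-dimensional "frames" of pixel values (real codecs work on blocks and use motion compensation, which this ignores):

key_frame  = [10, 10, 10, 10]   # stored in full
next_frame = [10, 12, 10, 10]   # only one pixel changed

# Encoder side: record just the positions and values that differ.
delta = [(i, b) for i, (a, b) in enumerate(zip(key_frame, next_frame)) if a != b]
print(delta)  # [(1, 12)] -- one change instead of four values

# Decoder side: apply the delta to the previous frame to rebuild the new one.
rebuilt = list(key_frame)
for index, value in delta:
    rebuilt[index] = value
assert rebuilt == next_frame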

Audio Capture Theory
Audio is captured by sampling the amplitude (level) of the incoming sound wave at regular intervals in time. The sampling rate is the number of these samples taken per second; the higher the sampling rate, the more accurate the representation of the wave. Sampling rate is expressed in kHz (kilohertz), thousands of samples per second. The other factor in audio capture is the number of bits per sample (quantization): 8-bit samples have 256 steps to represent a given amplitude, whereas 16 bits give 65,536 steps.

The Nyquist theorem states that the sampling rate must be at least twice the highest frequency to be represented. The higher the sampling rate, the higher the frequency that can be sampled and represented accurately. This is why music uses a much higher sampling rate than voice: it spans a much wider range of frequencies.

A sampling rate of 22.05 kHz (22,050 samples per second) at 16 bits per sample yields 352,800 bits per second (352.8 kilobits per second, or about 44.1 kilobytes per second). This is much more feasible for transmission over broadband connections but could still use some compression.
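
The same audio bit-rate arithmetic as code (assuming a single mono channel, as the figure in the text does):

sample_rate = 22_050   # samples per second (22.05 kHz)
bits_per_sample = 16
channels = 1           # assumed mono, matching the figure above

bits_per_second = sample_rate * bits_per_sample * channels
print(bits_per_second)              # 352,800 bits/s
print(bits_per_second / 1000)       # 352.8 kilobits/s
print(bits_per_second / 8 / 1000)   # 44.1 kilobytes/s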

Specific Codecs
There is no "best" codec... Each codec has its pluses and minuses, and each is good for its particular use.

H.261
Early video conferencing codec (1990) - Video conferencing requires fast encode/decode for low latency.

H.263
More recent video conferencing codec (1996). Used in earlier versions of Flash (MX and version 7).

MJPEG - Motion JPEG
Individual frames of video encoded as JPEG. Good for analog capture and non-linear video editing; every frame is in essence a keyframe, so cuts can be made anywhere.

MPEG-1
Targeted at CD-ROM speeds (1.5 Mbps).

MP3
The audio layer of MPEG-1 (MPEG-1 Audio Layer 3).

MPEG-2
High bitrate codec used for digital cable, broadcast and satellite as well as DVD. High royalty rates as well.

MPEG-4
Range of bitrates, developed for a large variety of uses including multimedia over unreliable networks such as the internet and mobile phones. Includes components for interactivity and many different audio and video profiles.

H.264 - also known as MPEG-4 Part 10 or AVC (Advanced Video Coding)
Part of the MPEG-4 standard, but a newer "profile" (Part 10) that is much more efficient and scalable than previous MPEG-4 video profiles have been. The next big thing.

AAC - Advanced Audio Coding
Audio portion of MPEG-4

DV
Tape format and "light" compression codec. Great for digital video capture and editing. High bitrate.


Other Proprietary codecs worth mentioning
Sorenson Video 3
DivX
3ivx - MPEG-4
RealVideo 10
Microsoft Windows Media 8 (ASF)
Microsoft MPEG-4 v2 (Not really MPEG-4, Encodes to AVI, DivX stole it)
Q-Design Music
On2's VP 6 Codec - Used in Flash 8

Open Source Codecs worth mentioning
Ogg Vorbis and Ogg Theora
Dirac Originally developed by the BBC
XviD
On2's VP3 (now part of the Ogg project)

For more information:
Codec Comparison
QuickTime Playback Compatibility Chart
H.264 Begins its Ascent
Flash Video | Optimizations and Tools
Live TV is dead, and we're noticing the smell

Players

QuickTime
Windows Media
RealPlayer
Flash Player
Several miscellaneous MP3 players (stemming from WinAmp by Nullsoft)
MPlayer
VLC


Servers

QuickTime/Darwin Streaming Server
Windows Media Server
RealServer/Helix
Flash Media Server
Miscellaneous MP3 Servers (stemming from Shoutcast by Nullsoft (now owned by AOL))