A Practical Approach To Live Streaming
How to ensure a successful live streaming event.
In August of 2001, newspaper editors from across the state of Texas were able to ask questions of Gov. Rick Perry from the comfort of their own offices during the 4th Annual Texas Transportation Summit in Irving, TX. They were then able to see and hear him respond personally during an interactive webcast of the governor's statewide editorial board meeting, which was held in conjunction with the Transportation Summit. This interactive forum is believed to be the first time a governor used live streaming technology to hold such a meeting. Let's take a look at what went into staging such an event.

Pulling it all together
Prior to the Transportation Summit, Dallas, TX-based ViewCast Corp. arranged for access to two of six ISDN circuits at the site for dial-up, multi-stream video uplinks to Seattle-based Activate Corp. An analog telephone line at the site also offered dial-up laptop access to the Internet for receiving questions from remote participants. A pair of ViewCast Niagara portable streaming encoders, each coupled to one of the ISDN circuits via Cisco 804 ISDN routers, was used to encode three simultaneous streams as follows: ISDN-1 (28K audio only and 56K audio and video with video playback in a QCIF-sized display window); ISDN-2 (100K audio and video with a CIF-sized display window).
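The stream-to-circuit assignment above can be sanity-checked arithmetically: a bonded ISDN BRI circuit carries a nominal 128 Kb/s (two 64 Kb/s B channels), so each circuit needs enough capacity for the streams it uplinks. A minimal sketch using the bitrates described above (note that real-world protocol overhead reduces usable throughput below the nominal rate):

```python
# Nominal capacity of a bonded ISDN BRI circuit: two 64 Kb/s B channels.
ISDN_CAPACITY_KBPS = 128

# Stream assignment described above, in Kb/s.
circuits = {
    "ISDN-1": [28, 56],   # 28K audio-only plus 56K audio/video (QCIF)
    "ISDN-2": [100],      # 100K audio/video (CIF)
}

for name, streams in circuits.items():
    total = sum(streams)
    assert total <= ISDN_CAPACITY_KBPS, f"{name} is oversubscribed"
    print(f"{name}: {total} of {ISDN_CAPACITY_KBPS} Kb/s used")
```

Both circuits fit (84 Kb/s and 100 Kb/s respectively), which is why the 56K and 28K streams shared one circuit while the 100K stream got its own.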
Dallas Edit/Lone Wolf Productions managed the live production. For image capture, a three-camera setup was used, with one camera locked down to show the governor and moderator. A second camera, mounted in the center of the room on a low tripod, was used to show individuals in the local audience. A third, wide-angle, fixed camera was occasionally used for "crowd shots." All of the camera feeds were mixed with a Videonics switcher and fed to the two Niagara encoders. ViewCast and Dallas Edit also supplied lighting, lavaliere microphones for the governor and the moderator, cameras, promotional clips, and static content, as well as all the necessary audio and video mixing, encoding, uplink equipment, and services for the webcast.

How it works
This event is a real-world example of the kinds of opportunities that production companies, video companies, and a host of other related firms can offer their customers. Thanks to the evolution of streaming technology, these systems are becoming easier to install and operate.
Basically, you need a video source, an encoder, a streaming video server, a content delivery network, and a web server (see Fig. 1 on page 54). Installation can be as easy as cabling a camera to the encoding station, which is then cabled to the streaming server. Access to a content delivery network and a web server is gained via the Internet. While most encoding stations are located in a broadcast studio, portable units weighing less than 10 pounds are now available, enabling anytime, anywhere streaming video.
The video source is typically one or more streams of analog video from cameras, DVD players, or VCRs. These video sources have an analog video connection to the video encoding station. It's common for live broadcasts to connect the cameras to video production and editing equipment, or to an interactive video network, before being passed on to the encoding station.
The encoding station is a computer workstation that captures and encodes the video and audio. These systems must have the power to encode one or more video and audio streams, either in software or via a hardware codec. Individual compressed streams can range from 20 kilobits per second (Kb/s) to 500 Kb/s or more. The connection between the encoding station and the video streaming server must accommodate the combined bandwidth of the individual streams, and it must be a clear, reliable connection.
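The sizing rule in the last sentence can be expressed directly: the encoder-to-server link carries exactly one copy of each stream, regardless of how many people are watching, so its required bandwidth is simply the sum of the individual stream bitrates. A hypothetical sketch (the function name is illustrative, and the bitrates are the three streams from this event):

```python
def encoder_uplink_kbps(stream_bitrates_kbps):
    """Bandwidth the encoding station must push upstream:
    one copy of every encoded stream, regardless of audience size."""
    return sum(stream_bitrates_kbps)

# The three streams encoded at the Summit: 28K, 56K, and 100K.
print(encoder_uplink_kbps([28, 56, 100]))  # 184 (Kb/s)
```

Contrast this with the streaming server below, whose bandwidth requirement grows with the number of viewers rather than the number of streams.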
The video streaming server is responsible for delivering the compressed video to each client that requests a particular video stream. This is usually handled by one of the commercial streaming media software packages, such as RealNetworks' Helix Server or Microsoft's Windows Media Services. The bandwidth connection to the video streaming server must accommodate the total bandwidth of all the requests for a video stream (unlike the encoding station, which must accommodate only one copy of each stream).
As a result, the video streaming server usually has a direct connection to a very high bandwidth line. For example, if there were 100 requests for a video stream compressed at 28.8 Kb/s, the server would require at least a 3 Mb/s connection. It's possible for the encoding station and the video streaming server to be one single system. However, this would typically be for a situation with limited performance requirements (e.g. a single input stream and a small number of viewer requests) and would still require a fairly high-performance system.
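The server-side arithmetic above reduces to a one-line formula: required bandwidth is the stream bitrate multiplied by the number of concurrent viewers. A small sketch reproducing the example just given (function name is illustrative; the conversion uses the decimal 1,000 Kb per Mb convention standard for network rates):

```python
def server_bandwidth_mbps(viewers, stream_kbps):
    """Aggregate bandwidth the streaming server must deliver:
    every viewer receives their own copy of the stream."""
    return viewers * stream_kbps / 1000  # Kb/s -> Mb/s

# 100 viewers of a 28.8 Kb/s stream need about 2.88 Mb/s,
# hence provisioning "at least a 3 Mb/s connection."
print(round(server_bandwidth_mbps(100, 28.8), 2))  # 2.88
```

The same formula shows why viewer counts, not stream counts, dominate server provisioning: doubling the audience doubles the requirement.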
It's much more common to have two separate systems. To make the streaming process easier, to ensure the best quality video for viewers, and to streamline webcasting, ViewCast recommends using a content delivery network (CDN).