Gotta Have The Pipes
I've been working on a multi-part training program that covers the "A to Z" of video signal interfacing and distribution. Sounds pretty mundane, doesn't it? After all, we simply select the appropriate interfaces, cable them up, turn on the power, and move on, right?
That may have been true a decade ago, when the operative resolutions for a majority of video sources and computer displays capped out at 800x600 or 1024x768 pixels. But we live in a different world now, one where we're pushing high-resolution computer and video signals through analog and digital interfaces with greater frequency.
Our displays have gotten better at showing all of this detail, too. In 1995, CRT projectors and monitors still ruled the roost for professional and consumer display applications. Looking back at the 1995 InfoComm Projection Shoot-Out program, I found that of the 80 total entries in all projection categories, 43 of them used either projection tubes or direct-view picture tubes.
In those days, the highest-resolution category was “Graphics,” which generally meant 1024x768 (XGA) resolution. XGA stands for eXtended Graphics Array, and in a day when the first 640x480 front LCD projectors were just coming to market, 1024x768 was considered a lot of pixels.
Was bandwidth an issue then? Not really, given that all of the displays entered in the Graphics category used CRT technology. Three of the front projectors were equipped with 7-inch tubes, typically good for 480p and 600p imaging, while the remaining four employed 9-inch tubes, capable of handling only SXGA (1280x1024).
So, the limiting factor in the display system was the resolving power of the device, not the bandwidth. Pumping a 1024x768 signal with a refresh rate of 60 Hz through an interface required 70 MHz of system bandwidth, but even if the bandwidth rolled off at 60 or 50 MHz, the limited resolution of a 7-inch CRT meant you wouldn't notice the lost data anyway.
Fast-forward 10 years, and it seems that most laptop and desktop computers have at least 1024x768 resolution. Sometimes it's 1400x1050 (SXGA+), or it might be a widescreen resolution such as 1280x768 or 1366x768. And on the TV set, we now have 1280x720 and 1920x1080 HDTV coming through the antenna or cable jack.
What's more, CRTs are slowly disappearing, being replaced by LCD, DLP, plasma, and other fixed-pixel technologies. So we no longer have excuses for a “clogged” pipe — one that rolls off high-frequency information before it gets to our new high-resolution displays.
While 70 MHz may have seemed exotic at one time, today it's probably the bare minimum bandwidth specification we'd want to have. Consider the bandwidth requirements for 720p HDTV (83 MHz), SXGA+ @ 60 Hz (132 MHz), WXGA @ 72 Hz (94 MHz), and 1080i (93 MHz), and you can see where we might have a bit of a problem.
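The arithmetic behind these figures can be sketched with a simple rule of thumb: multiply the active pixel rate (horizontal pixels x vertical pixels x refresh rate) by an overhead factor, roughly 1.5 here, to account for horizontal and vertical blanking intervals and amplifier headroom. The helper function and the 1.5 factor below are illustrative assumptions, not a formal VESA timing calculation; the WXGA @ 72 Hz figure quoted above implies a smaller blanking overhead, so it is omitted here.

```python
def bandwidth_mhz(h_pixels, v_pixels, refresh_hz, overhead=1.5):
    """Rule-of-thumb analog bandwidth estimate in MHz.

    Multiplies the active pixel rate by an overhead factor
    (assumed here to be ~1.5) to cover blanking intervals
    and amplifier headroom.
    """
    return h_pixels * v_pixels * refresh_hz * overhead / 1e6

# Approximate figures for the formats discussed in the text:
print(f"XGA @ 60 Hz:   {bandwidth_mhz(1024, 768, 60):.0f} MHz")   # vs. ~70 MHz in the text
print(f"720p HDTV:     {bandwidth_mhz(1280, 720, 60):.0f} MHz")   # vs. 83 MHz
print(f"SXGA+ @ 60 Hz: {bandwidth_mhz(1400, 1050, 60):.0f} MHz")  # vs. 132 MHz
print(f"1080i:         {bandwidth_mhz(1920, 1080, 30):.0f} MHz")  # vs. 93 MHz (interlaced: 30 full frames/sec)
```

Note that 1080i is computed at 30 full frames per second, since an interlaced signal delivers the full 1920x1080 raster over two fields.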
Oddly enough, not all display manufacturers seem to be focused on bandwidth issues. On more than one occasion, I've tested expensive home theater front projectors and plasma monitors that can't pass enough detail in a 1280x720 HDTV signal, yet use fixed-pixel arrays that equal or slightly exceed that resolution. The rationale given by a marketing manager for one of these companies was that the HD analog component inputs on a particular projector (which retailed for more than $10,000) were primarily intended for use with DVDs (480p resolution), based on actual customer use. Additional signal processing circuitry, used to enhance the edges of 480i and 480p video, also degraded the quality of HD images connected to those same HD analog component inputs.