Gotta Have The Pipes
I've been working on a multi-part training program that covers the “A to Z” of video signal interfacing and distribution. Sounds pretty mundane, doesn't it? After all, we simply select the appropriate interfaces, cable them up, turn on the power, and move on, right?
Instead, I was advised to use the DVI digital inputs to achieve higher bandwidth. Running some test signals through that connection revealed that indeed the projector was a much better performer with 720p and 1080i. However, that wouldn't help anyone with an older set-top box equipped solely with component video jacks.
I don't review projectors, monitors, and TVs as much as I used to, but when I do, the first step is to sweep the display's component video and RGB inputs for frequency response using a multiburst pattern generator. While viewing a 720p luminance multiburst pattern, which contains data out to 37.5 MHz, I inevitably find the 37.5 MHz “burst” to be a mass of solid gray instead of finely delineated black and white lines. You might be surprised at how many expensive projectors and monitors can't handle the next-lowest frequency, 18.5 MHz, let alone the 37.5 MHz multiburst. Yet these displays are touted as “high-definition,” proving that pixel count alone does not an HDTV make.
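As a rough sketch of where those multiburst frequencies come from: assuming the SMPTE 296M pixel clock of 74.25 MHz for 720p (an assumption for illustration; the column doesn't spell this out), the finest multiburst packet alternates black and white pixels, one cycle per two pixels, which lands near half the pixel clock — the “37.5” burst — with the next packet at roughly half that again.

```python
# Hedged sketch: deriving approximate 720p multiburst packet frequencies.
# Assumes the SMPTE 296M pixel clock of 74.25 MHz for 720p/60.

PIXEL_CLOCK_720P = 74.25e6  # Hz, total pixel rate for 720p (SMPTE 296M)

# One-pixel-on/one-pixel-off detail: one cycle per two pixels.
finest = PIXEL_CLOCK_720P / 2   # ~37.1 MHz, the "37.5" burst
next_packet = finest / 2        # ~18.6 MHz, the next packet down

print(f"finest detail: {finest / 1e6:.1f} MHz")
print(f"next packet:   {next_packet / 1e6:.1f} MHz")
```

The computed values (about 37.1 and 18.6 MHz) are close to, but not exactly, the rounded 37.5 and 18.5 MHz figures the test patterns are labeled with.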
As LCD and plasma manufacturers migrate away from standard definition to higher resolutions (1024x768, 1280x768, and up), bandwidth will become even more of an issue. And the new 1080p projectors, monitors, and TVs coming to market will only magnify the problem of clipped bandwidth. Toss in larger screen sizes, and you can see a lot of puzzled viewers wondering where all the picture detail went.
Hand-in-hand with the move to higher resolutions and widescreens is a trend toward digital interfaces such as DVI and HDMI. Both of these “pipes” can easily handle the flow; single-link DVI has a bandwidth of 165 MHz and dual-link is good for double that. HDMI goes even higher, promising bandwidth into the GHz range.
As long as every signal processing circuit before and after these interfaces can match those figures, things will be fine. But they rarely can. Digital TV set-top boxes often have clipped bandwidth, and if some sort of format conversion is used to match a TV's or monitor's scan rate limits, then image detail is also being tossed aside. An example would be format-converting 720p to 1080i to accommodate the 33.8 kHz horizontal scan rate commonly used in CRT direct-view HDTVs. The resulting 1080i signal itself is often converted to 540p by grabbing every other field of video, a process that saves money but compromises image quality. If the set-top box doesn't have enough bandwidth to first process the 720p signal before converting it, you'll notice reduced image detail and possibly scaling and scan-conversion artifacts.
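For the curious, the 33.8 kHz figure falls out of the 1080i line timing. A minimal sketch, assuming SMPTE 274M timing for 1080i/60 (1,125 total lines per frame, including blanking, at 30 frames per second):

```python
# Hedged sketch of the scan-rate arithmetic behind the 33.8 kHz figure.
# Assumes SMPTE 274M 1080i/60 timing: 1,125 total lines per frame
# (1,080 active plus blanking), 60 fields/s = 30 full frames/s.

total_lines_per_frame = 1125
frames_per_second = 30

h_scan_rate = total_lines_per_frame * frames_per_second  # lines per second
print(f"{h_scan_rate / 1e3:.2f} kHz")  # 33.75 kHz, commonly rounded to 33.8
```

Any display whose horizontal deflection tops out near that rate forces incoming 720p (which scans at 45 kHz) to be format-converted first.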
With all the talk about 1080p displays, it might be a useful reality check to calculate the required system bandwidth to pass a 1080p/60 signal, should one ever come into existence as a transmission and distribution standard. 1920 x 1080 = 2,073,600 pixels; x 60 Hz = 124,416,000. Divide by 2 (62,208,000), then multiply by 3 (for a 3 dB bandwidth specification) = 186,624,000, or 186.6 MHz.
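That arithmetic can be sketched in a few lines of code. The divide-by-2 step reflects one cycle per pair of pixels (a black/white pair), and the multiply-by-3 step is the column's allowance for a 3 dB bandwidth specification; the function name is mine, not anything standard.

```python
# Back-of-the-envelope system bandwidth for a progressive-scan signal,
# following the column's arithmetic: pixel rate / 2 (one cycle per two
# pixels), then x3 as headroom for a 3 dB bandwidth specification.

def required_bandwidth_hz(h_pixels, v_pixels, frame_rate_hz):
    pixel_rate = h_pixels * v_pixels * frame_rate_hz  # pixels per second
    fundamental = pixel_rate / 2                      # two pixels per cycle
    return fundamental * 3                            # 3 dB headroom factor

bw = required_bandwidth_hz(1920, 1080, 60)
print(f"{bw / 1e6:.1f} MHz")  # 186.6 MHz
```

Running the same function on 720p/60 (1280 x 720) gives about 82.9 MHz, which shows just how much more the jump to 1080p demands of every stage in the chain.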
Realistically, that means a true, flat 200 MHz bandwidth for a 1080p system. In turn, that means using better components in a display, which raises its cost. In a world where projector, TV, and monitor manufacturers are slashing their prices as much as possible to maintain market share, preserving video signal bandwidth is probably the farthest thing from their minds….
Pete Putman is a contributing editor for Pro AV and president of ROAM Consulting, Doylestown, PA. Especially well known for the product testing/development services he provides manufacturers of projectors, monitors, integrated TVs, and display interfaces, he has also authored hundreds of technical articles, reviews, and columns for industry trade and consumer magazines over the last two decades. You can reach him at firstname.lastname@example.org.