Picture This: What Is HD?
Oct 1, 2004 12:00 PM, By Jeff Sauer
A couple of weeks ago, I got into a discussion with some family members about HDTV, and it was a little discouraging. Admittedly, my family isn't as sophisticated as the average Sound & Video Contractor reader when it comes to HDTV and high-definition video, but I'd like to think that they are reasonably aware, at least relative to the average consumer. But if that is so and if this conversation was any indication, there are a lot of misconceptions out there.
When it comes to HD-ready digital displays and the associated migration from analog to digital television, I'm pretty sure my family's confusion is not unique. I've heard some of the same inaccurate impressions from electronics salespeople, cable TV professionals, and even booth personnel at industry trade shows.
Here's an example of that confusion: “We really ought to just replace our old TV set, but we're thinking we should just wait because we'll have to get a new one again in 2007, when the new HDTV sets come out.”
I've learned that it's apparently a common angle for superstore salespeople to claim that today's analog TVs simply won't work after the magic FCC migration year of 2006. Those salespeople have missed the mark badly, however, if a potential buyer comes away planning to wait rather than buy a replacement now. On the other hand, it's only within the last couple of months that a majority of business and consumer plasmas have reached minimal HD native resolutions, raising the norm from 853 by 480 (wide VGA) to 1280 by 768 or a similar XGA or wide XGA pixel count.
Still, the end of analog broadcast just isn't on the horizon. The FCC's target year for all (actually 85 percent of) U.S. TV stations to be broadcasting digital signals may be 2006, but that is unlikely to happen, even according to FCC Chairman Michael Powell. Even if the stations are ready, a critical mass of consumers with digital-capable television sets won't be ready to watch the programming by then. This is the first year that monitors with the necessary built-in ATSC tuners have been available at all, and no one is predicting mass adoption within two years. At least the professional AV market benefits slightly: because tuners are still rare and expensive, most business and commercial flat panels don't include a TV tuner at all, and that saves money on technology that has yet to reach high-volume pricing.
Isn't HD the same as digital TV?
I'm betting most readers have heard that one. It's one of the most common misconceptions out there. Although all HDTV is DTV, not all digital TV is high definition. In fact, industry veterans will recall the initial analog HDTV attempts, and even modest success, in Japan more than a decade ago. And today's most common HD connectivity from source to display runs through analog component interfaces. Still, for all practical purposes, HD today is digital during distribution. The difference between SD and HD is, of course, resolution. Standard-definition digital video (CCIR-601) is 720 by 480 (really 720 by 486), and HD is 1280 by 720 (720p, progressive scan) or 1920 by 1080 (usually 1080i, interlaced). Simple math says there's a lot more data in HD, thus the clearer picture.
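For readers who like to check that "simple math," this short Python sketch compares pixels per frame for the formats above (the frame sizes are from the text; the sketch is illustrative only):

```python
# Pixels per frame for the formats discussed above.
SD = 720 * 480         # CCIR-601 standard definition
HD_720P = 1280 * 720   # 720p high definition
HD_1080 = 1920 * 1080  # 1080i/1080p high definition

print(SD)            # 345600
print(HD_720P)       # 921600
print(HD_1080)       # 2073600
print(HD_1080 / SD)  # 6.0 -> 1080-line HD carries six times the pixels of SD
```

Even 720p carries well over two and a half times the pixels of standard definition, before accounting for frame rate or color sampling.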
What's more, the FCC's 2006 target is for the migration to DTV, not to HDTV, though there is a clear implication that the temptation of HD picture quality will help drive the transition. For content owners, the change from distributing analog frequency information to sending digital data over the air and through cable wires offers increased revenue potential. Digital distribution is also far more efficient than analog broadcast: a broadcaster or cable provider can send roughly four times as many digital channels of similar image quality over the same bandwidth. HDTV channels, not surprisingly, use up more bandwidth than standard-definition digital channels, though roughly only as much as one analog channel.
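A rough back-of-the-envelope check of that efficiency claim, in Python. The payload of a 6 MHz ATSC broadcast channel is about 19.4 Mbps; the per-channel MPEG-2 bit rates below are illustrative assumptions, not figures from the article:

```python
# Approximate payload of one 6 MHz ATSC broadcast channel, in Mbps.
ATSC_PAYLOAD = 19.4
# Typical MPEG-2 bit rates in Mbps (illustrative assumptions).
SD_CHANNEL = 4.5   # one standard-definition digital channel
HD_CHANNEL = 18.0  # one high-definition channel

print(int(ATSC_PAYLOAD // SD_CHANNEL))  # 4 SD channels fit where one analog channel sat
print(int(ATSC_PAYLOAD // HD_CHANNEL))  # 1 HD channel fills roughly the same slot
```

The exact counts vary with encoder quality and content, but the four-to-one multiplex for SD and one-to-one for HD match the ratios cited above.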
What's enhanced TV?
It's marketing. With the majority of plasma monitors being 853 by 480 until recently, many vendors started calling their displays enhanced TV, or EDTV, to suggest that a digital picture looks better than an analog picture. That's not inherently true, because DTV is most commonly still a digital representation of an analog original. However, in practice staying digital can eliminate a lot of the noise that typically compromises analog, making digital a potentially clearer, cleaner image.
What's HD-ready or HD-capable?
More marketing, really. It simply means that a monitor has the ability to accept and display an HD input and scale it to the resolution of the screen. It doesn't mean a monitor can display HD natively. Of course, no monitor can display both 1280 by 720 and 1920 by 1080 natively, so the quality of any display often depends on the quality of its scaling.
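To illustrate what an "HD-ready" display actually does, here is a minimal Python sketch of aspect-preserving scaling to a panel's native resolution (the 1280-by-768 panel is an example from the plasma norms mentioned earlier; the function name is mine):

```python
def scale_to_panel(src_w, src_h, panel_w=1280, panel_h=768):
    """Fit a source frame onto a fixed-resolution panel, preserving aspect ratio."""
    factor = min(panel_w / src_w, panel_h / src_h)
    return round(src_w * factor), round(src_h * factor)

# A 1080i broadcast on a 1280x768 "HD-ready" panel gets downscaled,
# with black bars filling the unused 48 lines:
print(scale_to_panel(1920, 1080))  # (1280, 720)
```

How gracefully a display performs this resampling, and the filtering behind it, is exactly the scaling quality the paragraph above refers to.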
So if DTV isn't HDTV, what content is really HD?
I'm pretty proud my family asked this question. It shows that they were thinking. There is now significant momentum in Hollywood, and even in local television production in some larger markets, to produce programming in HD. It's already yielding more original HD content on broadcast television and a few HD-oriented cable channels, such as Discovery HD, HBO HD, Showtime HD, and ESPN HD, as well as HD feeds from all the major networks. Yet even those HD channels show a heavy percentage of standard-definition content that's been up-converted to HD. That should change over time.
Business, commercial, and industry video production are even less likely to be HD, though that doesn't mean that higher-resolution displays aren't worth the added expense. With few exceptions, higher resolution is going to offer a cleaner, less posterized image, no matter what the source, assuming reasonable scaling. The cost of HD video production should also decrease dramatically during the next year as new HDV format camcorders begin to hit the market from at least a few different manufacturers, thus enabling business and industrial videographers to create native HD content. (For information about HD servers, see the Technology Showcase on p. 66.)
HDV is MPEG-2 compressed HD that can be recorded to a standard Mini DV cassette. That compression, as well as the expected lower quality of prosumer-oriented camcorders and lenses, suggests that HDV will not be of the same quality as professional HD, but it will be native 1080i or 720p. The other reason high-resolution monitors make sense is that computer inputs are more common, and the higher resolution almost always translates to sharper text and graphics. That's categorically true with one-to-one pixel matching, when the source resolution matches the native resolution of the display.
How will we be able to record HD?
Another great question, but not an easy one to answer. D-VHS, a digital tape and VCR/VTR format, has been the only show in town for home-theater buffs, but it's unlikely any tape-based format will reach the mass market amid the random-access norm of today's DVDs and CDs. Unfortunately, there's a huge format war afoot between potential high-definition DVD replacements. Like VHS versus Beta, the early DVD format wars, and the still-current DVD-RW versus DVD+RW versus DVD-RAM battle among rewriteable disc formats, consumers are likely to face format confusion for some time to come. The so-called Blu-ray and HD DVD camps are now vying for patent rights and potentially enormous future license fees, only this time it's not just the consumer electronics companies; Microsoft is involved, too. That confusion could make hard drive-based, TiVo-like HD digital video recorders, or DVRs, the best option for at least the next few years.
How do we connect HD sources and displays?
Truth be told, that is not a question from my family discussion, but it is one that AV professionals face. The most common HD connection today is to the analog component inputs, but it requires a conversion to analog and conversion back to digital in the display that only begets signal noise. Staying digital throughout would make more sense, albeit with a couple of caveats.
First, there needs to be an industry standard, and one seems to be emerging. Although a few manufacturers are still looking toward FireWire, DVI and its backward-compatible successor, HDMI, seem to be where the industry is going. Unfortunately, many HD-capable displays still don't include DVI as a standard input. Even some that do omit High-bandwidth Digital Content Protection (HDCP), and that's the second caveat. Because digital data can ordinarily be copied losslessly, piracy is a major issue for large content holders. HDCP should become far more common over the coming months, but it has yet to reach critical mass.
Jeff Sauer writes the “Picture This” column for Sound & Video Contractor and is a contributing writer for Video Systems. He's a video producer, an industry consultant, and director of the Desktop Video Group, a video and computer products testing lab in Cambridge, Massachusetts. He can be reached at email@example.com.