More Pixels, More Problems
Aug 1, 2006 12:00 PM, By Jeff Sauer
The pursuit of higher-res images makes attention to detail all the more necessary.
Why would anyone complain about the trend toward higher-resolution displays, or even high-definition video, in general? After all, what's not to like about higher-quality moving images shot by higher-resolution cameras and shown on higher-resolution monitors? The simplicity of it all is a marketer's dream. Higher resolution numbers mean a better picture. That's something consumers can understand, so it should be an easy sell. After all, the math doesn't lie, right?
Well, it's not quite that simple.
Naturally, high-definition video can be lovely, delivering “I'm never going back” pictures compared to standard definition. But although high-resolution displays are an indispensable part of that equation, high resolution by itself isn't a panacea, and it can sometimes actually make pictures look worse. Between poor scaling; compression and other digital artifacts; de-interlacing artifacts; and old-fashioned analog noise, higher-resolution displays can expose a lot of flaws and leave viewers unsatisfied.
The most frequently cited caveat about increased detail dates from the early days of HD, when on-air newscasters realized that HD was exposing more of their facial imperfections and aging lines. Background sets, once a matter of faux bookshelves and haphazard construction, required a cosmetic makeover as well. Yet, aside from the potentially fragile egos of TV personalities, those weren't bad problems, because they ultimately showed HD delivering on the promise of lifelike images. Still, they exemplify how higher resolution demands closer attention to detail, including processing images smartly.
The bigger problems with high-resolution displays come when flaws and artifacts are a result of the underlying technology itself, and that is happening more often as high-resolution displays reach wider audiences.
HIGH RES AND SOFTNESS
There is little question that HD is a big step up in picture quality over standard definition. In the camera, HD often looks flawless (technical quibbling aside), but that first-generation image is rarely what gets onto the display. Digital-to-analog conversions along the way can introduce artifacts, as can transcoding cycles from one compression format or bit rate to another.
Those types of errors, especially when starting from a clean source, can be hard to see, even at a higher display resolution. However, if you add multiple analog-to-digital and digital-to-analog conversions along the way, or multiple transcodes going from camera to editing system to distribution — especially if that means lowering the compression bit rate for cable or satellite distribution — more artifacts inevitably become visible.
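As a toy illustration of why each conversion step can cost something, consider bit depth alone. The sketch below is hypothetical (real pipelines also involve color-space math and compression), but it shows the basic arithmetic: once a 10-bit sample has been reduced to 8 bits, the discarded low-order bits cannot be recovered.

```python
# Hypothetical sketch of bit-depth generation loss (not any real pipeline).
def to_8bit(v10):
    """Reduce a 10-bit sample (0-1023) to 8 bits (0-255) by truncation."""
    return v10 >> 2

def to_10bit(v8):
    """Expand an 8-bit sample back to 10 bits by bit replication."""
    return (v8 << 2) | (v8 >> 6)

original = 517                           # arbitrary 10-bit sample
recovered = to_10bit(to_8bit(original))  # round trip through 8 bits
print(original, recovered)               # the low two bits are gone
```

Each such round trip in a chain of conversions can nudge values by a few code levels, and the errors from successive stages add up rather than cancel.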
Even more troubling is that the displays themselves often add artifacts. Every digital display has exactly one native resolution, whether that's 1280×768, 1920×1200, or something else, and it looks best when fed that exact resolution. HD video, however, does not have just one resolution, and modern displays have to show them all, as well as make standard-definition material look presentable. HD is typically shot at either 1280×720 or 1920×1080, it can be interlaced or progressive, and it can run at a variety of frame rates, from 24 up to 60 frames (or fields) per second.
Coming from a computer-industry heritage, many high-definition displays have resolutions like 1280×768, 1366×768, or 1920×1200 that follow the square-pixel conventions of computer graphics rather than the video standards. Scaling from one resolution to another isn't too bad when multiplying by two, or even by 1.5. But when you start trying to scale by numbers like 1.185, the math gets tricky, and scaling errors often result.
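A quick back-of-the-envelope calculation shows how rarely these conversions land on friendly ratios. The panel and source heights below are common examples chosen for illustration, not a survey of any particular products:

```python
# Vertical scale factors a fixed-resolution panel must apply to
# common video line counts (illustrative values only).
panels = [768, 1080, 1200]        # typical panel heights in lines
sources = [480, 720, 1080]        # typical video source heights

for panel in panels:
    for src in sources:
        print(f"{src:>4} lines -> {panel:>4} lines: x{panel / src:.3f}")
```

Only a handful of these factors are integers or clean halves; the rest force the scaler to spread each source line unevenly across output lines, which is exactly where ringing and softness creep in.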
Awkwardly, instead of addressing these problems head on, the drive to lower manufacturing costs often has display makers skimping on image-processing or interface components. It's quite common nowadays, for example, for manufacturers to include better-quality components behind the higher-quality component inputs but cut corners on S-Video and composite. That means pictures from a camcorder connected over S-Video might look significantly worse than those from a DVD player connected over component.
SILICON OPTIX REON-VX
The trend toward those consumer-luring higher resolutions demands better image processing to help mask the artifacts and noise that inevitably creep into moving pictures by the time they reach a display. Dedicated image-processing hardware, of which a few solutions are already on the market, is likely to become increasingly important.
One recent solution is Silicon Optix's new Reon-VX chip. Reon is the younger sibling of the Silicon Optix Realta chip, which boasts “Hollywood-quality video” and the “HQV” marketing acronym. It's marketing hyperbole, perhaps, but when Silicon Optix acquired Teranex, a recognized high-end image-processing-technology leader in Hollywood circles, it gained significant intellectual property for making digital images look as good as possible.
Realta, now some two years old, put much of that expertise onto a chip that could be integrated into DVD players, standalone scalers, and displays themselves. Realta has been well-received by device makers and image-quality critics. However, it's a premium product compared to image-processing chips from companies like Pixelworks, Gennum, and Faroudja, and a higher cost has ultimately limited its broad appeal among consumer-oriented products.
Reon, on the other hand, is designed to compete more directly with the cost leaders while still, according to Silicon Optix, delivering the Hollywood-quality video processing of Realta. Indeed, Reon boasts most of the same features: four-field, per-pixel de-interlacing for maximum sharpness; film-cadence processing; diagonal filtering to help remove aliasing and stair-step artifacts from de-interlaced video sources; detail enhancement, or sharpening, to counter the softening of noise and digital artifacts; and color detection and correction, all using 4:4:4, 10-bit image processing. There's even the geometric-distortion processing of Silicon Optix's older Image AnyPlace engine.
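The idea behind per-pixel, motion-adaptive de-interlacing is easiest to see in miniature. The following is a hypothetical sketch in the spirit of what such chips do; it is not Silicon Optix's algorithm, and the field layout, single-field motion test, and threshold are invented for illustration:

```python
# Hypothetical motion-adaptive de-interlacer (illustrative, not Reon's method).
# Where a pixel is static between successive top fields, "weave" the bottom
# field in for full vertical detail; where it is moving, "bob" (interpolate
# from adjacent top-field lines) to avoid combing artifacts.
def deinterlace(prev_top, top, bottom, threshold=10):
    """Build a full frame from a top and a bottom field.
    prev_top is the previous top field, used to detect motion."""
    width = len(top[0])
    frame = [[0] * width for _ in range(2 * len(top))]
    for y, line in enumerate(top):
        frame[2 * y] = list(line)                  # top-field lines kept as-is
    for y, line in enumerate(bottom):
        out = []
        for x in range(width):
            moving = abs(top[y][x] - prev_top[y][x]) > threshold
            if moving:
                below = top[y + 1][x] if y + 1 < len(top) else top[y][x]
                out.append((top[y][x] + below) // 2)  # bob: interpolate
            else:
                out.append(line[x])                   # weave: keep field data
        frame[2 * y + 1] = out
    return frame
```

Real hardware refines every step of this, examining four fields instead of two and filtering diagonals, but the weave-versus-bob decision per pixel is the core trade-off.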
The primary difference between the two is that Realta is programmable, enabling custom control for specific or high-quality applications or hardware. Giving up that programmability yields a smaller, less expensive chip that is ultimately more appropriate for general-purpose display products, without losing functionality that would probably go unused in consumer devices anyway.
Adding any cost to the bill of materials for consumer products is a difficult proposition. Worse, American consumers aren't necessarily known for their appreciation of good picture quality. With higher-resolution displays, however, that seems to be changing: not so much because those displays can look so great with the right source material, but because the extra pixels clearly expose the contrast between high-resolution programming and everything that is bad, or unclear, in lower-quality imagery.
Ultimately, the added cost of quality image processing is probably going to be a lot less than the cost of returns by unhappy customers — as long as the added up-front costs don't price the display out of the market to begin with, that is. At least that's the balance that Silicon Optix is trying to find with the Reon.