Bob Niland said:
No question, I wouldn't buy one without it. [DVI-D]
Is there really that much difference?
Yes. Several considerations:
Bob, as much as I hate to disagree with you, I'm afraid
I'd have to vote "maybe" instead. For the most part, the
differences between an analog and a digital interface for
LCD monitors come down to questions of pixel timing,
which really have nothing at all to do with whether the
video information is in digital or analog form.
The main factor in determining how good the displayed
image is going to be with an analog interface is the generation
of the proper clock with which to sample the analog video,
which is a question of both getting the frequency right and
making sure the clock is properly aligned with the incoming
video such that the samples are taken where the "pixels" are
supposed to be. (The video being a continuous signal, there is of
course no information within it that identifies the pixels.)
Usually, the clock frequency is obtained by locking on to the
horizontal sync pulses and multiplying THAT
rate up to the assumed pixel rate; getting the alignment correct
(the "phase" adjustment) is a matter of the interface circuitry
making some educated guesses. But if the clock generation can
be done properly, there is very little to be gained by simply having
the pixel information in "digital" form. (And please consider how
truly awful the digital interface would be if the pixel clock information
were removed from it - it would be totally unusable. Hence my
assertion that it is timing, not the encoding of the information, that
is the key difference between these two types of interface.)
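To make that concrete, here's a rough sketch in Python of what the
analog front end is doing - multiply the measured hsync rate by an
assumed horizontal total to get the pixel clock, then apply a
fractional-pixel phase offset to decide where within each pixel the
sample lands. The numbers are assumptions loosely based on the VESA
1280x1024@60 timing, not anything pulled from real monitor firmware:

# Hypothetical sketch of analog front-end clock recovery; the numbers
# are assumptions (roughly VESA 1280x1024@60), not from any real device.
HSYNC_RATE_HZ = 63_981      # measured horizontal sync rate (~64 kHz)
ASSUMED_HTOTAL = 1688       # guessed total pixels per line for this mode

# The PLL multiplies the hsync rate up to the assumed pixel rate.
pixel_clock_hz = HSYNC_RATE_HZ * ASSUMED_HTOTAL
print(f"Derived pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")   # ~108 MHz

# The "phase" adjustment is just a fractional-pixel offset applied to
# every sample instant; if it's wrong, samples land near the pixel
# transitions instead of in the middle of each pixel.
def sample_instants(n_pixels, phase_fraction):
    period = 1.0 / pixel_clock_hz
    return [(i + phase_fraction) * period for i in range(n_pixels)]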
There is noise resulting from taking the original
digital raster to analog and back to digital.
This might display, for example, as horizontal
artifacts, unstable picture regions, etc.
Nope; all of the above have to do with the timing of
the pixel sampling process, not with noise in the video.
(Oddly enough, the LCD is NOT inherently a "digital"
device as is often assumed - fundamentally, the control
of the pixel brightness in any LCD is an analog process.
Simply having a discrete pixel array does not somehow
make a display "digital," nor does it necessarily mean that
a "digital" interface would have to be better.
Square waves? No chance. Think of a pattern of
alternating white/black 1-pix dots. In analog,
these need to exhibit sharp transitions and flat
tops to emulate what you get for free with DVI-D.
Bandwidth limits in the analog channels are apt
to smear this fine detail.
If we were talking about a display that actually shows
those edges, you'd have a point - but the LCD doesn't
work that way. Remember, we are dealing with a
SAMPLED analog video stream in this case; if the sample
points happen at the right time (which again is a question
of how well the pixel clock is generated), the pixel values
are taken right "in the middle" of the pixel times - making
the transitions completely irrelevant.
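If you want to see that, here's a little toy simulation (Python with
numpy; the filter and the numbers are illustrative assumptions, not a
model of any particular monitor). It band-limits a 1-pixel black/white
alternation fairly brutally, then samples once per pixel: mid-pixel
sampling recovers the full 0/1 alternation, while sampling right on the
transitions gives you uniform gray.

import numpy as np

# Toy model: alternating 1-pixel black/white dots, low-pass filtered to
# mimic a bandwidth-limited analog channel, then sampled once per pixel.
OVERSAMPLE = 64                         # fine time steps per pixel
pixels = np.tile([0.0, 1.0], 32)        # ideal black/white/black/white...
fine = np.repeat(pixels, OVERSAMPLE)    # the "analog" waveform

# Crude band-limiting: a moving average about one pixel period wide.
analog = np.convolve(fine, np.ones(OVERSAMPLE) / OVERSAMPLE, mode="same")

def resample(phase_frac):
    """Take one sample per pixel at the given fractional-pixel phase."""
    idx = np.arange(len(pixels)) * OVERSAMPLE + int(phase_frac * OVERSAMPLE)
    return analog[idx]

print("mid-pixel samples:", np.round(resample(0.5)[8:16], 2))  # ~0,1,0,1...
print("on-edge samples  :", np.round(resample(0.0)[8:16], 2))  # ~0.5 gray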
Note that "digital" interfaces also have what is in effect a
"bandwidth" limit (the peak pixel rate which can be supported),
and in current interfaces it is often significantly less than what
can be achieved with an "analog" connection. TMDS-based
interfaces such as DVI (in its single-link form) and HDMI are
both strictly limited to a pixel rate of
165 MHz, while analog connections (even with the lowly
VGA connector) routinely run with pixel rates in excess of
200 MHz.
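For a rough sense of where that 165 MHz ceiling bites, here's a
back-of-envelope estimate. The 25% blanking overhead is an assumption
in the ballpark of traditional VESA timings, not an exact GTF/CVT
calculation, and reduced-blanking modes change the picture:

# Back-of-envelope pixel clock estimates; the 25% blanking overhead is
# an assumed round figure, not an exact GTF/CVT computation.
BLANKING_OVERHEAD = 1.25
SINGLE_LINK_TMDS_MAX_MHZ = 165.0

modes = [("1600x1200", 1600, 1200, 60),
         ("1920x1200", 1920, 1200, 60),
         ("2048x1536", 2048, 1536, 60)]

for name, width, height, refresh in modes:
    clk_mhz = width * height * refresh * BLANKING_OVERHEAD / 1e6
    verdict = "fits within" if clk_mhz <= SINGLE_LINK_TMDS_MAX_MHZ else "exceeds"
    print(f"{name}@{refresh}Hz: ~{clk_mhz:.0f} MHz pixel clock, "
          f"{verdict} the 165 MHz single-link limit")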
Group delay with analog introduces some risk that
the pixel data won't precisely align with
the LCD triads upon reconstruction. Suppose the
analog signal has a little group delay (time shift)
from the DAC, or in the cable, or in the ADC (or
just one of the colors does). Our hypothetical white
and black dots might become a gray moire morass.
Right - but again, a timing issue, which gets back to the
question of the generation of the sampling clock, not the
encoding of the data (which is really all that the terms "analog"
and "digital" refer to). Again, take the clock away from
a digital interface, and see what THAT gives you.
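Extending the earlier toy model: give each color channel its own
fractional-pixel delay (standing in for differing cable or filter group
delay) and sample all three with one correctly-phased pixel clock. The
undelayed channel comes back as a clean 0/1 alternation; the
half-pixel-delayed one collapses to gray, and no single phase setting
can fix all three at once. Again, purely illustrative numbers:

import numpy as np

# Same toy waveform as before: band-limited 1-pixel black/white dots.
OVERSAMPLE = 64
pixels = np.tile([0.0, 1.0], 32)
analog = np.convolve(np.repeat(pixels, OVERSAMPLE),
                     np.ones(OVERSAMPLE) / OVERSAMPLE, mode="same")

def sample_channel(delay_frac, phase_frac=0.5):
    """One sample per pixel; this channel's waveform lags by delay_frac pixels."""
    offset = int((phase_frac - delay_frac) * OVERSAMPLE)
    idx = np.arange(len(pixels)) * OVERSAMPLE + offset
    return analog[np.clip(idx, 0, len(analog) - 1)]

red   = sample_channel(0.0)    # no skew: clean 0/1 alternation
green = sample_channel(0.2)    # 0.2-pixel skew: values pulled toward gray
blue  = sample_channel(0.5)    # half-pixel skew: samples land on edges, ~0.5

for name, channel in (("R", red), ("G", green), ("B", blue)):
    print(name, np.round(channel[8:12], 2))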
So the logical question at this point is why no one has ever
bothered to include better timing information on the analog
interfaces. The answer now is: someone has. VESA released
a new analog interface standard this past year which does just
that - it includes a sampling clock reference, additional information
which helps to properly locate the sampling clock with respect
to the video stream, and even a system which makes the
determination of the white and black levels much more accurate.
This is called, oddly enough, the New Analog Video Interface
standard, or simply NAVI. NAVI is supportable on a standard
VGA connector, but the standard also includes the definition of a
new, higher-performance analog connector (similar to the analog
section of a DVI) for higher bandwidth and other features. It's
not clear yet how well NAVI will be accepted in the industry, but
it IS available if anyone chooses to use it.
Bob M.