J. Clarke
Captin said: I'm asking everyone here. Does the performance of DVI vary a great
deal from video card to video card?
I mean, is it possible we have a situation where DVI offers a step
forward with some video cards and not with others?
It's not a simple question.
First, the result depends on the monitor. With LCD displays, in general,
using the same board, same monitor, same cable, same everything, DVI will
yield an image that is anywhere from imperceptibly to greatly superior to
that yielded by analog. With CRT displays, in general, DVI yields inferior
results to analog, simply because the DAC built into a DVI-equipped CRT is
usually not of very high quality.
Then it depends on the resolution--the resolution limits for DVI are lower
than for analog. If the monitor and video board can support a higher
resolution over analog than DVI allows, then in general analog can give a
better image by using that higher resolution.
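As a rough illustration (not from the original post): single-link DVI tops
out at a 165 MHz pixel clock, so any mode whose pixel clock exceeds that
has to fall back to analog or dual-link. The ~25% blanking overhead below is
an assumed, typical figure for non-reduced timings, and the listed modes are
just examples; this is a sketch, not a timing calculator.

    # Estimate whether a mode fits within single-link DVI's pixel clock limit.
    SINGLE_LINK_DVI_MAX_MHZ = 165  # spec limit for single-link DVI TMDS clock

    def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
        """Approximate pixel clock for a mode, including assumed blanking."""
        total_pixels = width * height * (1 + blanking_overhead)
        return total_pixels * refresh_hz / 1e6

    for mode in [(1600, 1200, 60), (1920, 1200, 60), (2048, 1536, 75)]:
        clk = pixel_clock_mhz(*mode)
        verdict = "fits" if clk <= SINGLE_LINK_DVI_MAX_MHZ else "exceeds"
        print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clk:.0f} MHz -> "
              f"{verdict} single-link DVI")

The high-refresh, high-resolution modes a good CRT can run over analog are
exactly the ones that blow past the single-link budget in this estimate.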
Then it depends on the configuration. If your monitor does not scale the DVI
input, then you will always have a 1:1 correspondence between physical and
logical pixels, and you won't get sharper than that. If it _does_ scale,
however, and the signal sent out by the video board differs from the native
resolution of the monitor, then the image will be degraded to some extent.
Whether the result is better or worse than analog at the same resolution
then depends on the details of the monitor's scaling implementation.
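A quick sketch of why non-native scaling degrades the image (the 1600 and
1280 pixel widths here are hypothetical, chosen only to make the arithmetic
obvious): with a non-integer scaling ratio, most logical pixels land between
physical pixels and the monitor's scaler has to interpolate them.

    native_width = 1600   # assumed native panel width
    signal_width = 1280   # assumed resolution sent by the video board

    ratio = native_width / signal_width   # 1.25 physical pixels per logical pixel

    # Count the logical pixels that land exactly on a physical pixel boundary;
    # everything else must be interpolated (i.e. blurred) by the scaler.
    exact = sum(1 for x in range(signal_width) if (x * ratio).is_integer())
    print(f"scaling ratio {ratio}: {exact} of {signal_width} logical pixels "
          f"align exactly; the rest are interpolated")

How objectionable that interpolation looks compared with an analog signal at
the same resolution is exactly the implementation detail mentioned above.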