Rod Speed said the following on 1/1/2007 1:16 PM:
> Digital doesn't get degraded by the cable etc.
> Nope, anything where you want to maintain image quality.
> The degradation happens most at the higher bandwidths,
> so at the higher screen resolutions.
Rod, I just googled DVI to try to understand DVI-I vs. DVI-D and single
vs. dual link.
The present issue is that we have a monitor that, I am told, has some
blue lines randomly appearing when running non-intensive apps, but
mega-problems when my daughter runs AutoCAD. (She is about 250 miles
away.) She swapped monitors, cables, etc., and the issue is with her
monitor on the analog circuit (the monitor is just fine on her
roommate's computer with either one's VGA cable).
Her monitor has a VGA receptacle and a DVI-I dual link receptacle. Her
video card has VGA and DVI-I receptacles. As I understand it, the DVI-I
allows for both VGA and digital signals to be passed to the monitor.
Correct? I do not understand monitors well enough to know whether there
are two VGA circuits and one digital circuit. My thought is this: if we can bypass
the VGA circuit by purchasing a DVI cable, then, assuming there is no
digital circuit problem, that would solve her present problem. If that
is true, then I am unclear whether to use a DVI-I or DVI-D cable and
whether to use a single or dual link cable. So that adds a second
question: when does one use a single versus a dual link cable? Clearly
only one monitor connector can be hooked into one DVI-D connector.
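From my googling, single vs. dual link seems to come down to bandwidth
(how many pixels per second the cable has to carry) rather than how many
monitors you plug in. Here is a rough back-of-the-envelope check I put
together in Python; the 165 MHz single link pixel clock limit and the
~25% blanking overhead are just figures I found, so treat it as a sketch
rather than exact timings:

    # Rough check: does a given resolution fit on single-link DVI?
    # Assumes a 165 MHz single-link pixel clock limit, 60 Hz refresh,
    # and roughly 25% extra for blanking intervals (approximate, not
    # exact CVT timing).
    def pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
        return width * height * blanking * refresh_hz / 1e6

    for width, height in [(1280, 1024), (1600, 1200), (2560, 1600)]:
        clk = pixel_clock_mhz(width, height)
        verdict = "single link is enough" if clk <= 165 else "needs dual link"
        print("%dx%d: ~%.0f MHz -> %s" % (width, height, clk, verdict))

If the number comes out near or above 165 MHz, dual link looks like the
safe choice, and as far as I can tell a dual link cable also works fine
on a single link connection, so it should not hurt to buy the dual link
one.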
This is a bit confusing. I hope I am able to communicate my lack of
understanding well enough!!
Thanks
Ken K