help please. DVI - VGA adapters

purmar

Hello,

I got a computer with an ATI Radeon 9200 (128 MB) video card. I was going to
connect my two monitors, both with VGA plugs. At the time I did not know the
difference between DVI-D and DVI-I, so I thought I would simply buy a
DVI-VGA adapter for $5 and later, when I bought an LCD monitor, I would
simply unplug the adapter and be good to go.

Well, I have everything now and the computer is working, but only on one
monitor. I got an adapter that connects DVI-I to VGA, but my video card has a
DVI-D connector! Please, somebody help me. I cannot find a DVI-D to VGA
adapter anywhere. What should I do? Does it exist? Should I add another video
card? Replace the one I have?

Any ideas will be much appreciated.

Thanks

Purmar
 
It's been my experience that the ATI cards with a DVI-D connector (8500LE) did
so because they only had one RAMDAC and so cannot support dual monitors. All
higher-end cards have DVI-I with analog output and support dual monitors
just fine.

You got screwed buying a low-end card.

-Kent
 
[...]
Well, I have everything now and the computer is working, but only on one
monitor. I got an adapter that connects DVI-I to VGA, but my video card has a
DVI-D connector! Please, somebody help me. I cannot find a DVI-D to VGA
adapter anywhere. What should I do? Does it exist? Should I add another video
card? Replace the one I have?

DVI-D = DVI-Digital. There is no analog signal in a DVI-D interface, so a
DVI-D to VGA adapter is not possible, I'm afraid. You can only use an LCD
with a digital input.

Tony
 
Tony A. said:
[...]
Well, I have everything now and the computer is working, but only on one
monitor. I got an adapter that connects DVI-I to VGA, but my video card has a
DVI-D connector! Please, somebody help me. I cannot find a DVI-D to VGA
adapter anywhere. What should I do? Does it exist? Should I add another video
card? Replace the one I have?

DVI-D = DVI-Digital. There is no analog signal in a DVI-D interface, so a
DVI-D to VGA adapter is not possible, I'm afraid. You can only use an LCD
with a digital input.

Which seems ironic as most LCD panels will convert the signal back to
analogue again anyway ;-)

You could, in theory, get a converter that consists of DACs to change it to
analogue. Not sure if that exists, but it would probably be more expensive
than the video card!

Scott
 
[...]
Which seems ironic as most LCD panels will convert the signal back to
analogue again anyway ;-)

It's the other way round - LCDs are fundamentally digital, but most take an
analogue signal and convert it to digital, purely for compatibility with the
majority of video cards, which only have analogue output. DVI-D is very
sensible, getting rid of the whole digital-analogue-digital conversion; it's
a shame analogue VGA is so standard that DVI-D hasn't made much of an impact
yet.

Tony
 
Tony A. said:
[...]

Which seems ironic as most LCD panels will convert the signal back to
analogue again anyway ;-)

It's the other way round - LCDs are fundamentally digital, but most take an
analogue signal and convert it to digital, purely for compatibility with the
majority of video cards, which only have analogue output.

Believe me, the digital signal you give to your TFT LCD goes straight into a
DAC to produce an analogue voltage to drive the pixels at different
brightnesses. The analogue signal does not need to go through this DAC
(depending on the design of the panel, some convert it to digital, then back
to analogue again to do the filtering and improve timing/sync).

DVI-D is very sensible, getting rid of the whole digital-analogue-digital
conversion; it's a shame analogue VGA is so standard that DVI-D hasn't made
much of an impact yet.

The point of DVI-D is that it gets rid of all the timing problems that you
normally need to set with monitors. Because the LCD panel has a digital
input, it can generate its own analogue signal that will exactly correspond
to each pixel, whereas if it has an analogue input it just has to guess
where each pixel is (in time) on the video signal. Hence you need to set
the width, height and offsets (mostly done automatically though) with an
analogue input, but there's no need with digital.
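
To picture that, here's a rough, hypothetical sketch in Python (not anything a
real panel actually runs): with a digital input every value already *is* a
pixel, but with an analogue input the panel has to guess a pixel clock and
sampling phase, and a bad guess samples the waveform while it is still slewing
between pixels.

# Hypothetical illustration only -- not real panel firmware.  It just shows
# why clock/phase matter for an analogue input and not for a digital one.

source_pixels = [10, 200, 30, 180, 50, 160, 70, 140]  # what the PC sends
PIXEL_PERIOD = 1.0                                    # one pixel-clock tick
RISE_TIME = 0.3                                       # time spent slewing between pixels

def analogue_level(t):
    """Idealised VGA waveform: holds each pixel's level, with a finite
    transition between neighbouring pixels."""
    i = max(0, min(int(t // PIXEL_PERIOD), len(source_pixels) - 1))
    frac = t - i * PIXEL_PERIOD
    if i > 0 and frac < RISE_TIME:
        a, b = source_pixels[i - 1], source_pixels[i]
        return a + (b - a) * (frac / RISE_TIME)       # still slewing
    return source_pixels[i]

# Digital (DVI-D) path: each transmitted value *is* a pixel -- nothing to guess.
digital = list(source_pixels)

# Analogue (VGA) path: the panel samples at its own guess of the pixel centres.
good_phase = [round(analogue_level(i * PIXEL_PERIOD + 0.65))
              for i in range(len(source_pixels))]
bad_phase = [round(analogue_level(i * PIXEL_PERIOD + 0.15))
             for i in range(len(source_pixels))]

print("digital   :", digital)      # exact
print("good phase:", good_phase)   # lands on the flat part of each pixel
print("bad phase :", bad_phase)    # smeared between neighbouring pixels

That smearing is what the "auto adjust" / clock and phase controls on an
analogue-input LCD are trying to tune out.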

DVI-D is better because the DAC is in the monitor, not the graphics card.
In most cases there is just one digital to analogue conversion, and that
either happens in your graphics card or in the monitor.

Scott
 
[...]
Believe me, the digital signal you give to your TFT LCD goes straight into a
DAC to produce an analogue voltage to drive the pixels at different
brightnesses.

You're kind of right, but it doesn't quite go "straight into a DAC"; that's
somewhat simplistic, see below.

The analogue signal does not need to go through this DAC
(depending on the design of the panel, some convert it to digital, then back
to analogue again to do the filtering and improve timing/sync).

It's not quite as simple as just applying a voltage to the pixel
proportional to the brightness you want. For example, liquid crystal
deteriorates due to DC stress, so each pixel needs to be alternately driven
positively then negatively on successive writes to minimise the net DC
stress. Your analogue VGA signal doesn't do that, which is one reason you
can't just apply your analogue input, or a simple derivation of it, directly
to the LCD pixels.
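
Just to make the "alternately driven" point concrete, here's a toy sketch
(made-up voltage numbers and simple frame inversion; real panels use fancier
inversion schemes, so treat this as illustration only):

# Toy model of frame-inversion drive: the same brightness is written with
# alternating polarity each frame so the time-averaged (DC) voltage across
# the liquid crystal stays near zero.

def drive_voltage(brightness, frame):
    """Map an 8-bit brightness to a drive voltage, flipping sign every frame."""
    level = 1.0 + 4.0 * (brightness / 255.0)   # assumed 1 V .. 5 V swing
    polarity = 1 if frame % 2 == 0 else -1     # frame inversion
    return polarity * level

brightness = 180
voltages = [drive_voltage(brightness, f) for f in range(8)]
dc_component = sum(voltages) / len(voltages)

print(voltages)       # roughly [+3.8, -3.8, +3.8, -3.8, ...]
print(dc_component)   # ~0: no net DC stress across the pixel

A plain VGA signal only ever describes the brightness, not the polarity, which
is the point above: the panel electronics have to generate that alternation
themselves.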

For that and other reasons, I'd be willing to bet that there aren't any
bare LCD panels that have analogue pixel voltage inputs, though as ever
ICBR.

Tony
 
Tony A. said:
[...]
Believe me, the digital signal you give to your TFT LCD goes straight into a
DAC to produce an analogue voltage to drive the pixels at different
brightnesses.

You're kind of right, but it doesn't quite go "straight into a DAC"; that's
somewhat simplistic, see below.

OK, so it goes through a look-up table first, but then pretty much straight
into the DAC ;-)

It's not quite as simple as just applying a voltage to the pixel
proportional to the brightness you want. For example, liquid crystal
deteriorates due to DC stress, so each pixel needs to be alternately driven
positively then negatively on successive writes to minimise the net DC
stress.

Normally the voltage on the other side of the LC from the signal is
altered each frame to help with that.

For that and other reasons, I'd be willing to bet that there aren't any
bare LCD panels that have analogue pixel voltage inputs, though as ever
ICBR.

Yeah, I see what you mean; I guess it's probably easier for the brightness
and contrast controls to alter the signal digitally anyway!
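
For what it's worth, "alter it digitally" can be as simple as a 256-entry
look-up table that folds brightness, contrast and gamma into one step before
the DAC. A purely illustrative sketch (the constants are invented, not taken
from any real panel):

# Illustrative only: build a brightness/contrast/gamma look-up table that is
# applied to each 8-bit code before it reaches the panel's DAC.

GAMMA = 2.2        # assumed panel response
CONTRAST = 1.1     # >1 stretches the range around mid-grey
BRIGHTNESS = 12    # offset in 8-bit code values

def build_lut():
    lut = []
    for code in range(256):
        value = (code - 128) * CONTRAST + 128 + BRIGHTNESS  # contrast, then brightness
        value = max(0.0, min(255.0, value))                 # clamp to 8-bit range
        value = 255.0 * (value / 255.0) ** (1.0 / GAMMA)    # gamma correction
        lut.append(int(round(value)))
    return lut

LUT = build_lut()

# Per-pixel work is then just one table lookup on the way to the DAC.
incoming = [0, 64, 128, 200, 255]
print([LUT[c] for c in incoming])

Doing it that way keeps the per-pixel cost to a single table lookup, which is
presumably part of why it's easier to handle brightness and contrast in the
digital domain.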

Scott
 
Can't help butting in here, but this card is a good mid-level card and I have
two of them. Both of mine have an adapter that connects to a D-sub
connection, and it gives VGA output to be used with an analog LCD monitor or
a digital LCD. Why could you not hook it to an analog monitor?
 
Can't help butting in here, but this card is a good mid-level card and I have
two of them. Both of mine have an adapter that connects to a D-sub
connection, and it gives VGA output to be used with an analog LCD monitor or
a digital LCD. Why could you not hook it to an analog monitor?

Because there are two DVI standards. One concurrently outputs an
analog signal on the DVI port (which that little adaptor hooks up
to); the other does not, requiring an external DAC box to use an
analog monitor. The former uses a simple plug adaptor. The OP
apparently has the latter type.
 
It's been my experience that the ATI cards with a DVI-D connector (8500LE) did
so because they only had one RAMDAC and so cannot support dual monitors. All
higher-end cards have DVI-I with analog output and support dual monitors
just fine.

But the 9200/9000 cards have two RAMDACs on the chip, so I don't know
why his card is like that.

BTW, not all 8500LEs had only one RAMDAC. In fact, originally anyway, I
think the only one that skimped on the RAMDAC was Hercules or some
other European brand, which is kind of surprising IMHO since even the
cheaper Taiwanese cards generally had two RAMDACs.
 