"Gilbert" said:
I have a P2B-DS and currently use an analog monitor. Do the LCDs and cards
with DVI outputs give a sharper, crisper picture/text? If so, what
monitor/video card do you recommend for the P2B?
Thanks.
Gil
That is not an easy question to answer. For comparison purposes,
I recommend visiting any big box stores that sell computers - have
a look at the products on display, to get an idea whether there
is a significant difference or not. You might be repulsed
immediately by what you see. (And if the store is playing
videos on all the screens, have the sales representative put
the Windows desktop on the screen, as you are shopping for
text sharpness after all.)
Generally, your old CRT will use an analog input. I haven't heard
of any with DVI inputs, though I suppose it is possible to do it.
CRTs have a better color gamut and can display true "black". On
the down side, a CRT can have jitter added to the image, whether
from internal high-voltage problems, stray magnetic fields, and
so on. A CRT can also have problems with moiré patterns.
A CRT is happy at any resolution up to its max. There is no
interpolation of pixels, so all resolutions are equally useful.
The LCD won't have any jitter in the image, because the pixels
are fixed in place. LCDs can have dodgy, non-uniform intensity,
depending on how the backlighting is done. LCDs can have
dead pixels, although there has been some improvement in that
area over time.
The LCD interpolates, whenever it is run at non-native resolution.
That is something you might want to play with, when you are at
the Best Buy.
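As a back-of-the-envelope illustration of why non-native modes look
soft (my own example, not from any spec sheet): a fixed-pixel panel
has to map each source pixel onto a non-integer number of physical
pixels, so it must blend across pixel boundaries.

```python
# Sketch: the scale factors an LCD must apply when stretching a
# lower resolution to fill its native grid. Non-integer factors
# mean interpolation, which is the blur you see at non-native modes.

def scale_factors(native, source):
    """Return (horizontal, vertical) scale factors for full-screen scaling."""
    return (native[0] / source[0], native[1] / source[1])

# Example: running 1024x768 on a panel whose native mode is 1280x1024.
h, v = scale_factors(native=(1280, 1024), source=(1024, 768))
print(f"horizontal: {h:.3f}x, vertical: {v:.3f}x")
# Neither factor is a whole number, so every source pixel is smeared
# across fractional panel pixels.
```

An integer factor (say, exactly 2x in both directions) could be done
by pixel doubling with no blur, but the common mode combinations
don't line up that neatly.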
LCDs also look weird when you scroll text documents. It might
take you a while to get used to that. That is one more reason,
if shopping at the Best Buy, to ask to see a Windows desktop,
and open or create a text file, to see how LCDs look with text.
In terms of analog VGA versus digital DVI, the digital route takes
all of the final output stage of the video card out of the picture.
The components in the analog path are the video DAC, the video
card's pi filters (for FCC EMI compliance), and the quality of
the cable (controlled impedance). A compromise in any of those
can show up on the
screen. With DVI digital, you won't see a problem until "snow"
starts to appear in the picture, implying there are severe
transmission errors. DVI takes a lot of the uncertainty out of
the path. Analog can match it, up to a certain resolution, but
there are some video cards where the analog is crappy, even at
1280x1024.
If you were using gear where the resolution was higher than
1280x1024, use DVI for lowest risk. It doesn't mean you could
not find a good analog output from a video card, just that
info on which cards are good is hard to find.
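That 1280x1024-ish threshold can be sanity-checked with a little
arithmetic (a sketch of my own, using VESA mode totals from memory,
not anything from the post): single-link DVI tops out at a 165 MHz
TMDS pixel clock, and a mode's pixel clock is just its total timings
(active plus blanking) multiplied by the refresh rate.

```python
# Sketch: does a given VESA mode fit within single-link DVI?
# Pixel clock = horizontal total * vertical total * refresh rate,
# where the totals include the blanking intervals, not just the
# visible pixels.

SINGLE_LINK_LIMIT_HZ = 165_000_000  # single-link DVI TMDS pixel clock ceiling

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock for a mode, using total timings including blanking."""
    return h_total * v_total * refresh_hz

# (htotal, vtotal) pairs below are VESA standard totals, quoted from memory.
modes = {
    "1280x1024@60": (1688, 1066),
    "1600x1200@60": (2160, 1250),
    "1920x1200@60 (reduced blanking)": (2080, 1235),
}

for name, (ht, vt) in modes.items():
    clk = pixel_clock_hz(ht, vt, 60)
    verdict = "fits single-link" if clk <= SINGLE_LINK_LIMIT_HZ else "needs dual-link"
    print(f"{name}: {clk / 1e6:.1f} MHz -> {verdict}")
```

1600x1200@60 lands at 162 MHz, right under the 165 MHz ceiling, which
is why single-link DVI is comfortable for anything in the range this
post is talking about; the analog path has no such hard limit, it just
degrades gradually with the quality of the DAC, filters and cable.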
For info on how DVI works, try this article:
http://graphics.tomshardware.com/graphic/20041129/index.html
For ATI video cards, find one that lists 3.3V as an option,
since the AGP slot on a 440BX board like the P2B is 3.3V.
ATI has removed this page from their site, and this page
doesn't list all possible models:
http://web.archive.org/web/20041014040007/http://mirror.ati.com/support/faq/agpchart.html
This page is also useful, as it has a list of video cards
and their interface type:
http://www.playtool.com/pages/agpcompat/agp.html
I'm typing this on a 17" LCD, and I'd stick with my old
CRT given the choice. One particular issue with this LCD
is that I cannot turn down the backlight enough - the light
output is too high for my viewing conditions, and I don't
have any more adjustment room left. That was not a problem
with my old Sony Trinitron.
HTH,
Paul