Right, but _how_ much sharper?
Some respondents to my previous post made it sound like the improvement
wouldn't amount to much. But there are a bunch of controls on the OSD, and
some of them, I think, are only there because the input is VGA (the coarse
and fine controls, for example). Changing those really affects the image,
so I figure going DVI would improve the quality. I don't know, though.
I think it depends a *LOT* on the video card you use, and perhaps on the
particular LCD panel as well.
I'm using an ATI All-In-Wonder 2006 Edition card with VGA-only output.
My StarLogic 1680x1050 using VGA is *incredibly* sharp.
Actually, I don't see how it could be sharper over the DVI connector.
There's only so sharp it *can* get at a given resolution.
I'm not sure whether my good results are due to the design of this
particular LCD panel, the video card's output, or both. Knowing what goes
into each, I can see how either one could have a HUGE effect on the final
result. At a resolution like mine (not THAT different from a CRT running
1600x1200 over VGA), the VGA connector *should* do just fine. If it
doesn't, I'd suspect somebody on the receiving end (the panel's analog
input circuitry) just didn't do their job, because the VGA connector works
just FINE for CRT displays at even higher resolutions and refresh rates.
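To put some rough numbers on that, here's a quick back-of-the-envelope
sketch in Python comparing the two modes. The blanking totals are typical
VESA-style figures I'm assuming purely for illustration; the exact timing
depends on what the card actually sends.

# Rough comparison of the two modes over analog VGA.
# h_total/v_total include blanking; these are typical VESA-style
# values assumed here for illustration only.
modes = {
    "1680x1050@60 (my LCD)": (1680, 1050, 2240, 1089, 60),
    "1600x1200@60 (CRT)":    (1600, 1200, 2160, 1250, 60),
}
for name, (ha, va, ht, vt, hz) in modes.items():
    pixels = ha * va                 # visible pixels per frame
    clock_mhz = ht * vt * hz / 1e6   # rate the card's DAC (and the LCD's ADC) must handle
    print(f"{name}: {pixels:,} pixels, ~{clock_mhz:.0f} MHz pixel clock")

That works out to roughly 1,764,000 pixels at ~146 MHz for the widescreen
mode versus 1,920,000 pixels at 162 MHz for 1600x1200, so they're in the
same ballpark. The catch with VGA is that the panel has to sample that
~146 MHz analog signal itself, which is typically what the coarse (pixel
clock) and fine (phase) controls are tuning; DVI skips that sampling step
entirely, which is where any sharpness difference would come from.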
I can see ways either piece could make a tremendous difference. If I had
a choice, though, I'd go DVI, even though *for me* I doubt it would make
even the tiniest bit of difference.