LCD Question

Guest

I am running a Sapphire 1650 Pro card (256 MB) and am getting bleeding on
the fonts on my ViewSonic VA902B monitor. Would it be the monitor that has
the problem, or could it be the video card? The fonts appear to have a red
blur between letters, but not everywhere, just in a few odd spots, and not
always the same areas. I can't seem to find a setting that eliminates
this. Has anyone else had the same experience?

TIA
 
You might try tuning your ClearType settings; that helps on some systems.
See Help and Support under 'ClearType'.
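
Under the hood, the on/off switch comes down to a couple of registry
values. Here is a minimal Python sketch (Windows-only) that reads them
and, as an example, switches to standard smoothing through the documented
SystemParametersInfo call; the value names are standard, but treat this as
a rough sketch rather than a polished tool:

import ctypes
import winreg

# Read the current font-smoothing state. Both values live under
# HKEY_CURRENT_USER\Control Panel\Desktop.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    smoothing, _ = winreg.QueryValueEx(key, "FontSmoothing")           # "2" = on, "0" = off
    smoothing_type, _ = winreg.QueryValueEx(key, "FontSmoothingType")  # 2 = ClearType, 1 = standard
print("FontSmoothing =", smoothing, "| FontSmoothingType =", smoothing_type)

# Switch to standard (non-ClearType) smoothing and notify running
# applications of the change. Passing 2 instead of 1 selects ClearType.
SPI_SETFONTSMOOTHINGTYPE = 0x200B
FE_FONTSMOOTHINGSTANDARD = 1
SPIF_UPDATEINIFILE, SPIF_SENDCHANGE = 0x1, 0x2
ctypes.windll.user32.SystemParametersInfoW(
    SPI_SETFONTSMOOTHINGTYPE, 0, FE_FONTSMOOTHINGSTANDARD,
    SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)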

Michael
 

Yes, same problem on a ViewSonic VP930b and a BFG Nvidia 6600 GT OC. This
is the second monitor, and they both have the same problem, but not in all
applications. I have varied the font size but still have the problem.
ClearType tweaking was not supported in Vista, and having it on or off
made no difference.

Regards, Rene
 
Re: "Would it be the monitor that has the problem or could it be the
video card?"

You left out a VERY common culprit .... THE CABLE.

Here are some guidelines:

-Turn off "ClearType" (which I call "fuzzytype"). It literally adds
gray fringing around characters. Why it's called "cleartype" I will
never understand.

-In the monitor, the only things you can control (and not always) are
the dot clock frequency and phase (this only applies to analog VGA
connections; DVI is always perfect). To set this right, you need a test
pattern; a sketch for generating one is below.

-The cable is a HUGE issue. The frequency of analog video signals is
extremely high (80 to over 100 MHz; rough numbers are sketched below).
The capacitance of the cable can cause the signal (or ONE signal, e.g.
red, since each signal has its own wire) to be "smeared".
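
For concreteness, a back-of-the-envelope sketch in Python. The timing
totals are the standard VESA figures for 1280x1024 at 60 Hz (the VA902B's
native mode); the lumped-RC cable model is a deliberate oversimplification,
since a real VGA cable is a matched 75-ohm transmission line, but it shows
why cable quality matters at these frequencies:

import math

# Dot clock for 1280x1024 @ 60 Hz using the VESA timing totals:
# 1688 clocks per scan line, 1066 lines per frame.
pixel_clock = 1688 * 1066 * 60          # ~108 MHz
print(f"pixel clock ~ {pixel_clock / 1e6:.0f} MHz")

# Crude lumped-RC model of one colour line: a 75-ohm source driving
# the cable's total capacitance (~50 pF per metre is typical for
# ordinary coax; assume a 2 m cable).
R = 75.0                                # ohms
C = 2 * 50e-12                          # farads
f_3db = 1 / (2 * math.pi * R * C)       # -3 dB corner frequency
print(f"RC corner   ~ {f_3db / 1e6:.0f} MHz")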

I'll shortly post a longer piece on setting the dot clock frequency and
phase on an analog monitor. It's worth trying that first.
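
In the meantime, here is a minimal sketch (Python with the Pillow imaging
library, which is my choice of tool here; any paint program would do) that
generates the classic clock/phase pattern: alternating single-pixel black
and white columns at the panel's native resolution:

from PIL import Image                   # pip install Pillow

# 1280x1024 is the VA902B's native resolution; change to suit.
W, H = 1280, 1024
row = [255 if x % 2 else 0 for x in range(W)]   # one stripe per pixel
img = Image.new("L", (W, H))                    # 8-bit grayscale
img.putdata(row * H)                            # repeat the row H times
img.save("clock_phase_pattern.png")

Display it full screen at native resolution: clock errors show up as
coarse vertical bands, phase errors as uniform softness or shimmer.
Adjust clock first, then phase, until the stripes are crisp everywhere.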

The BEST suggestion is to use a monitor and video card that both support
DVI digital interfaces; with those, all of this goes away completely.
 
Barry Watzman said:
Re: "Would it be the monitor that has the problem or could it be the video
card?"

You left out a VERY common culprit ... THE CABLE. [snip]

I am using the DVI port on the video card and the LCD monitor, and have
tried two new cables with the same results.

Tried the ClearType tuning; it now works on Vista, but it is no better.

Definitely looks better with ClearType (aka fuzzytype) OFF.

Regards, Rene
 
If you are using a digital interface, the problem is entirely in the
monitor unless the computer is not running at that LCD panel's native
resolution.

[Note: On SOME monitors, the DVI port is digital only; on other
monitors, it is a DVI-I port and can accept an analog input through the
DVI connector. You know what you have and what you are doing, I don't.]

With a digital interface, the monitor either works or it does not. The
computer is sending a series of numbers over the cable; the monitor can
determine whether the numbers are correctly received, and if they are
not, you just get a black screen. The cable CAN NOT degrade the image.
This is not true of analog ports, where the cable is one of the most
common culprits.

[Also, none of the adjustments that I discussed apply to digital
interfaces. But, again, just because it's a DVI connector does not
AUTOMATICALLY and in ALL cases mean that it's a digital interface (but
it usually is).]

If you have a digital interface, the ONLY thing that the computer can do
to degrade the image is to run at the wrong resolution. LCD panels
should always and only be run at their native panel resolution.
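
To check the one thing the computer controls here, a quick Python sketch
(Windows-only, using GetSystemMetrics; the NATIVE figure is the VA902B's
spec and must be changed for other panels):

import ctypes

user32 = ctypes.windll.user32
SM_CXSCREEN, SM_CYSCREEN = 0, 1   # primary display width / height
current = (user32.GetSystemMetrics(SM_CXSCREEN),
           user32.GetSystemMetrics(SM_CYSCREEN))

NATIVE = (1280, 1024)             # native panel resolution (VA902B)
if current == NATIVE:
    print(f"Running at native {current}; scaling is not the culprit.")
else:
    print(f"Running at {current}, not native {NATIVE}; the panel is "
          "scaling the image, which blurs text even over DVI.")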

Personally, I think that "ClearType" makes things worse. I find no
smoothing at all to be best. I have no idea why Microsoft is so hung up
on it.
 