Refresh rates

Chuck

Is there a downside to having the highest possible refresh rate? I never
had a problem with flicker, etc. However, I just changed from 75 to 85. I
see no difference. I just jumped up to 100. Is there any downside?
 
Chuck said:
Is there a downside to having the highest possible refresh rate? I never
had a problem with flicker, etc. However, I just changed from 75 to 85. I
see no difference. I just jumped up to 100. Is there any downside?

Not really. If you are running a CRT display, go for the maximum that
your monitor allows at a given resolution.

You may notice a blurring effect caused by the monitor being near its
limit, the cable being near its limit, or the output filtering on your
card being near its limit.
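
To get a feel for why those limits bite: the analogue bandwidth a mode
demands scales with resolution times refresh rate. A back-of-the-envelope
sketch in C (the 1.35 blanking-overhead factor is a typical assumed
figure, not exact for any particular monitor or card):

    #include <stdio.h>

    /* Rough estimate of the analogue video bandwidth a mode demands.
     * The blanking factor accounts for horizontal/vertical retrace
     * time; 1.35 is a typical assumption, not a measured value. */
    int main(void)
    {
        const double blanking = 1.35;
        const int width = 1024, height = 768;
        const int rates[] = { 60, 75, 85, 100 };

        for (int i = 0; i < 4; i++) {
            double mhz = (double)width * height * rates[i] * blanking / 1e6;
            printf("%dx%d @ %3d Hz needs roughly %6.1f MHz\n",
                   width, height, rates[i], mhz);
        }
        return 0;
    }

At 100 Hz that works out to just over 100 MHz, which is exactly the
territory where a marginal cable or a cheap output filter starts to
smear the picture.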

Some of the earlier GeForce cards needed modification to produce crisp
results at high refresh/high res. I've hacked more GeForce 2 GTSs than I
care to remember in that way.

TFTs are a slightly different matter. There is nothing to be gained by
going higher than the rated speed for that panel: with current TFT
technology you are banging up against the fastest pixel response times
even at 60 Hz. In fact, some analogue VGA TFTs really struggle to get
their clocks set right if you do not run at the suggested refresh rate
(often in this case you will see a sort of banded blurring across the
display).
 
Not really. If you are running a CRT display, go for the maximum that
your monitor allows at a given resolution.

Thanks for the info.

I don't see any blurring, nor do I see a better, clearer picture. I have
a GF4 Ti 4800 SE and a NEC MultiSync 90.
 
Chuck said:
I don't see any blurring, nor do I see a better, clearer picture. I have
a GF4 Ti 4800 SE and a NEC MultiSync 90.

60 Hz looks terrible; I get my old headaches back just thinking about
it! 75 Hz and up looks pretty much the same, depending on lighting
conditions and the size of the screen. From a technical standpoint more
is better, but it won't improve the picture (go too far and it can
actually worsen it), just reduce the "flicker". The flicker (which
almost everyone can see at 60 Hz, and which some can see at higher
rates) can cause fatigue, nausea, headaches and eye strain, and if
detectable it quickly makes a display very unpleasant to use.

If it looks good to you and you can't see any ill effects, then run
with 100 Hz.

Note:

Older monitors can actually be damaged by pushing the timings outside
their specification. Decent modern units have safety limits that will
cut in and shut the screen down first, though.
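
If you would rather not find that limit by trial and error, Windows can
be asked to validate a mode before switching to it. A minimal Win32
sketch using ChangeDisplaySettings with the CDS_TEST flag (the 1024x768
@ 100 Hz values are just examples; and bear in mind this only checks
what the driver will accept, not what the monitor can physically take,
so it is no substitute for the monitor's spec sheet):

    #include <windows.h>
    #include <stdio.h>

    /* Ask the driver whether it will accept 1024x768 @ 100 Hz.
     * CDS_TEST validates the mode without changing anything on screen. */
    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        dm.dmPelsWidth = 1024;
        dm.dmPelsHeight = 768;
        dm.dmDisplayFrequency = 100;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        LONG result = ChangeDisplaySettings(&dm, CDS_TEST);
        if (result == DISP_CHANGE_SUCCESSFUL)
            printf("Mode accepted; safe to apply for real.\n");
        else
            printf("Mode rejected (code %ld); stick to a lower rate.\n",
                   result);
        return 0;
    }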
 
You are working your video card's processing chip harder and harder and it's
getting hotter and hotter.
 
DaveW said:
You are working your video card's processing chip harder and harder and
it's getting hotter and hotter.
Please correct me if I am wrong here.

The way I understand GFX cards to work is that there is a device called
a RAMDAC. This forms an analogue VGA signal from a single page of
display memory, and it keeps re-sending that page until the page gets
updated.

That is why, when your frame rate in Quake falls to 24 FPS, your monitor
refresh stays at 75 Hz (or whatever).

Therefore the only things you are working harder on the card are the
RAMDAC and the output filtering, neither of which produces much heat.

By setting your refresh rate to 100 Hz you are not forcing Windows to
actually update the display 100 times a second; it only affects the
analogue signal going down the VGA cable to your monitor, and therefore
how quickly the monitor refreshes its output.
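
To put numbers on that: the RAMDAC's dot clock is just the total pixels
per frame (visible plus blanking) times the refresh rate, and the game's
frame rate doesn't enter into it at all. A quick sketch (the 1344 x 806
totals are blanking figures I am assuming for a 1024x768 mode, for
illustration only):

    #include <stdio.h>

    /* The RAMDAC scans out h_total * v_total pixels (visible plus
     * blanking) per frame, refresh times per second, regardless of
     * how often the application redraws the framebuffer. */
    int main(void)
    {
        const int h_total = 1344, v_total = 806; /* assumed totals */
        const int refresh = 100;  /* Hz, set in the control panel  */
        const int game_fps = 24;  /* how often Quake redraws       */

        double dot_clock_mhz = (double)h_total * v_total * refresh / 1e6;
        printf("Dot clock: %.1f MHz at %d Hz refresh\n",
               dot_clock_mhz, refresh);
        printf("Each rendered frame gets scanned out about %.1f times\n",
               (double)refresh / game_fps);
        return 0;
    }

The dot clock climbs with the refresh rate, but that load sits in the
RAMDAC and the output filters, not in the 3D core.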
 
Chuck said:
Is there a downside to having the highest possible refresh rate? I never
had a problem with flicker, etc. However, I just changed from 75 to 85. I
see no difference. I just jumped up to 100. Is there any downside?
In addition to the less frivolous points already made, at higher refresh
rates some monitors emit a really irritating whine.

HTH.

CK
 