In "So It's Come To This?"
I was wondering what the general consensus is on this. I have my resolution
set to 1152x864 (32-bit color) and my refresh rate set to 75Hz (the max). I've
tried lowering it, but I don't notice any difference. Is there a big difference
between settings? Say, between 75 and 70? Thanks.
If you set your refresh rate to 60Hz, the monitor redraws the screen sixty
times per second. Some people claim the human eye can't perceive more than
sixty frames of animation per second, or a 60Hz refresh on a monitor, but some
of us certainly can. I have to set my monitor to 75Hz to get rid of that
annoying flicker.
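If it helps to put numbers on it, here's a quick back-of-the-envelope sketch
(Python, purely for illustration) of the gap between redraws at common refresh
rates. The longer the gap, the easier it is to notice the flicker:

    # Time between consecutive screen redraws at common refresh rates.
    # The longer the gap, the more noticeable the flicker tends to be.
    def refresh_interval_ms(refresh_hz: float) -> float:
        """Milliseconds between redraws for a given refresh rate."""
        return 1000.0 / refresh_hz

    for hz in (60, 70, 75, 85):
        print(f"{hz}Hz -> one redraw every {refresh_interval_ms(hz):.1f} ms")

60Hz works out to about 16.7 ms between redraws, 75Hz to about 13.3 ms.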
The maximum refresh rate of your monitor varies with its quality and with the
resolution you run it at. My monitor will refresh at 85Hz at a resolution of
1920 x 1440. Some cheaper monitors can only manage 60Hz at 1024 x 768, and
some can only hit 85Hz at 800 x 600. Mine refreshes at 160Hz at 800 x 600, but
I never run it at that resolution because I hate it and I need a lot of
workspace.
Now factor in the frame rate of the game you're playing. For the sake of
making a point, suppose your monitor can refresh at 85Hz but your graphics
card can only pump out 75 frames per second. It doesn't hurt to have your
monitor refreshing at 85Hz, but you wouldn't want to set your refresh rate to
anything lower than 75Hz, because then you'd be shortchanging yourself by
dropping frames your graphics card is perfectly capable of delivering to you.
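A rough way to see it in numbers (Python again, purely illustrative; treating
the frames you actually see as the smaller of the two rates is my own
simplification, since real drivers and vsync behave in more complicated ways):

    # Upper bound on distinct frames reaching your eyes each second:
    # you can't see more frames than the monitor redraws, and the monitor
    # can't show frames the graphics card never produced.
    def frames_seen_per_second(refresh_hz: float, card_fps: float) -> float:
        return min(refresh_hz, card_fps)

    print(frames_seen_per_second(85, 75))  # 75 -- the higher refresh costs you nothing
    print(frames_seen_per_second(60, 75))  # 60 -- 15 frames/sec the card rendered never reach the screen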
HTH,
Damaeus