Refresh rate question

zlo

Here's a really dumb question. Maybe so dumb, no one will answer... Why are
people picky about what refresh rate they run their games at? Why run your
game at 85Hz rather than 60Hz? Or 100 rather than 85? 60Hz will give you 60
frames/second, right? That is plenty smooth enough for any game in the world.
So why do people like to use refresh rates of 85, 100?
 
zlo said:
Here's a really dumb question. Maybe so dumb, no one will answer... Why are
people picky about what refresh rate they run their games at? Why run your
game at 85Hz rather than 60Hz? Or 100 rather than 85? 60Hz will give you
60 frames/second, right? That is plenty smooth enough for any game in the
world. So why do people like to use refresh rates of 85, 100?

60Hz makes my eyes want to bleed.

I can see the flicker, and even if you can't, it increases eye strain.
Admittedly it's much worse on your desktop where app backgrounds are white -
games tend to be quite dark, but still... if you can run at a higher refresh,
your eyes will thank you.

Ben
 
Some people seem to be sensitive to lower refresh rates, but I'm not sure if
I can tell the difference between 60Hz and 85Hz; however, with v-sync on,
frame rate is linked to refresh rate, and a higher frame rate can be an
advantage in some games.

In HL and all its derivatives, the rate at which the gun reticle closes is
linked to the frame rate, so the higher the fps, the faster a player can
shoot with accuracy. One possible explanation.
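
To put a rough number on that, here's a minimal sketch (an illustration, not
anything from the thread) of classic double-buffered v-sync: a finished frame
is held until the next vertical refresh, so the frame interval rounds up to a
whole number of refresh periods. The 8 ms and 20 ms render times are made-up
figures.

```python
import math

def vsync_fps(render_ms: float, refresh_hz: float) -> float:
    """Effective frame rate with v-sync on (double buffering): a finished
    frame waits for the next vertical refresh, so the frame interval is a
    whole number of refresh periods."""
    period_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(render_ms / period_ms))
    return refresh_hz / intervals

for render_ms in (8.0, 20.0):              # hypothetical render times
    for hz in (60, 85, 100):
        print(f"{render_ms:4.0f} ms frame @ {hz:3d}Hz -> "
              f"{vsync_fps(render_ms, hz):5.1f} fps")
```

With the fast frame, fps simply equals the refresh rate; with the slow one,
the higher refresh still wins (30 vs 42.5 vs 50 fps), so the reticle closes
sooner either way.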
 
Frank said:
Some people seem to be sensitive to lower refresh rates, but I'm not sure
if I can tell the difference between 60Hz and 85Hz;

Fire up a window so that most of your screen is white, then change your
refresh to 60Hz. If you look past the monitor (like just over the top, so
that your focus is beyond the front of the glass) you will probably be able
to see it flickering. This should diminish quite quickly... most people
can't see it at 72Hz... but under the right conditions, I can just about see
it at anything less than 80. So I go to 85Hz (which just so happens to be
the max my screen supports at 1600x1200).
however, with v-sync on, frame rate is linked to refresh rate,

It will be limited to it, yes.
and a higher frame rate can be an advantage in some games.

In HL and all its derivatives, the rate at which the gun reticle closes
is linked to the frame rate, so the higher the fps, the faster a player
can shoot with accuracy. One possible explanation.

Now that is an interesting quirk.

Ben
 
It depends on the monitor and the person. I can see 75Hz flicker and it
bothers me. 85 is fine. If you run your monitor too close to its max
rate it may blur a bit as well...

Mike
 
Ben Pope said:
Fire up a window so that most of your screen is white, then change your
refresh to 60Hz. If you look past the monitor (like just over the top, so
that your focus is beyond the front of the glass) you will probably be able
to see it flickering. This should diminish quite quickly... most people
can't see it at 72Hz... but under the right conditions, I can just about see
it at anything less than 80. So I go to 85Hz (which just so happens to be
the max my screen supports at 1600x1200).

Likewise... same res & rate... not sure if quality of monitor affects this, but
my ol' Samsung DF957 19" is awful (reminds me of interlace mode on my old
Amiga) at anything under 75Hz, and 85Hz is definitely easier to live with.
Having said that, I used to own a Dell laptop with a 15" LCD screen which
ran at 1400 x something (can't quite remember), and at 60Hz it was clearer
than anything I've ever seen on a CRT display - especially using the
ClearType function of WinXP.
 
Refresh rate and frame rate are two different things.

Refresh rate is how often the monitor redraws itself
(it keeps redrawing the same picture until the computer sends
it a change).
The human eye takes in about 30 pictures a second.
I believe flicker is the human eye catching the monitor redrawing itself.
So at about 72Hz or higher, the human eye cannot "catch" the redraw,
so you do not see the flicker.
Above 72Hz, the picture appears even more solid.

Frame rate is how often the computer sends a new picture to the monitor.
The faster the computer turns out redraws of a changing picture,
the smoother the picture seems.
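
A toy sketch of that distinction (85 and 30 are example numbers, nothing
more): the monitor keeps redrawing whatever picture it was last given, while
the computer swaps in new pictures at its own rate.

```python
REFRESH_HZ = 85   # monitor redraws per second (example value)
FRAME_HZ = 30     # new pictures per second from the computer (example value)

distinct_pictures = 0
last_frame = -1
for redraw in range(REFRESH_HZ):        # one second's worth of redraws
    t = redraw / REFRESH_HZ             # time of this redraw
    frame = int(t * FRAME_HZ)           # newest picture the computer has sent
    if frame != last_frame:             # did the picture actually change?
        distinct_pictures += 1
        last_frame = frame

# 85 redraws, but only 30 of them show a new picture; the rest repeat it.
print(f"{REFRESH_HZ} redraws, {distinct_pictures} distinct pictures")
```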
 
Sparky said:
Likewise... same res & rate... not sure if quality of monitor affects this,
but my ol' Samsung DF957 19" is awful (reminds me of interlace mode on my
old Amiga) at anything under 75Hz, and 85Hz is definitely easier to live
with

I guess it depends on how long the phosphor stays alive... which could be
phosphor composition or electron beam strength. As monitors get older, the
cathode decays and fewer electrons are emitted, but that just seems to make
the display less bright. So I guess it's primarily phosphor composition.
Having said that, I used to own a Dell laptop with a 15" LCD screen
which ran at 1400 x something (can't quite remember), and at 60Hz it was
clearer than anything I've ever seen on a CRT display - especially using
the ClearType function of WinXP.


That's 'cos LCD doesn't decay like a phosphor. The transistors driving the
crystal are changing state at 60Hz, which is plenty (in terms of the
responsiveness of update to the screen), but are on throughout the entire
process. If a pixel doesn't change, then the LCD state doesn't either. No
refresh as such, only change, so LCDs cannot flicker.
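
A rough sketch of that difference, assuming (purely for illustration) a CRT
phosphor whose brightness halves in about 1/75th of a second: the CRT pixel
dims between beam passes, while the LCD pixel just holds its state until
driven to change.

```python
def crt_brightness(t_since_refresh_s: float, half_life_s: float = 1 / 75) -> float:
    """Impulse-and-decay: the phosphor dims between beam passes."""
    return 0.5 ** (t_since_refresh_s / half_life_s)

def lcd_brightness(t_since_update_s: float) -> float:
    """Sample-and-hold: the crystal keeps its state until driven to change."""
    return 1.0

period = 1 / 60                                  # one 60Hz refresh interval
print(f"CRT at end of interval: {crt_brightness(period):.0%}")  # ~42%
print(f"LCD at end of interval: {lcd_brightness(period):.0%}")  # 100%
```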

Ben
 
Tod said:
Refresh rate and frame rate are two different things.

Refresh rate is how often the monitor redraws itself
(it keeps redrawing the same picture until the computer sends
it a change).
The human eye takes in about 30 pictures a second.

That figure probably includes colour... I suspect we're a bit more sensitive
to contrast changes than to colour... after all, we only see movement in
greyscale. (It's a weird concept, but from studies of the visual cortex we
physically can't see movement in colour - the brain fills the colour in.)
I believe flicker is the human eye catching the monitor redrawing itself.

I think you're referring to tearing, which is eliminated when waiting for
vertical synch.
So at about 72Hz or higher, the human eye cannot "catch" the redraw,
so you do not see the flicker.
Above 72Hz, the picture appears even more solid.

Hmm, not really sure what you mean by seeing the redraw.
Frame rate is how often the computer sends a new picture to the monitor.

How often the raster is updated :-) The video card controls the refresh
rate.
The faster the computer turns out redraws of a changing picture,
the smoother the picture seems.


Yeah, smoothness of motion would be a good description, I guess.

Ben
 
Ben Pope said:
That figure probably includes colour... I suspect we're a bit more sensitive
to contrast changes than to colour... after all, we only see movement in
greyscale. (It's a weird concept, but from studies of the visual cortex we
physically can't see movement in colour - the brain fills the colour in.)
I believe flicker is the human eye catching the monitor redrawing itself.

I think you're referring to tearing, which is eliminated when waiting for
vertical synch.


Hmm, not really sure what you mean by seeing the redraw.
Computer monitors cannot "hold" a picture. When something is drawn, it will
only last maybe a second and then fade away; the monitor has to keep
refreshing the picture.
Maybe I should have kept using the word refresh instead of redraw.
 
Tod said:
Computer monitors cannot "hold" a picture. When something is drawn, it
will only last maybe a second and then fade away; the monitor has to keep
refreshing the picture.
Maybe I should have kept using the word refresh instead of redraw.


Well... yeah. Flicker is the change in brightness as the phosphor
brightness decays over time. The time it takes to decay (to maybe half
its peak value) is something in the order of 1/50th of a second, but
probably longer than 1/100th of a second.
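
Plugging those rough figures in: assume the phosphor halves in brightness in
about 1/75th of a second (a made-up value sitting between the 1/100th and
1/50th bounds above). The dip before each redraw then shrinks as the refresh
rate rises, which lines up with who sees flicker where in this thread.

```python
HALF_LIFE_S = 1 / 75   # assumed half-life, between 1/100 s and 1/50 s

def brightness_before_redraw(refresh_hz: float) -> float:
    """Fraction of peak brightness left when the beam comes back around."""
    period_s = 1.0 / refresh_hz
    return 0.5 ** (period_s / HALF_LIFE_S)

for hz in (60, 72, 85, 100):
    print(f"{hz:3d}Hz: fades to {brightness_before_redraw(hz):.0%} of peak")
```

At 60Hz the pixel sags to about 42% of peak before the next pass; at 85Hz it
only drops to about 54%, so the brightness swing is much smaller.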

Ben
 