Graphics cards and FPS

  • Thread starter: DJS0302

DJS0302

I know from personal experience that at 30 FPS a game looks okay, not great,
just okay. Between 20 and 29 FPS a game is playable but you can tell it's
choppy. Anything under 20 FPS and the game is too choppy. Now what about the
other end of the scale? Is there a point where you can no longer tell a
difference between framerates? For example, can you see a difference if a game
is running at 100 FPS versus 120 FPS?
 
DJS0302 said:
I know from personal experience that at 30 FPS a game looks okay, not
great, just okay. Between 20 and 29 FPS a game is playable but you
can tell it's choppy.

Now I find this strange. Big-screen movies run at only 24 FPS, yet they never
look choppy.
Anything under 20 FPS and the game is too
choppy. Now what about the other end of the scale? Is there a point
where you can no longer tell a difference between framerates? For
example, can you see a difference if a game is running at 100 FPS
versus 120 FPS?

I don't know; I've never had a card that can do those sorts of frame rates
in games where it matters.
 
DJS0302 said:
I know from personal experience that at 30 FPS a game looks okay, not great,
just okay. Between 20 and 29 FPS a game is playable but you can tell it's
choppy. Anything under 20 FPS and the game is too choppy. Now what about the
other end of the scale? Is there a point where you can no longer tell a
difference between framerates? For example, can you see a difference if a game
is running at 100 FPS versus 120 FPS?
Umm, no, you can't see a difference between 100 and 120 FPS. You can't tell a
difference above 40 FPS. At least that's what the people in the little white
coats tell me.
 
I know from personal experience that at 30 FPS a game looks okay, not great,
just okay. Between 20 and 29 FPS a game is playable but you can tell it's
choppy. Anything under 20 FPS and the game is too choppy. Now what about the
other end of the scale? Is there a point where you can no longer tell a
difference between framerates? For example, can you see a difference if a game
is running at 100 FPS versus 120 FPS?

The key is not the average framerate but the minimum rate. It's
fairly irrelevant if a card averages 40 FPS when it often dips
below 20 FPS. Too often this is ignored in benchmarks.
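
To put rough numbers on that, here is a small Python sketch with made-up
frame times, showing how a healthy average can hide an ugly minimum:

# Toy benchmark log: per-frame render times in milliseconds. The
# numbers are invented for illustration: nine smooth frames, one hitch.
frame_times_ms = [20, 20, 20, 20, 20, 20, 20, 20, 20, 55]

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s     # ~42.6 FPS on paper
min_fps = 1000.0 / max(frame_times_ms)      # ~18.2 FPS during the hitch

print(f"average: {avg_fps:.1f} FPS, minimum: {min_fps:.1f} FPS")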

As for the max, it can depend on how sharp your eyes are... some
people can't even notice flicker from a CRT monitor at a 75 Hz
refresh rate, while others need at least 100 Hz to keep it from
being distracting. "Some" people consider 50 FPS good enough,
with little to no noticeable benefit to a higher rate, but again
that's only if it never drops below 50 FPS, not the average. There
is no point to having 100-120 FPS; at that point you might as well
turn up the eye candy another notch, as it would make more of a
visual improvement than a few FPS lost.
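
For what it's worth, the arithmetic backs this up: each step up in
framerate saves less frame time than the last. A quick sketch:

# Frame time saved by each jump in framerate: the higher you go,
# the less each extra FPS buys you per frame.
for low, high in [(30, 40), (40, 60), (60, 100), (100, 120)]:
    saved_ms = 1000.0 / low - 1000.0 / high
    print(f"{low:>3} -> {high:>3} FPS saves {saved_ms:.2f} ms per frame")

Going from 100 to 120 FPS shaves off less than 2 ms per frame, versus
more than 8 ms for the jump from 30 to 40 FPS.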
 
Chris said:
.... snip ...

Umm, no, you can't see a difference between 100 and 120 FPS. You
can't tell a difference above 40 FPS. At least that's what the
people in the little white coats tell me.

Are those the people taking care of the Mad Hatter and other
denizens of this group?
 
As a competitive gamer (yeah yeah, flame me all you want) I can very easily
tell the difference between 60 FPS and 100 FPS, let alone 40. When you play a
game at the same resolution, FPS, and refresh rate, you get used to it pretty
quickly and can tell when things look different. I cannot look at a monitor
running at 60 Hz for more than a few minutes before I feel dizzy.

Say anything you want, but in the game I play at 100 FPS, I cannot play
half as well at 60 FPS. And regarding the chap's point about movies at 24 FPS:
they appear smoother on film because of motion blur, something games (none
that I know of, anyway) don't have.
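
A back-of-envelope illustration of the motion-blur point; the speed and
shutter numbers here are just assumptions for the example:

# An object crossing the screen at 600 px/s, filmed at 24 FPS with a
# typical 180-degree shutter (open for 1/48 s). Illustrative numbers.
speed_px_per_s = 600.0
fps = 24.0
shutter_s = 1.0 / 48.0

step_px = speed_px_per_s / fps          # 25.0 px jump between frames
smear_px = speed_px_per_s * shutter_s   # 12.5 px blur within each frame

# Film smears the object across half of each jump, which the eye reads
# as continuous motion; a game draws it frozen, so the jumps show.
print(f"jump per frame: {step_px} px, blur per frame: {smear_px} px")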

 
I know from personal experience that at 30 FPS a game looks okay, not great,
just okay. Between 20 and 29 FPS a game is playable but you can tell it's
choppy. Anything under 20 FPS and the game is too choppy. Now what about the
other end of the scale? Is there a point where you can no longer tell a
difference between framerates? For example, can you see a difference if a game
is running at 100 FPS versus 120 FPS?

That's a debate that's been going on forever. There are always people
who compare it to film and say you can't tell the difference above 25-30
FPS, and others - you can see them in this thread - who say the higher
the better.

I think I went a bit overboard: since I hadn't had a competitive card
for ages I didn't think it was worth it, so despite what I'd heard
about the 800XT and 6800 Ultra, I think I expected perfect performance
from my now middle-of-the-road ATI 9800.

However, running DOOM now I get gameplay that's OK for me at higher
than 640x480, as long as I don't use FSAA.

I guess I'd like to try out an 800XT or 6800 now to see how much better
it is.
 
That's a debate that's been going on forever. There are always people
who compare it to film and say you can't tell the difference above 25-30
FPS, and others - you can see them in this thread - who say the higher
the better.

I think I went a bit overboard: since I hadn't had a competitive card
for ages I didn't think it was worth it, so despite what I'd heard
about the 800XT and 6800 Ultra, I think I expected perfect performance
from my now middle-of-the-road ATI 9800.

However, running DOOM now I get gameplay that's OK for me at higher
than 640x480, as long as I don't use FSAA.

I guess I'd like to try out an 800XT or 6800 now to see how much better
it is.
I'm no expert, in fact I'm a newbie to most of this stuff, but one thing
that struck me while reading through this thread is that you can't
directly compare the FPS rates of film and computer graphics. On film,
a fast-moving object appears blurred in each frame because it moves
during the shutter time. In a computer game, that blurring has to be
added by the graphics programmers and artists involved, and I should
imagine it's quite an art to do it in such a way that the end result
looks convincing to the human eye. So while a film at 24 FPS looks
continuous to the human eye, a much higher framerate is needed for the
same effect in a PC game!
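
For the curious, one way a game could fake that shutter blur is to
average several sub-frames spread across each frame's interval (the
classic accumulation-buffer trick). A minimal sketch, with
render_scene() as a hypothetical stand-in for a real renderer:

# Average sub-frames taken while a virtual shutter is "open" to
# approximate film-style motion blur. render_scene() is hypothetical;
# here it just returns the x position of a dot moving at 600 px/s.
def render_scene(t_s):
    return 600.0 * t_s

def blurred_frame(frame_start_s, shutter_s=1.0 / 48.0, subsamples=8):
    samples = [render_scene(frame_start_s + shutter_s * i / (subsamples - 1))
               for i in range(subsamples)]
    return sum(samples) / len(samples)

# Instead of the dot frozen at x = 0, the frame records its average
# position over the open shutter: the midpoint of a 12.5 px smear.
print(blurred_frame(0.0))  # 6.25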
 