Newer cards still CPU bound?

  • Thread starter: RT

RT
My XL800 recently quit working and I replaced it with an ATI X1950Pro which,
unfortunately, is still sitting behind me waiting for a 6-pin PCI-E
connector. My computer is an AMD 64 X2 4200+ and, out of curiosity, I was
wondering how much that is going to slow down the new card as opposed to
having a much faster CPU. I run a 1280 x 1024 LCD and that is unlikely to
change in the near future.

RT
 

I always wonder how many people buy an expensive gaming card and
then run at their refresh rate instead of what the card can do.

I wouldn't worry about it at 1280x1024. The nvidia top of the line
cards are cpu bound, but the ati cards aren't affected as much. The
bad news is a better cpu probably won't help you that much in the
future.
 
> said:
I always wonder how many people buy an expensive gaming card and
then run at their refresh rate instead of what the card can do.

"and then run at their refresh rate" ?? Do you mean the refresh rate
that the LCD can handle? Are you suggesting that you can run at
the highest rate of the "expensive gaming card" without regard to
the LCD's limitations? Do you mean there is no benefit to buying an
"expensive gaming card" if you are going to have to run it at the LCD's
limited refresh rate?
I wouldn't worry about it at 1280x1024. The nvidia top of the line
cards are cpu bound, but the ati cards aren't affected as much. The
bad news is a better cpu probably won't help you that much in the
future.

Because of the LCD's limited refresh rate?

Just curious, as I use an LCD that is limited to a 60 Hz screen
refresh rate. (A Viewsonic VX2025wm)

Luck;
Ken
 
I always wonder how many people buy an expensive gaming card and
then run at their refresh rate instead of what the card can do.

and the point of running your video card above your screen's refresh rate is?

other than benchmarking, none at all
 
Ken Maltby said:
"and then run at their refresh rate" ?? Do you mean the refresh rate
that the LCD can handle? Are you suggesting that you can run at
the highest rate of the "expensive gaming card" without regard to
the LCD's limitations? Do you mean there is no benefit to buying an
"expensive gaming card" if you are going to have to run it at the LCD's
limited refresh rate?


Because of the LCD's limited refresh rate?

Just curious, as I use a LCD that is limited to 60 Hertz screen
refresh rate. (A Viewsonic VX2025wm)

Luck;
Ken

I always wondered the same; there's absolutely no use running a game
at more fps than the refresh rate of your monitor.
If your screen is showing 60 frames per second, it's showing 60 frames
per second max. Period.
And if your gfx card is pumping out more fps, you will see the 'tearing' effect,
although I must say I hardly notice it.
(http://hifi-india.blogspot.com/2006/11/faq-what-is-screen-tearing-and-how-do-i.html)


On the other hand, it's the *minimum* fps count that matters.
A game running constantly at 40fps is much 'smoother' than a game
running at 200+ with regular drops to 20 or less.
And for a good guaranteed minimum fps with recent games, you need a
top-notch card.
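
To put rough numbers on that, here's a tiny Python sketch (the frame times
are invented for illustration, not measured from any real game):

# Toy comparison of two frame-time traces (all numbers are made up).
# Trace A: a steady 40 fps. Trace B: mostly ~250 fps with an occasional dip to 20 fps.
steady = [1000 / 40] * 60                  # sixty frames at 25 ms each
spiky  = [1000 / 250] * 59 + [1000 / 20]   # fifty-nine 4 ms frames, then one 50 ms hitch

def summarize(name, frame_times_ms):
    avg_fps   = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst_fps = 1000 / max(frame_times_ms)
    print(f"{name}: average {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps")

summarize("steady 40 fps", steady)
summarize("spiky 250 fps", spiky)
# The spiky trace wins on average fps, but its worst frames are the stutter you feel.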


just my 2 cents.

regards,
Marcel
 

So you're saying that my game, which is now running at 135 fps now that I'm
running two X1900s in Crossfire, compared to the 75 fps when I used
just one X1900, didn't make a bit of difference since my LCD refresh
rate is only 60 Hz? Don't think so, bud; vast improvement in gameplay
here.
 

Still waiting for the "Experts" like Mr. J. Clark or Chuck U. Farley
to provide their stance on this issue.

Luck;
Ken

P.S. If the venerable Barry Watzman could provide one of
his very accurate, if somewhat rigidly doctrinaire, replies,
I, for one, would be interested.
 
Custom Computers said:
So you're saying that my game, which is now running at 135 fps now that I'm
running two X1900s in Crossfire, compared to the 75 fps when I used
just one X1900, didn't make a bit of difference since my LCD refresh
rate is only 60 Hz? Don't think so, bud; vast improvement in gameplay
here.

A *constant* 75fps?

If you have 'Vertical Sync' enabled, you might GET 60 (or fewer) fps and
you will SEE 60 (or fewer) new display images per second.

If you don't have VS enabled, you might GET 60 or more (or fewer) fps but
still SEE only 60 new display images per second.

Without VS enabled, most display images will contain bands; one band
will have the information for frame number X, the next band will have the
info for frame number X+1, and so on (the tearing effect I talked about).

So if you compare a true constant 135fps with a true constant 75fps,
yes, you wouldn't see any difference in playability, except that the 75fps
card would have fewer bands and so less tearing.

And yes, here I do say that running at 75fps (constant!) is BETTER than
running at 135fps (again, constant!). And 60fps would be PERFECT with
your monitor.
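
If it helps, here's a rough Python sketch of that banding (a toy scan-out
model with made-up timings, not how a real driver actually schedules
buffer swaps):

REFRESH_HZ = 60
SCAN_MS = 1000 / REFRESH_HZ                # one full screen refresh, ~16.7 ms

def bands_per_refresh(render_fps, refreshes=20):
    # Without vertical sync the buffer swaps whenever a frame finishes,
    # so each swap that lands mid-scan starts a new band on screen.
    frame_ms = 1000 / render_fps
    swap_times, t = [], 0.0
    while t < refreshes * SCAN_MS:
        t += frame_ms
        swap_times.append(t)
    total_bands = 0
    for r in range(refreshes):
        start, end = r * SCAN_MS, (r + 1) * SCAN_MS
        swaps = sum(1 for s in swap_times if start < s <= end)
        total_bands += swaps + 1
    return total_bands / refreshes

for fps in (75, 135):
    print(f"{fps} fps on a 60 Hz panel: about {bands_per_refresh(fps):.1f} bands per refresh")
# The faster card tears into more bands per displayed image; neither one
# shows more than 60 new images per second.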

I don't know if you ever played games on computers like an Amiga or Atari.
You can't compare the graphics with today's games, but those games were
running at 50 fps (I think it was 50), and they were running smooooooth.
The reason: most of those games were built to run at 50fps, not one frame
more, not one frame less.

But that's all theoretical nonsense now because no PC game will ever run
at a constant fps. And therefore, your Crossfire setup will increase
the minimum fps and will make the game run much, much better :)

Just curious, as I don't have the hardware (yet) to run many games at 100+ fps.
Do you know if games look or play better on your machine if you enable
Vertical Sync?
I'm still addicted to the original Unreal Tournament, and with VS enabled I
think it's running a bit more smoothly... but that could be just between my
ears.

Maybe there's also another point to take into consideration.
I don't know that much about game engines, but if the AI of the game also runs
once every frame, the AI will be more responsive when running at higher
frame rates. Maybe that's also giving the game a boost in playability.
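
A common way around that is to run the simulation on a fixed timestep.
Here's my own rough Python sketch of the pattern (not code from any
particular engine, and the tick rate of 50 is just an example):

SIM_DT = 1.0 / 50                          # AI/physics tick: 50 updates per second

def run_one_second(render_fps, state):
    # Render frames take whatever time they take; the simulation catches up
    # in fixed-size steps, so AI responsiveness doesn't depend on frame rate.
    frame_dt, accumulator, elapsed = 1.0 / render_fps, 0.0, 0.0
    while elapsed < 1.0:
        elapsed += frame_dt
        accumulator += frame_dt
        while accumulator >= SIM_DT:
            state["ticks"] += 1            # one fixed AI/physics update
            accumulator -= SIM_DT
        state["frames"] += 1               # one rendered frame

for fps in (40, 135):
    state = {"ticks": 0, "frames": 0}
    run_one_second(fps, state)
    print(f"{fps} fps render: {state['frames']} frames, {state['ticks']} AI ticks in one second")
# Both runs land near 50 AI ticks; only the rendered frame count changes.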


regards,
Marcel
 

Try some double and triple buffering.

Luck;
Ken
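
P.S. Roughly why: with VSync and only two buffers, a frame that misses a
refresh has to wait for the next one, while a third buffer lets the card
keep rendering. A little Python sketch of the simplified model (my own
made-up frame times; real swap-chain behaviour varies by driver and API):

import math

REFRESH_MS = 1000 / 60                     # 60 Hz panel

def double_buffered_fps(frame_ms):
    # With VSync + double buffering, a frame that misses a refresh waits for
    # the next one, so each frame costs a whole number of refreshes.
    refreshes = math.ceil(frame_ms / REFRESH_MS)
    return 1000 / (refreshes * REFRESH_MS)

def triple_buffered_fps(frame_ms):
    # With a third buffer the card keeps rendering; each refresh shows the
    # most recently finished frame, capped at the refresh rate.
    return min(1000 / frame_ms, 1000 / REFRESH_MS)

for frame_ms in (14.0, 18.0, 25.0):        # invented render times per frame
    print(f"{frame_ms:>4} ms/frame: double buffer {double_buffered_fps(frame_ms):.0f} fps, "
          f"triple buffer {triple_buffered_fps(frame_ms):.0f} fps")
# At 18 ms/frame the double-buffered case drops to 30 fps on a 60 Hz panel,
# while triple buffering stays in the mid-50s.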
 
Try some double and triple buffering.

Luck;
Ken

Well, in the sim I'm running, NASCAR 2003, triple buffering would require
the use of OpenGL, and it's running in Direct3D. I do have it set to
14X for anti-aliasing and 16X for anisotropic filtering, mipmap
detail is set to quality, and vertical refresh is set to always on.

I've just started playing Rainbow Six Las Vegas at the same settings
and all game settings maxed out. It looks and plays great, but I don't
know if you can check frame rates in this game.
 