My intention here is *not* to start a flamewar; it's merely a question
that can only be answered by posing it to both ng's.
Having recently upgraded from a Ti4600 to a Radeon 9800 Pro, and also having
set up a friend's PC with a 9800 Pro, I have to say I'm surprised at the
lack of raw performance.
My system is P4 2.4b @ 2.7, with 512MB RAM.
His is P4 3.06 with 1024MB RAM.
Basically I was sorta impressed by 3DMark03, Aquanox, and the FSAA and AF
performance under the Radeon, but was pretty disappointed that the card's
performance without this eye candy enabled wasn't really that much
greater than my Ti4600's.
I even went back to the Ti4600 and found that its AF and FSAA performance
was actually a lot better than I'd ever given it credit for.
Even though the Radeon gave me 5500 in 3DMark03 and 50 in CodeCreatures, it
didn't seem to excel in real-time gaming, whereas the Ti4600 is the other
way around.
I have since been told that the resolution I play at (1024x768) and the
eye candy (AF/FSAA) are major factors in performance, and that my CPU is the
limiting factor at these settings.
What I'm basically asking is this: is it just me, or do ATI seem to excel in
benchmarks and generally get higher framerates in games (a higher ceiling),
while Nvidia are more consistent, if maybe a touch lower?
Danny