ATI gets 0wned in D00M III

  • Thread starter: Jacked
Jacked said:
Glad I'm set :) ATI sure is pissed off. Read all about it.
http://www.teamradeon.com/forum/viewtopic.php?t=295

Excuse me... but nVidia is, yet again, "cheating". They're running in
partial precision whereas ATi runs in full precision. No wonder nVidia
scores better. Furthermore, ATi has been writing a brand-new OpenGL
driver, which will certainly improve framerates...

ATI response:
"Hi Richard - this is a non issue - Doom 3 isn't even available yet, and we
all know that some of our competitors use partial precision where possible.
We expect to have updated drivers available in the coming weeks."

UPDATE: "...And btw, let's not lose sight of the fact that ATI performance
isn't relatively poor at all. I think Kyle himself said that even the
X800pro delivered 'great' performance, and Carmack said in the HardOCP
article that there's more to consider than just frame rate. The frame rate
difference even today is so minor, it's impossible to tell without
diagnostic tools - ie: the end user experience isn't affected. And with ATI
you get full-precision enabled all the time - we don't do PP (on R3XX and
R4XX) like some of our competitors. It's also important to note that most of
today's games play faster on ATI hardware, and you can expect that to extend
to other 'big title' games expected this summer. Chris"

Chris Hook

"We are always looking for ways to improve our products. And a key part of
that is our software. Many of the improvements that we have made in the
software area are already evident. For example, we have improved our testing
processes, resulting in more frequent and more stable drivers and we have
implemented a feedback program to learn of driver bugs early on so we can
rectify them. Some of the improvements aren't evident yet, but will be soon.
It's readily apparent that the work we've done on the DX driver is world
class. We are confident we will do whatever is necessary to make our OGL
driver the new industry standard for stability and performance."

Dave Orton

Geez... nVidia uses partial precision just so they can be faster than ATi.
What a bunch of losers...

=- Brian Dickens, the Netherlands
 
It's a known fact that nVidia's OpenGL implementation is the best and most
mature on the market. ATI's OpenGL has always lagged behind nVidia's, but
they are slowly improving it.

Hey, at least ATI is trying.

You have to remember that nVidia has had A1 D3D and OpenGL drivers ever
since the days of the Riva 128 and the ATI Rage Pro, so it's a case of cat
and mouse: ATI has to catch up.

Most of their work in the past has been D3D-based, but it seems they're
putting their efforts into OpenGL lately.

Now, while I'm not holding my breath, I expect some improvements soon.
 
Thanks for the post, Brian. You turned a stupid troll into a good, condensed
bit of information.

Mike

Brian Dickens said:
Excuse me... but nVidia is, yet again, "cheating". They're running in
partial precision whereas ATi runs in full precision. No wonder nVidia
scores better. [...]

Geez... nVidia uses partial precision just so they can be faster than ATi.
What a bunch of losers...
 
Partial precision by itself is not cheating. By carefully tweaking the
shader programs, it is possible to achieve identical output by substituting
certain 32-bit elements of the program with 16-bit. Whether nVidia is doing
this, or degrading the output, is a whole other story.
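
If you want to see what that kind of substitution amounts to, here's a toy
sketch in plain NumPy (not real shader code; the Lambert-style colour math
and every name in it are just illustrative stand-ins of mine): run the same
per-pixel math at FP16 and at FP32, quantize both down to an 8-bit
framebuffer, and see how far apart they actually land.

import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Random unit surface normals, a fixed light direction, random albedo:
# stand-ins for the inputs a simple diffuse-lighting shader would see.
normals = rng.normal(size=(N, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
light = np.array([0.3, 0.8, 0.52])
light /= np.linalg.norm(light)
albedo = rng.uniform(0.0, 1.0, size=N)

def shade(dtype):
    # Lambertian diffuse term, evaluated in the given precision,
    # then quantized to an 8-bit framebuffer value.
    n = normals.astype(dtype)
    ldir = light.astype(dtype)
    a = albedo.astype(dtype)
    ndotl = np.clip((n * ldir).sum(axis=1), 0, 1)
    colour = (a * ndotl).astype(np.float32)
    return np.rint(np.clip(colour, 0, 1) * 255).astype(np.int32)

out16, out32 = shade(np.float16), shade(np.float32)
diff = np.abs(out16 - out32)
print(f"pixels that differ: {np.count_nonzero(diff)} / {N}")
print(f"largest difference: {diff.max()} step(s) of the 8-bit framebuffer")

For well-behaved colour math like this, whatever differences show up sit
down at the level of the framebuffer's own quantization, the sort of thing
you need diff tools to spot; it's texture-coordinate and address arithmetic
where FP16 genuinely runs out of bits.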

Remember, all this is just a technicality. In Microsoft's DX9 specs, 24-bit
FPP is regarded as "full-precision". There's no such rule in OpenGL. It can
be argued that 24-bit is not enough and there are certain instances where
only 32-bit FPP can offer true visual fidelity.
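
To put rough numbers on those formats, here's a quick back-of-the-envelope
sketch (assuming the usual layouts: 10 stored mantissa bits for FP16, 16 for
ATI's FP24, 23 for IEEE FP32; the texture-size figure is only a loose bound):

# Back-of-the-envelope precision comparison; mantissa widths assume the
# usual layouts (FP16 = s10e5, ATI's FP24 = s16e7, IEEE FP32 = s23e8).
formats = {
    "FP16 (partial precision)": 10,
    "FP24 (DX9 'full precision', ATI R3xx/R4xx)": 16,
    "FP32 (full precision)": 23,
}

for name, mantissa_bits in formats.items():
    # Gap between neighbouring representable values just below 1.0.
    step = 2.0 ** -(mantissa_bits + 1)
    # Rough bound: largest square texture a [0, 1] coordinate can still
    # address to within half a texel at its far edge.
    max_texels = 2 ** mantissa_bits
    print(f"{name}: {mantissa_bits + 1} significant bits, "
          f"step near 1.0 is about {step:.1e}, "
          f"half-texel accuracy up to roughly {max_texels} texels across")

On those numbers, 24-bit is comfortably enough for most of what a game
shader does today, and the gap between FP24 and FP32 only starts to matter
in the sort of 'certain instances' mentioned above, such as very long
dependent-texture chains or very large coordinates.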

On the other hand, in John Carmack's testing, "some of the ATI cards did
show a performance drop when colored mip levels were enabled, implying some
fudging of the texture filtering". Everyone's got a skeleton in his
closet...
 
Jacked said:
Glad I'm set :) ATI sure is pissed off. Read all about it.
http://www.teamradeon.com/forum/viewtopic.php?t=295

CPU: AMD Athlon 64 Processor 3400+
Video Card: nVidia BFG GeForce 6800 GT OC
RAM: 1024 MB Corsair XMS series


I have a 9800 Pro, so I'm set too. ;-)

However, it's a known fact that nVidia currently has better performance in
OpenGL. ATI has been re-writing their OpenGL drivers and we'll see what
they are like when they are complete. Personally though, I think they
should have been on the ball sooner. But since most games are written
with DirectX in mind rather than OpenGL, I'm not too terribly concerned.
 
A lot of the discussion is about OpenGL. Is Doom 3 OpenGL-only, or does it
also support D3D? If so, are there any benchmark comparisons?
 
Heh, you twits are going to feel retarded when the game actually comes out
and ATI runs just as good or better than Nvidia ;) I mean C'MON, you tards -
the game's not even finished yet. We get this same kind of stupidity with
every new 'benchmark' game. Grow up for a change.
 
Blaedmon said:
...the game's not even finished yet. We get this same kind of stupidity with
every new 'benchmark' game...

Doom 3 is finished. "Gone Gold" as it were. Whatever the case may be,
it's a good thing for nVidia and ATI to pass the performance torch back
and forth between themselves.
 
I agree, it's great that both these ruling companies are fighting for the
lead - we, as gamers, can only benefit from it. As far as it being 'gold' -
nice, but I wonder how long before a patch is in order from either company ;)
I guess we can't be critical of such things, though. If the game lives up to
its hype, which I'm pretty sure it will - considering it's a Carmack job -
then we'll all be damn happy and won't care which card we have.
 
Blaedmon said:
...the game's not even finished yet. We get this same kind of stupidity
with every new 'benchmark' game...


hehe

FWIW, I think the whole id set of games is based around a totally childish set of points of
reference, totally gung-ho with no subtlety at all. Carmack tries his best to clothe the half-baked
storyline in some tech-emperor's new clothes, but they age very quickly. His engines always
go for the eye candy to the detriment of AI. I see *no* real mention of AI in any of the
literature, and from the videos, it all looks scripted anyway.

When HL1 came out it was a breath of fresh air. I expect the same of HL2. Not because of the graphics,
but the storyline, facial animation instead of better-modelled weapons, and of course, a much better
squad-based AI (especially in the HL troopers).

It's like comparing the first and last Matrix films :)

S
 