Lenny said:
What approach? What are you talking about? Provide evidence of ATi
"pioneering" application cheating, thank you.
The "evidence" is that ATI was first caught on cheating in good old
3DMark2001 a couple years ago. Nobody really paid much attention to the
discovery because at that time this topic wasn't as hot as it is now.
You can either do some Google searches yourself or simply test even the
_current_ ATI's drivers for the presence of 3DMark2001 cheats - you'll
be surprised.
By the way, S3, I think, was caught red-handed cheating in WinBench by not
rendering all the frames in that program's 3D test (which was terribly lame
even by the standards of the day); that should give you a hint about how
long ago THAT was. A further hint is that it was pre-Savage era, too.
So I would suggest you take those lies of yours and shove 'em.
Which cheatS in particular are you talking about? The only
application-specific optimization ATi did for 3DMark 2003 was to re-order a
shader for the Mother Nature test. It still produced the same output
(differing in about four pixels out of a 1024*768 screen); the only
difference was that it ran better on their hardware.
Exactly. The Nature test. This link leads to the high-resolution picture
showing the difference between the real (cheats disabled) and the
"optimized" (cheats enabled) output produced by ATI cards for the Nature
test:
http://www.ixbt.com/video2/images/antidetect/r300-difference.rar
Sorry to rain on your delusions, but this is a lot more than "four pixels".
And this link leads to the same type of picture produced by an nVidia card:
http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar
The similarities are striking. While I can't deduce all the "optimizations"
used by ATI just by looking at this picture, it is rather likely that in
both the nVidia and the ATI case they include a forced reduction in the
precision of trigonometric calculations, activated for this particular
test.
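If anyone wants to produce such a difference picture themselves, it's
trivial: capture the same frame with cheats enabled and disabled (the ixbt
pictures above came from their anti-detect driver patch) and diff the two
screenshots per pixel. A minimal sketch in Python with the Pillow library;
the file names are my own placeholders:

    from PIL import Image, ImageChops

    reference = Image.open("nature_cheats_disabled.png").convert("RGB")
    optimized = Image.open("nature_cheats_enabled.png").convert("RGB")

    # Per-channel absolute difference; pixels that match come out black.
    diff = ImageChops.difference(reference, optimized)
    diff.save("nature_difference.png")

    # Count how many pixels differ in any channel at all.
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    print(changed, "of", diff.width * diff.height, "pixels differ")

Run that on the Nature captures and then tell me again it's "four pixels".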
Instruction re-ordering is not a cheat. CPUs have done re-ordering of
instructions for over a decade now; it's a common enough procedure. The
only real difference is that GPUs lack the necessary hardware to do it in
real time (it is extremely costly not only in transistors and die area,
but also in research and development), so it has to be done in software,
in the driver's shader compiler.
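To see why a legal re-order cannot change the result, here is a toy model
(a hypothetical three-address pseudo-ISA of my own, not real shader code):
as long as data dependencies are respected, every register ends up with
the same value regardless of the order.

    import operator

    def run(program, inputs):
        # Execute a straight-line program; each instruction is
        # (dest, op, src_a, src_b).
        regs = dict(inputs)
        for dest, op, a, b in program:
            regs[dest] = op(regs[a], regs[b])
        return regs

    original = [
        ("r2", operator.mul, "r0", "r0"),  # r2 = r0 * r0
        ("r3", operator.mul, "r1", "r1"),  # r3 = r1 * r1
        ("r4", operator.add, "r2", "r3"),  # r4 = r2 + r3
    ]
    # r2 and r3 are independent, so swapping them is a legal re-order;
    # moving the r4 instruction before either of them would not be.
    reordered = [original[1], original[0], original[2]]

    inputs = {"r0": 3.0, "r1": 4.0}
    assert run(original, inputs)["r4"] == run(reordered, inputs)["r4"]

The driver's compiler does exactly this kind of dependency-respecting
shuffle, just with a cost model for its own pipelines.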
Which cheats are those, exactly?
Are you suggesting ATi is untruthful in their statements that they have no
application-specific optimizations in their drivers?
Which statements are you talking about? I hope you remember that when ATI
was publicly confronted with the facts showing that they do use Futuremark
cheats in 3DMark2003, ATI responded with a new version of their drivers
and _publicly_ _stated_ _in_ _their_ _press-release_ _that_ _ATI_
_drivers_ _did_ _actually_ _contain_ _these_ _cheats_ and that they are
now removed. I hope the fact that ATI publicly acknowledged the presence
of Futuremark cheats in their drivers answers your "which cheats are
those" question.
A simple experiment shows that ATI did indeed remove _these_
_particular_ cheats from their drivers, while the older 3DMark2001
cheats are still there in their full glory. You could easily repeat all
these experiments at home, if you weren't that ignorant.
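For the lazy, the classic at-home version of the experiment is the same
one that exposed the Quake III "Quack" affair: copy the benchmark
executable under a neutral name and run both copies with identical
settings. Note it only catches detection keyed on the executable name
(the ixbt anti-detect results came from patching the driver itself). A
sketch, with a made-up install path:

    import shutil

    # Hypothetical install path; adjust to your own machine.
    ORIGINAL = r"C:\MadOnion\3DMark2001\3DMark2001.exe"
    DECOY    = r"C:\MadOnion\3DMark2001\3DMurk2001.exe"

    # The decoy is byte-for-byte the same program; only the name differs.
    shutil.copyfile(ORIGINAL, DECOY)

    # Now run both by hand with identical settings. If the scores or the
    # rendered frames differ, the driver is keying on the application
    # name rather than on the actual workload.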