When will I *need* a DirectX 9 card?

  • Thread starter: John
Dark said:
Well, if the card is pre-modded and it works, it's definitely a nice
thing! And the probability that he's trying to sell you a card with
"artifacts" is rather low nowadays, since the 9700 Pro is still very fast
even though newer cards are out. That's how I got my 9500 Pro, from a gamer
who needed a new card (he took the 9800 Pro).

Cheap, efficient, and the card works perfectly... (and overclocks with
ease to 9700 Pro speeds)

Unfortunately, the guy probably lied to me, because I tried the Omega
drivers and some others, and immediately after I enter Windows I have a
green chessboard pattern (on some parts of the desktop and fonts). I tried
calling him, and he told me that he tried the Omega drivers and they worked
OK. Any thoughts? Aren't there some tricks to try? I heard that some people
got rid of artifacts by disabling Hyper-Z or something like that, if I
remember correctly. What exactly is Hyper-Z, and is it important for
performance or only image quality? Thanks.
 
What? The Xbox's GPU is a mix between a Geforce3 and a Geforce4, with some
features not even in the Geforce4. It is certainly nowhere near a GF4MX.

What are you talking about?!

The GF4 MX is below the abilities of a GF3. The 4MX series is
nothing more than an extended GF2 line with a new label (added dual
output support and raised clock speeds); it has NONE of the DX8 features
of a GF3 or GF4 Ti.
 
What are you talking about?!

The GF4 MX is below the abilities of a GF3. The 4MX series is
nothing more than an extended GF2 line with a new label (added dual
output support and raised clock speeds); it has NONE of the DX8 features
of a GF3 or GF4 Ti.

I think the MX was a nice value-level card, though, and it did AA
nicely for me... and it was pretty overclockable.
 
Nitz Walsh said:
What? The Xbox's GPU is a mix between a Geforce3 and a Geforce4, with some
features not even in the Geforce4. It is certainly nowhere near a GF4MX.

Does it have pixel shaders, though?

All "true" geforce3 and 4's have pixel shaders, the only exception
being the gf4mx (which is a mix of parts from Geforce2, 3, and 4,
although the 3D core is from the geforce2).

AFAIK the Xbox is based on a stripped Geforce4 MX440 core - which
should mean that it has no quincunx FSAA, no true anisotropic
filtering, no pixel shaders.

So - does the water in Morrowind have realistic-looking reflections?
 
All "true" geforce3 and 4's have pixel shaders, the only exception
being the gf4mx (which is a mix of parts from Geforce2, 3, and 4,
although the 3D core is from the geforce2).

GF4MX chips are basically GF2MXes with half the memory controller and the
antialiasing engine from the GF4 Ti. That's as far as similarities go.
AFAIK the Xbox is based on a stripped Geforce4 MX440 core

This is incorrect.

A stripped GF4 MX wouldn't leave you with much. No, XGPU is basically a GF3
with a second vertex shader and some extra pixel and vertex shader
instructions, minus the AGP bridge (since it doesn't need one) plus a CPU
interface and hypertransport link, and probably some extra bits and pieces.
which should mean that it has no quincunx FSAA, no true anisotropic
filtering, no pixel shaders.

Well it does have all those things. Also, GF4 MX does do quincunx AA and
anisotropic filtering.
 
Geronimo_work said:
Unfortunately, the guy probably lied to me, because I tried the Omega
drivers and some others, and immediately after I enter Windows I have a
green chessboard pattern (on some parts of the desktop and fonts). I tried
calling him, and he told me that he tried the Omega drivers and they worked
OK. Any thoughts? Aren't there some tricks to try? I heard that some people
got rid of artifacts by disabling Hyper-Z or something like that, if I
remember correctly. What exactly is Hyper-Z, and is it important for
performance or only image quality? Thanks.

Artifacts and checkerboards come from overheated memory: install more
cooling! Or from broken pipelines (that is why 8 are on the card and
only 4 are active); when 9500 chips showed broken pipelines, they just
kept the 4 best pipelines and made the chip into a 9500 Non Pro.

There is NO guarantee that a mod runs perfectly... nor is there a
guarantee that overclocking a 9500 Pro leads to a stable 9700 Pro.

Overclocking and modding are both things that MIGHT work but don't
have to work!

HyperZ III is a technology that uses "smarter" ways of handling
graphics memory operations, to let as much data as possible flow
between graphics memory, the GPU, and the AGP connection.

It's like winzipping things before sending them away. It mainly
compresses Z-buffer (depth) data, and since depth traffic is a heavy part
of the graphics workload, it lets the card do just that bit more.
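The winzip analogy is easy to demonstrate. Below is a minimal Python sketch (purely illustrative; HyperZ itself is a hardware scheme operating on Z-buffer tiles, not zlib) of why compressing data before it crosses the memory bus cuts the number of bytes transferred:

```python
import zlib

# A fake 256x256 depth buffer. Depth values vary smoothly, with large
# near-constant regions - exactly the kind of data that compresses well,
# which is what HyperZ's Z-compression exploits in hardware.
WIDTH, HEIGHT = 256, 256
depth_buffer = bytes(min(255, y // 2) for y in range(HEIGHT) for x in range(WIDTH))

compressed = zlib.compress(depth_buffer)

# Bytes that would have to cross the memory bus, with and without compression.
print(f"raw:        {len(depth_buffer):6d} bytes")
print(f"compressed: {len(compressed):6d} bytes")
```

On this synthetic buffer the compressed form is a tiny fraction of the raw size; real depth buffers compress far less dramatically, but any saving translates directly into memory bandwidth the GPU can spend elsewhere.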
 
Dark said:
Artifacts and checkerboards come from overheated memory: install more
cooling! Or from broken pipelines (that is why 8 are on the card and
only 4 are active); when 9500 chips showed broken pipelines, they just
kept the 4 best pipelines and made the chip into a 9500 Non Pro.

I'll try to put some memory coolers on, but I didn't overclock, only
modded, so I don't think it's that.
There is NO guarantee that a mod runs perfectly... nor is there a
guarantee that overclocking a 9500 Pro leads to a stable 9700 Pro.

I agree. I'm just trying to find some way, if possible, to get rid of the
artifacts, because I heard that some people did it.
Overclocking and modding are both things that MIGHT work but don't
have to work!

I agree again.
HyperZ III is a technology that uses "smarter" ways of handling
graphics memory operations, to let as much data as possible flow
between graphics memory, the GPU, and the AGP connection.

It's like winzipping things before sending them away. It mainly
compresses Z-buffer (depth) data, and since depth traffic is a heavy part
of the graphics workload, it lets the card do just that bit more.

Thanks for the info.
 
Does it have pixel shaders, though?

All "true" geforce3 and 4's have pixel shaders, the only exception
being the gf4mx (which is a mix of parts from Geforce2, 3, and 4,
although the 3D core is from the geforce2).

Where did you come up with this bad and severely incorrect
information? The 4MX series are nothing more than re-named GF2 cards
with optional dual output. Nothing more. They have no features of
the GF3 or true GF4 cards.
AFAIK the Xbox is based on a stripped Geforce4 MX440 core - which
should mean that it has no quincunx FSAA, no true anisotropic
filtering, no pixel shaders.

If the XBOX was based on an MX440 core, it would be a total piece of
shit video game system; it wouldn't even be half the system it is
today. The MX440 is sub-standard in performance and ability compared to a
GF3 Ti200, which sold for about the same price when the 440 was new

(much like the 5200 is usually more expensive than the faster 4200).

Even the stupid MX460, which was slightly faster in some games than the
GF3 Ti200 through the brute force of a high clock rate, was still a DX7
card and cost almost as much as the Ti4200 - talk about SUCKERS!

PS: A "stripped" MX440 = MX420, which is on par with the
GF2 MX400 in performance - just a hair faster.
 
As I recall, the Xbox video system was based on a variant of the GF3 with
some optimizations, which later found their way into the GF4 Ti line of
cards (not the MX, which, as you stated, are just GF2 cards with enhanced
memory handling).

JK
 
What are you talking about?!

"Nowhere near" a GF4MX - meaning that it's _far superior_, you incorrectly
inferred from my post that I meant it was inferior, which was the opposite.
 
Lenny said:
A stripped GF4 MX wouldn't leave you with much. No, XGPU is basically a GF3
with a second vertex shader and some extra pixel and vertex shader
instructions, minus the AGP bridge (since it doesn't need one) plus a CPU
interface and hypertransport link, and probably some extra bits and pieces.

_Vertex_ shaders? Aren't those normally done in software?

Does XBox have pixel shaders?
Seriously, the screenshots I've seen suggest that it DOESN'T have
pixel shaders. Am I wrong? Does the water in XBox morrowind look as
good as water in PC morrowind with a GF3 or GF4Ti? Or does it look
like water in PC morrowind with a Gf2 or Gf4mx?
 
Darthy said:
Where did you come up with this bad and severely incorrect
information? The 4MX series are nothing more than re-named GF2 cards
with optional dual output. Nothing more. They have no features of
the GF3 or true GF4 cards.

No 3D features, but you may want to check on the memory architecture.
Also, the 4MX 440 and 460 are at least twice the speed of any card
sold as a Geforce2 - the 4MXs are quite _fast_; they're just lacking
in _features_.
If the XBOX was based on an MX440 core, it would be a total piece of
shit video game system; it wouldn't even be half the system it is
today. The MX440 is sub-standard in performance and ability compared to a
GF3 Ti200, which sold for about the same price when the 440 was new

A 440MX is actually about the same speed as a Geforce3 Ti200, and is
more than sufficient for a system with a 733MHz Intel Celeron CPU that is
intended to run at very low (TV) resolutions.
 
_Vertex_ shaders? Aren't those normally done in software?

Not on cards with hardware acceleration for them, which is every Nvidia chip
from GF3 and up.
Does XBox have pixel shaders?

I already answered yes to that. Of course it does, since the XGPU is based
on the GF3.
Seriously, the screenshots I've seen suggest that it DOESN'T have
pixel shaders.

And I suggest maybe you've looked at titles not using pixel shaders, or
maybe not understood what it is you're looking at. Static screenshots don't
always distinguish a pixel shader from standard multitexturing.
Does the water in XBox morrowind look as
good as water in PC morrowind with a GF3 or GF4Ti?

Yes.
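For readers wondering what a pixel shader actually buys you over multitexturing in a case like Morrowind's water: the key trick is the dependent texture read, where one texture's value perturbs the lookup into another. The sketch below is a hypothetical software simplification in Python (real DX8-era shaders run on the GPU as shader assembly); the function names and the tiny 2x2 "textures" are invented purely for illustration:

```python
def sample(tex, u, v):
    """Nearest-neighbour texture sample with wrap-around addressing."""
    h, w = len(tex), len(tex[0])
    return tex[int(v * h) % h][int(u * w) % w]

def water_pixel(reflection_tex, bump_tex, u, v, strength=0.05):
    """Per-pixel water: a bump map perturbs the reflection lookup.

    Fixed-function multitexturing can only blend textures through fixed
    combiners; it cannot let one texture's value drive the coordinates of
    another lookup, which is what makes shader water ripple convincingly.
    """
    du, dv = sample(bump_tex, u, v)            # read the perturbation
    return sample(reflection_tex, u + du * strength, v + dv * strength)

# Toy textures: reflection colours (RGB) and bump offsets (du, dv).
reflection = [[(0, 0, 255), (0, 64, 255)],
              [(0, 128, 255), (255, 255, 255)]]
bump = [[(0.0, 0.0), (1.0, 0.0)],
        [(0.0, 1.0), (1.0, 1.0)]]

print(water_pixel(reflection, bump, 0.25, 0.25))
```

A flat bump map leaves the reflection undisturbed; a rippled one shifts each pixel's reflection lookup independently, which is exactly the per-pixel effect a screenshot may or may not make obvious.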
 
Not on cards with hardware acceleration for them, which is every Nvidia chip
from GF3 and up.


I already answered yes to that. Of course it does, since the XGPU is based
on the GF3.


And I suggest maybe you've looked at titles not using pixel shaders, or
maybe not understood what it is you're looking at. Static screenshots don't
always distinguish a pixel shader from standard multitexturing.

So anyway, I tried out this pixel shading whatchamacallit for myself. I got
me a green texta (the sort you use to unlock the copy protection on
CD-ROMs) and tried shading the pixels on my brand new NINETEEN-INCH
TITANIUM MONITOR (it really gets the girls, let me tell you). But it didn't
work out that well. So now I want to know, do any of these cards feature
pixel whiteout as well, or am I screwed? ThaADVANCEnks!
 
"Nowhere near" a GF4MX - meaning that it's _far superior_, you incorrectly
inferred from my post that I meant it was inferior, which was the opposite.

That didn't help.

You said it's a mix between GF3 & GF4; considering its actual model #,
it has nothing of the GF4 (proper) built in... but since it's
SPECIAL for a console, it would have a few things added there
JUST for the console - for example, the total lack of desktop 2D &
OpenGL functions.

Because of your sentence structure, "nowhere near the GF4mx", I
guess it could go both ways. Adding "lame" or "bad" would have helped
;)
 
DUDE!! I get my CrAzY pricing from my suppliers, dealers, stores, the
internet... DUH!

Look at the POSTS from me and others - we are telling people to buy
the LAST of the Ti4200s because they are CHEAPER and twice the
horsepower of a 5200. The prices will vary depending on the dealer!

Ti4200 = $70~125
fx5200 = $60~175

When the MX440 was NEW, it was a $125~150 video card, the MX460 = $180, and
the Ti4200 was $200~250. At that time, the GF3 Ti card was hitting
$125 and quickly went down... like the fire sales of older cards
from various stores (CompUSA selling ATI 9600 Pros for $130 when they
were $200 three weeks ago).
 
So anyway, I tried out this pixel shading whatchamacallit for myself. I got
me a green texta (the sort you use to unlock the copy protection on
CD-ROMs) and tried shading the pixels on my brand new NINETEEN-INCH
TITANIUM MONITOR (it really gets the girls, let me tell you). But it didn't
work out that well. So now I want to know, do any of these cards feature
pixel whiteout as well, or am I screwed? ThaADVANCEnks!

Don't quit your day job Hong.

Your standup routine is in need of some serious polishing...
 
You said it's a mix between GF3 & GF4; considering its actual model #,
it has nothing of the GF4 (proper) built in...

The chip's model # is NV2A, which doesn't say anything about its
capabilities. It's just a code name, really, not meant to be indicative of
its functionality.
but since it's
SPECIAL for a console, it would have a few things added there
JUST for the console - for example, the total lack of desktop 2D &
OpenGL functions.

There are no specific OpenGL functions in any 3D accelerator, just as there
are no "Linux" or "Windows" instructions in an x86 CPU. I don't know for
sure about the 2D engine, but I would think it is still present, as a
blitter always comes in handy for building menus and such. Besides, it is a
small minority of the silicon die anyway.
 
Don't quit your day job Hong.

Your standup routine is in need of some serious polishing...

I polish my standup routine every day, if you know what I mean, and I think
you do.
 