Games - 16 versus 32 bit, differences?


pjp

I've always run my desktop at 16-bit. I also got in the habit of running
games at 16-bit to raise the fps/resolution on my old PIII with a Radeon 64DDR VIVO (7200).

Now I have a P4 and a Ti4800SE, and the Ti's "shader engine" is obvious. However,
using the Ti, I've tried various games at both 16 and 32-bit and have failed (so
far) to notice any discernible difference in any of the special effects. Are
there "things" I should be specifically looking for?

I've tested using NFS:U, Doom 3, Richard Burns Rally, Painkiller, MOA-PA and
a couple of other "pretty obvious" graphics-intensive games.

Mind you, I'm 55 and my eyes aren't as good anymore :(
 
There are typically two things to look for:

1. Banding - a 16-bit framebuffer usually allocates 5 bits each to red and
blue and 6 to green (RGB565), and 16-bit textures with alpha often use 4 bits
per channel (ARGB4444). So you get as few as 2^4 = 16 levels of each color,
and only 16 levels of transparency. On large, smooth-shaded objects, and in
sky textures, there's often much "banding", or abrupt color gradient
transitions (see the quantization sketch after this list).
2. Dithering - some drivers attempt to simulate more than 16 levels of each
color by dithering, so you end up with grainy textures and a "screen door"
effect on transparent effects like smoke and fog (a dithering sketch follows
as well).
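
To make the banding concrete, here's a minimal Python sketch (mine, not
anything out of a game or driver; quantize is a hypothetical helper) of what
16-bit quantization does to a smooth gradient:

def quantize(value8, bits):
    """Quantize an 8-bit channel to `bits` bits, then expand back to 0..255."""
    levels = 1 << bits                     # e.g. 16 levels for 4 bits
    q = value8 >> (8 - bits)               # drop the low-order bits
    return round(q * 255 / (levels - 1))   # rescale to the 0..255 range

# A smooth 0..255 gradient collapses into a handful of flat "bands":
gradient = [quantize(v, 4) for v in range(256)]
print(len(set(gradient)))  # 16 distinct output values instead of 256

Those 16 flat plateaus are exactly the stripes you see in a 16-bit sky.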
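
And a similar sketch of ordered dithering, the trick drivers use to fake extra
levels; the 2x2 Bayer pattern here is the classic textbook version, not any
particular driver's implementation:

BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 ordered-dither thresholds

def dither_channel(value8, bits, x, y):
    """Quantize with a position-dependent threshold instead of plain truncation."""
    step = 256 >> bits                             # size of one quantization step
    threshold = BAYER_2X2[y % 2][x % 2] * step // 4
    v = min(255, value8 + threshold)               # nudge the value up by the pattern...
    return (v >> (8 - bits)) << (8 - bits)         # ...then truncate to `bits` bits

# The same input shade lands on two different levels across a 2x2 block,
# averaging back to roughly the original value (here 100) at a distance:
print([[dither_channel(100, 4, x, y) for x in range(2)] for y in range(2)])
# -> [[96, 96], [112, 96]]

At a distance the pattern averages out to an in-between shade; up close it
reads as the grainy "screen door" on smoke and fog.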

If you cannot see a difference, just play in 16-bit color. :-) In any case,
modern video cards are powerful enough that 32-bit color usually doesn't incur
a performance drop.
 
pjp said:
I've tested using NFS:U, Doom 3, Richard Burns Rally, Painkiller, MOA-PA and
a couple of other "pretty obvious" graphics-intensive games.

Mind you, I'm 55 and my eyes aren't as good anymore :(

Lucky you! As First of One said, there should be a noticeable difference
with shadows, smoke and in sky-boxes. But this depends on the individual.
It's like the question of whether more than 30 fps is needed for a computer
game - in my opinion YES, I can see and feel the difference between 30 and
60 fps. But I am only 41... :-)
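
For what it's worth, the arithmetic behind that 30-vs-60 feeling is just the
per-frame time budget; a quick sketch (mine, not from the thread):

for fps in (30, 60):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
# 30 fps = 33.3 ms per frame
# 60 fps = 16.7 ms per frame

Doubling the frame rate halves the frame time, and with it the worst-case
delay between your input and what you see on screen.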

If you see no difference, don't bother. Sooner or later you will stumble
across a game that NEEDS the 32-bit setting. The last such game that I
remember was "Morrowind", but since then I have always kept my desktop at 32-bit.

Cheers,
Chris
 
First of One said:
There are typically two things to look for:

1. Banding - a 16-bit framebuffer usually allocates 5 bits each to red and
blue and 6 to green (RGB565), and 16-bit textures with alpha often use 4 bits
per channel (ARGB4444). So you get as few as 2^4 = 16 levels of each color,
and only 16 levels of transparency. On large, smooth-shaded objects, and in
sky textures, there's often much "banding", or abrupt color gradient
transitions.

Probably why MS Rallisport Challenge's sky looks the way it does; I'll check
32-bit and see if there's any difference.
2. Dithering - some drivers attempt to simulate more than 16 levels of each
color by dithering, so you end up with grainy textures and a "screen door"
effect on transparent effects like smoke and fog.

Can't say I've ever noticed this.
If you cannot see a difference, just play in 16-bit color. :-) In any case,
modern video cards are powerful enough that 32-bit color usually doesn't incur
a performance drop.

Probably best to change habits and run 32-bit, as the newer system doesn't
seem to mind and there's no noticeable performance hit that I can tell.

Thanks for the info.
 
Christian Atteneder said:
Lucky you! As First of One said, there should be a noticeable difference
with shadows, smoke and in sky-boxes. But this depends on the individual.
It's like the question of whether more than 30 fps is needed for a computer
game - in my opinion YES, I can see and feel the difference between 30 and
60 fps. But I am only 41... :-)

If you see no difference, don't bother. Sooner or later you will stumble
across a game that NEEDS the 32-bit setting. The last such game that I
remember was "Morrowind", but since then I have always kept my desktop at 32-bit.

It seems to me a couple of games I run don't even allow changing the bit
depth, i.e. they're always 32-bit. I do agree the more fps the better (at
least up to the monitor's refresh rate).

Thanks for the info.
 
I don't know what people here are saying; the difference between 16-bit
(65,536 colors) and 32-bit (16.7 million colors) is huge. From pictures to
banding to gradient color shading, it can easily be noticed.
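
The arithmetic behind those two figures, as a quick Python sketch (assuming
the usual RGB565 and XRGB8888 pixel layouts; the extra 8 bits of a 32-bit
pixel are alpha or padding, not more colors):

r5g6b5   = 2**5 * 2**6 * 2**5   # 65,536 colors in a 16-bit framebuffer
x8r8g8b8 = 2**8 * 2**8 * 2**8   # 16,777,216 colors in a 32-bit one
print(r5g6b5, x8r8g8b8)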

Also, the Ti4800 has no DirectX 9 shader engine whatsoever; shader model 2.0
effects are a DirectX 9 feature, and the Ti4800 is only 100% DirectX 8 (not
even 8.1) capable - its pixel shaders top out at version 1.3.

While it's a good card, the visual difference and capability gap compared to
a true DirectX 9 card is huge and visible.
 
Rick said:
I don't know what people here are saying; the difference between 16-bit
(65,536 colors) and 32-bit (16.7 million colors) is huge. From pictures to
banding to gradient color shading, it can easily be noticed.

I think so too and people who claim otherwise must be blind.
 
I see where you're coming from, pjp.
I noticed just the same thing on my other Ti4400-based rig. Therefore I
played mostly at 16-bit.
The simple fact is that certain games which run at a certain pace just don't
make you notice the difference.
Of course no one here is blind; the differences are there in many games.
They're just not so noticeable in some other games. And if playing at 16-bit
gives the same satisfaction, then by all means.
Oh, and for the record, the GF4 cards do suffer some performance impact when
playing at 32-bit.
 