For the ATI expert gamers.

  • Thread starter: DreamMaker

DreamMaker

What's the difference when playing in 1600x1200 mode vs the 1024x768
one?

I don't get it. I've tried it but I couldn't tell any difference.
 
I don't get it. I've tried it but I couldn't tell any difference.

I wish I was that lucky :-)
 
DreamMaker said:
What's the difference when playing in 1600x1200 mode vs the 1024x768
one?

I don't get it. I've tried it but I couldn't tell any difference.


It's a blessing in disguise if you can't tell the difference.
Admittedly, I'm pretty much the same way. I'm even capable of
"settling" on 800x600 if I have to, but I do notice the difference
between it and 1024x768. Of course, for general desktop use, I love the
higher resolutions, but my monitor is crappy when I go above 1152x864
(or whatever that oddball resolution is).
 
It depends. I think everybody can see the difference if they are using a 24"
monitor. On my 21" I can certainly see the difference between 1280x1024 and
1024x768. I have to run 1/2 my games at 1024x768 because my video card is
wimpy though :(

bye, Rick
 
Depending on the game, I run anywhere from 1792x1344 down to 800x600,
based on what frame rate my system can maintain.

The only res change that doesn't make a noticeable difference is the step
from 1600x1200 to 1792x1344; all the others seem obviously better to me.

That's for most games; there are some where a higher res only amounts to
blockier graphics. Seems most prevalent with poorly done PS2 and Xbox ports.

I wish I had the horsepower to run them all at the highest level :(

Note - I have one exception to running at the highest res I can. I find Links
2003 a much easier game to play at 800x600 than at higher res, as the higher
res makes the swing meter relatively smaller and so harder to hit a straight
shot. Granted, higher res looks better, but my score suffers :)
 
What is the dot pitch of your monitor? It sounds like your monitor can't
display the higher resolution clearly.
What game or program did you try?
 
It depends. I think everybody can see the difference if they are using a 24"
monitor. On my 21" I can certainly see the difference between 1280x1024 and
1024x768. I have to run 1/2 my games at 1024x768 because my video card is
wimpy though :(

bye, Rick


You're right when saying that your frame rate goes way down at higher
resolution, and that's the case with my ATI card too. But what is the
technical difference between 1600x1200 and 1024x768? Here my question
tends to be: how much detail will I be able to see? I.e., when I look at a
circle, in lower resolution it might be a polygon (a multi-sided
2D object), but in higher resolution it might be the circle that
everyone wants to see.

And that is why I bought the big guns from ATI, but as I said, I can't
see the difference.

Headheck!!!

... Hmmm, I can see ("read") that there are several people in here who
have the same opinion...
 
I think... and therefore this is just an assumption: increasing the
resolution of the screen only improves the aesthetics of the display and not
necessarily the quality of play (for most games). One example (game) I had
where you actually got better game play was a game called "Subspace". As
you increased the resolution of the game, you actually had more "playing
area" on the screen at one time. A big advantage for a head-to-head shooter
game. Like pjp mentioned in the other email, there are some disadvantages to
running at a higher resolution (in some games). I've found that several games
that have their status bars at the bottom of the screen will shrink the
status bar too far to be useful, and it's not as easy to notice when your
life or energy runs out.

bye, Rick
 
You're right when saying that your frame rate goes way down at higher
resolution, and that's the case with my ATI card too. But what is the
technical difference between 1600x1200 and 1024x768? Here my question
tends to be: how much detail will I be able to see? I.e., when I look at a
circle, in lower resolution it might be a polygon (a multi-sided
2D object), but in higher resolution it might be the circle that
everyone wants to see.

And that is why I bought the big guns from ATI, but as I said, I can't
see the difference.

A circle won't look much different; all that changes is the size
and number of the pixels that make up that circle. FSAA will have more
of an impact on making it appear more circular than just bumping up
the resolution.
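
To make that concrete, a rough sketch (an illustration only, not anything from this thread) that rasterizes the same circle onto a coarser and a finer pixel grid and counts the stair-stepped edge pixels might look like this - the circle's geometry never changes, you just get more, smaller steps at the higher resolution:

# Illustration only: rasterize the same circle on two pixel grids and count
# the stair-stepped edge pixels. The circle itself never changes; only the
# number and size of the pixels along its edge do.
def edge_pixels(grid_size):
    radius = grid_size * 0.4              # same circle, scaled to the grid
    cx = cy = grid_size / 2.0
    count = 0
    for y in range(grid_size):
        for x in range(grid_size):
            inside = (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < grid_size and 0 <= ny < grid_size:
                    if ((nx - cx) ** 2 + (ny - cy) ** 2 <= radius ** 2) != inside:
                        count += 1        # this pixel sits on the jagged edge
                        break
    return count

for size in (96, 150):                    # roughly the 1024-to-1600 ratio
    print(size, "pixel grid ->", edge_pixels(size), "edge pixels")

The finer grid should report roughly 1.5x as many edge pixels, each of them smaller - which is why FSAA (shading those edge pixels with intermediate colours) does more for apparent roundness than the extra pixels do.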
 
I've found that several games that have their
status bars at the bottom of the screen will shrink the status bar
too far to be useful, and it's not as easy to notice when your life or
energy runs out.

Yeah. Most of the time I just whap the FSAA up to 4x and stay at 1024x768 for this reason. Much
nicer than 1600x1200, although the purists will disagree, saying I am increasing apparent quality,
not actual quality.... but hey, it's only a game :)

The great thing about my last card upgrade (9500 Pro >> 9800 Pro) is the increase in image quality
while maintaining constant fps. I find this odd, because most of the marketing and geek sites
concentrate on 'max playable resolution' or 3DMarks (my system gets just under 13000 for the
record... if I spent five hundred quid on new everything, I get the impression I might make 17000,
which doesn't seem a big incentive to upgrade!) in whatever the new big game is, both of which are
irrelevant to many gamers (all effects on at a decent res seems a better bet for enjoyment).

I guess longevity of the hardware is implied by cards that can run at high resolutions, but it seems
like a false economy whichever way you look at it... I bought the 9800 Pro because it is cheap
(because it is about to become obsolete), but I expect not to see it mentioned in the minimum specs
of a game for some time to come!

Also, I'm finding that many games just do not need the processing power... many posts imply that my
XP2000+ is underpowered, but overclocking it tends to result in no discernible increase in
responsiveness for many games... In fact, the only games that tend to really tax the processor
(Lock On and OFP) don't have a hardcore gamehead following at all, so I wonder if it's all down to
bragging rights. The last two games I have played (BloodRayne, Battlefield Vietnam) don't seem to
slow down at all on my pokey old system... well, BV does if I specify 64 'bots, but it's fine for up
to 40. I *did* see slowdowns in some games, but upgrading the memory from 512MB to 1024MB seemed to
kill the problem completely (and btw, memory speed seems to be the biggest con of all...
overclocking my memory by 40% seems to gain me 5-10% in real performance! Slow memory is already
expected and covered in the hardware by caching, so why bother with super fast memory?).

S
 
I think... and therefore this is just an assumption: increasing the
resolution of the screen only improves the aesthetics of the display and not
necessarily the quality of play (for most games). One example (game) I had
where you actually got better game play was a game called "Subspace". As
you increased the resolution of the game, you actually had more "playing
area" on the screen at one time. A big advantage for a head-to-head shooter
game. Like pjp mentioned in the other email, there are some disadvantages to
running at a higher resolution (in some games). I've found that several games
that have their status bars at the bottom of the screen will shrink the
status bar too far to be useful, and it's not as easy to notice when your
life or energy runs out.

bye, Rick


So higher resolution will benefit only those particular games.

Plus it has some disadvantages, like you've said..

OK.
Thank you, everyone...
 
DreamMaker said:
You're right when saying that your frame rate goes way down at higher
resolution, and that's the case with my ATI card too. But what is the
technical difference between 1600x1200 and 1024x768? Here my question
tends to be: how much detail will I be able to see? I.e., when I look at a
circle, in lower resolution it might be a polygon (a multi-sided
2D object), but in higher resolution it might be the circle that
everyone wants to see.

The resolution has nothing to do with the detail level of the model. A
square is a square at 640x480 and at 1600x1200. A "circle" that is really a
32-sided polygon is still a 32-sided polygon at a higher resolution.

The higher the resolution, the smaller the pixels and the more detail you
can fit in, which means that the pattern or texture on the "circle" might
look better, but the edge of the "circle" will still be made up of the same
straight lines. Straight lines that are not horizontal or vertical will
appear jagged (aliased) - the higher the resolution, the smaller this effect.
Additionally, you can reduce the aliasing with (wait for it...)
anti-aliasing.
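
A back-of-the-envelope way to see why the model, not the resolution, is the limit (my own sketch, with made-up on-screen radii): work out how far the flat edges of a 32-sided "circle" sag below a true circle at a given size. Once that error is under a pixel, neither more pixels nor more sides change what you see:

import math

# For a regular N-gon inscribed in a circle of radius r (in pixels), the
# worst-case gap between an edge and the true circle is r * (1 - cos(pi/N)).
def max_deviation_px(sides, radius_px):
    return radius_px * (1 - math.cos(math.pi / sides))

for radius in (100, 200, 400):            # hypothetical on-screen radii
    dev = max_deviation_px(32, radius)
    verdict = "looks round" if dev < 1.0 else "visibly faceted"
    print(f"radius {radius}px: 32-gon is off by {dev:.2f}px -> {verdict}")

At typical on-screen sizes the 32-gon is already within a pixel of a true circle, so bumping the resolution mostly buys you finer (but still jagged) edges and sharper textures, as described above.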
And that is why I bought the big guns from ATI, but as I said, I can't
see the difference.

Games with more polygons (nearly everything in a 3D computer world is made
up of a mesh of triangles) will work better on a faster video card than
games with fewer polygons. Turning up the detail levels may have this
effect. Essentially, a better video card plays the same games at a higher
frame rate; additionally, you may be able to up the detail levels and
general visual quality (using anti-aliasing, anisotropic filtering, lighting
effects, surface effects and many other techniques).
Headheck!!!

... Hmmm, I can see ("read") that there are several people in here who
have the same opinion...


A video card doesn't change the game. The game still needs to have the
higher polygon count for your circles to appear more circular; resolution
alone will not change that. The point is that most games have different
detail levels in the graphics options to account for fast and slow cards.
Try looking for the graphics options in your games and having a play - you
may find that your circles can be more circular without adversely affecting
your frame rate.

Ben
 
Sham said:
Yeah. Most of the time I just whap the FSAA up to 4x and stay at
1024x768 for this reason. Much nicer than 1600x1200, although the
purists will disagree, saying I am increasing apparent quality, not
actual quality.... but hey, it's only a game :)

I tend to play my games at 1600x1200 on a 19" Trinitron without AA. To be
honest I haven't played with AA that much, but I have found that the best way
to do AA is game dependent. Essentially it is usually worth turning it on at
2x for ATI cards and 4x for GeForce (since 2x on a GeForce has next to no
effect). This helps with edges that appear to creep - usually long straight
edges that are at a low angle (close to horizontal or vertical) and are
moving slowly. AA is essentially a "blur", but I'm under the impression
it's a blur on an oversampled original in order to downsample it to your
actual resolution - in that sense it is better than a simple blur.

There are benefits to both a higher res and AA, I'd say run at the highest
res you can, with 2x AA on - that's gonna be a good starting point.
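
If it helps, the oversample-then-downsample idea above can be sketched in a few lines (a toy example of my own, assuming plain 2x2 ordered-grid supersampling rather than whatever a real driver does): render at twice the resolution, then average each 2x2 block down to one pixel, so edge pixels end up grey instead of a hard on/off step.

# Toy supersampling: shade each final pixel with the average coverage of a
# 2x2 block of subsamples, so edges get intermediate shades instead of steps.
def covered(sub_x, sub_y, scale, radius=6.0, cx=8.0, cy=8.0):
    # 1.0 if this subsample's centre lies inside a circle, else 0.0
    px, py = (sub_x + 0.5) / scale, (sub_y + 0.5) / scale
    return 1.0 if (px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2 else 0.0

SIZE, AA = 16, 2                           # 16x16 output, rendered at 32x32
for y in range(SIZE):
    row = ""
    for x in range(SIZE):
        subs = [covered(x * AA + sx, y * AA + sy, AA)
                for sy in range(AA) for sx in range(AA)]
        shade = sum(subs) / len(subs)      # the "downsample" step
        row += " .:#"[round(shade * 3)]    # crude four-level greyscale
    print(row)

Set AA = 1 and the edge collapses back to a hard black/white staircase, which is the apparent-quality difference being argued about here.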
Also, I'm finding that many games just do not need the processing power...
many posts imply that my XP2000+ is underpowered, but overclocking it
tends to result in no discernible increase in responsiveness for many
games...

Well, that's game dependent!
In fact, the only games that tend to really tax the processor
(Lock On and OFP) don't have a hardcore gamehead following at all, so I
wonder if it's all down to bragging rights. The last two games I have
played (BloodRayne, Battlefield Vietnam) don't seem to slow down at all on
my pokey old system... well, BV does if I specify 64 'bots, but it's fine
for up to 40.

Many games (and modern benchmarks) will top out at your CPU, way before your
9800 Pro is the bottleneck. Other games won't. It seems that unless you
have a 3000+ processor, there's little point in upgrading from a 9800 Pro.
I *did* see slowdowns in some games, but upgrading the
memory from 512MB to 1024MB seemed to kill the problem completely
Good.

(and btw, memory speed seems to be the biggest con of all... overclocking my
memory by 40% seems to gain me 5-10% in real performance!

Did you up your FSB with the memory? If not, how do you expect to get that
extra bandwidth to the CPU? And what exactly is "real performance"? Are we
talking MS Office? Gaming? Video encoding? Whether or not memory
bandwidth is a concern is highly dependent on your application. Again,
whether your CPU, your video card, or indeed your memory bandwidth is the
bottleneck in any situation depends on that situation.
Slow memory is
already expected and covered in the hardware by caching, so why bother
with super fast memory?).


Err... it is offset a bit by caching, not "covered". Well, not unless your
cache is bigger than 512MB, since you have already said that adding more than
that helped. And what exactly IS cache? Fast memory with low latency, and
if that helps, then why not propagate those properties to your system RAM?
Yes it's expensive, but it can, and in many cases does, help.

If you're talking about gaming, then you are chucking a huge amount of data
about, and not all of it can live in the cache.

Ben
 
The resolution has nothing to do with the detail level of the model. A
square is a square at 640x480 and at 1600x1200. A "circle" that is really a
32-sided polygon is still a 32-sided polygon at a higher resolution.

The higher the resolution, the smaller the pixels and the more detail you
can fit in, which means that the pattern or texture on the "circle" might
look better, but the edge of the "circle" will still be made up of the same
straight lines. Straight lines that are not horizontal or vertical will
appear jagged (aliased) - the higher the resolution, the smaller this effect.
Additionally, you can reduce the aliasing with (wait for it...)
anti-aliasing.


Games with more polygons (nearly everything in a 3D computer world is made
up of a mesh of triangles) will work better on a faster video card than
games with fewer polygons. Turning up the detail levels may have this
effect. Essentially, a better video card plays the same games at a higher
frame rate; additionally, you may be able to up the detail levels and
general visual quality (using anti-aliasing, anisotropic filtering, lighting
effects, surface effects and many other techniques).



A video card doesn't change the game. The game still needs to have the
higher polygon count for your circles to appear more circular; resolution
alone will not change that. The point is that most games have different
detail levels in the graphics options to account for fast and slow cards.
Try looking for the graphics options in your games and having a play - you
may find that your circles can be more circular without adversely affecting
your frame rate.

Ben


Dhaa, come on, I think I know that getting a bigger video card will help
me get the most out of the game. Maybe I was confused about how to
put things... anyway, big sellers like ATI or NVIDIA tend to
promote their video cards as if they possess supposedly new features
that will make games more realistic and faster than ever. I fell
for it, yep.



PS: I play all of my games at full detail at 1024x768, as it's the most
common and reliable resolution that supports high-resolution
textures. E.g.: Halo, Splinter Cell 2, and Tron 2.0.

... To those who want Splinter Cell 2: pff, the first one is better,
and that includes the add-on too. The second is just an assembly-line
video game...
 
What is the dot pitch of your monitor? It sounds like your monitor can't
display the higher resolution clearly.
What game or program did you try?

Hmm, my dot pitch is very big: 0.27mm.
The monitor is a refurbished PC CRT screen.
It's an old 50-pound (lol) NEC MultiSync XV17+ monitor.
I know that there are good 19" monitors that have a 0.20mm dot pitch,
but I would rather get a 17" with a good refresh rate, like 100Hz
at 1024x768, which is not the case for now with my monitor.
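
A quick sanity check on that (my own rough numbers: a 17" CRT has roughly 12.8" / 325mm of viewable width) is to compare the pixel pitch each resolution asks for against the 0.27mm dot pitch. Once the pixels get finer than the phosphor dots, the tube can't resolve them cleanly, which would go a long way towards explaining not seeing much difference at 1600x1200:

# Rough check: how wide is one pixel at each resolution on a 17" CRT with
# roughly 325mm of viewable width, compared with a 0.27mm dot pitch?
VIEWABLE_WIDTH_MM = 12.8 * 25.4        # ~325mm, a typical 17" viewable area
DOT_PITCH_MM = 0.27

for horizontal_pixels in (800, 1024, 1280, 1600):
    pixel_pitch = VIEWABLE_WIDTH_MM / horizontal_pixels
    verdict = "resolvable" if pixel_pitch >= DOT_PITCH_MM else "finer than the phosphor dots"
    print(f"{horizontal_pixels:>4} wide: {pixel_pitch:.3f}mm per pixel -> {verdict}")

(That's only approximate, since dot pitch is usually quoted on the diagonal, but the trend is the point.)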

I'm playing Halo, Need for Speed Underground, Prince of Persia:
The Sands of Time, Tron 2.0 (great game with a good 3D engine), Splinter
Cell: Pandora Tomorrow, Doom 3 beta 2 (very slow but playable), and
Max Payne 2 (that one has a great 3D engine too).

The two games with well-made 3D engines are very playable.
I had the first Splinter Cell running on an FX 5200 TD, so that's not a
problem here. I would say the problem with my graphics quality is my poor
monitor, which tends to degrade my PC gaming experience, and secondly poor
programmers who produce bullshit for a living...
 
Many games (and modern benchmarks) will top out at your CPU, way
before your 9800 Pro is the bottleneck. Other games won't. It seems
that unless you have a 3000+ processor, there's little point in
upgrading from a 9800 Pro.
Yeah, true. My impression is that the monitor refresh rate tops out before both of them :)

If you're talking about gaming, then you are chucking a huge amount
of data about, and not all of it can live in the cache.

Um, no. That's out of context of what actually happens.

I remember doing all sorts of theoretical calculations in VLSI design back in college years ago (I'm
an electronics graduate, but from the 1990s, so that qualification may be redundant now ;) and the
upshot was that memory speed increases do little to increase overall performance. Localized CPU
caching and predictive pipelining mechanisms more or less make really fast main memory redundant.
I'm guessing, but it looks like modern PC designs chuck the only exception (gfx texture memory,
noting that vertex memory is relatively memory-light and suited to pipelining in a cache) at the gfx
card and away from the CPU (apart from loading data into the gfx card), so it's not an issue for
almost any of the time a game is running, assuming adequate gfx memory size.

The only time textures are handled on the mobo is when they are loaded into the video card memory
(as noted above). This can cause a pause in the graphics, but you still get the pause whatever you
do. Doubling the speed of the memory doesn't get rid of it, and I suspect that even PCI Express won't
really affect it (it will simply double or treble the transfer rate, which is incremental rather
than anything to write home about - you will still see slowdowns, and will certainly not be able to
use main RAM as gfx memory until we get up to orders of magnitude faster than AGP 8x... and early PCI
Express specs point to less than orders of magnitude; x64 in a year if we are lucky).

I also believe that my increase in memory allowed the system to store texture data in RAM rather
than use virtual memory (i.e. the hard drive), and this eases reloading the gfx memory. The change
from HD to RAM is much bigger than any incremental increase in memory speed.
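
Rough numbers (mine, and only ballpark) back that up: the bus to main memory is an order of magnitude slower than the memory sitting on the card, so textures really do want to stay in video RAM once they're loaded:

# Ballpark bandwidth comparison (approximate figures, not benchmarks).
def gb_per_s(bus_bits, million_transfers_per_s):
    return bus_bits / 8 * million_transfers_per_s / 1000.0

agp_8x        = gb_per_s(32, 533)     # 66MHz base clock, 8 transfers per clock
pcie_x16_gen1 = 4.0                   # first-gen PCI Express x16, per direction
local_vram    = gb_per_s(256, 680)    # 9800 Pro: 256-bit bus, ~340MHz DDR memory

print(f"AGP 8x          : ~{agp_8x:.1f} GB/s")
print(f"PCI Express x16 : ~{pcie_x16_gen1:.1f} GB/s")
print(f"9800 Pro local  : ~{local_vram:.1f} GB/s  ({local_vram / agp_8x:.0f}x the AGP bus)")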

And anyway, technology in general seems to have been a bit of a con. They go on about Moore's law
and all that crap, but it's past 2000 and I still haven't got that personal fusion-powered jetpack
and cyborg sex slave.

All we have is the Segway and the Xbox. No fair!

:))

S
 
Dhaa, come on, I think I know that getting a bigger video card will help
me get the most out of the game. Maybe I was confused about how to
put things... anyway, big sellers like ATI or NVIDIA tend to
promote their video cards as if they possess supposedly new features
that will make games more realistic and faster than ever. I fell
for it, yep.

Maybe if you bothered to read some reviews of video cards on the net
rather than just reading the marketing hype on the video card box, you
would make better-informed buying decisions.
 
I tend to play my games at 1600x1200 on a 19" Trinitron without AA. To be
honest I haven't played with AA that much, but I have found that the best way
to do AA is game dependent. Essentially it is usually worth turning it on at
2x for ATI cards and 4x for GeForce (since 2x on a GeForce has next to no
effect). This helps with edges that appear to creep - usually long straight
edges that are at a low angle (close to horizontal or vertical) and are
moving slowly. AA is essentially a "blur", but I'm under the impression
it's a blur on an oversampled original in order to downsample it to your
actual resolution - in that sense it is better than a simple blur.

Personally, I like to see things far off, and blurring them doesn't
help. Like sniping in BFV. Besides, I'm usually moving quickly and
worried about getting shot, so admiring the way lines don't stair-step
doesn't come into it.
There are benefits to both a higher res and AA, I'd say run at the highest
res you can, with 2x AA on - that's gonna be a good starting point.

Here's the thing: a 19-inch monitor is optimized for 1024x768. This
keeps graphics crisp, and you can spot individual pixels better. There
is no shame in playing at this res.
 
Sham said:
Yeah, true. My impression is that the monitor refresh rate tops out
before both of them :)

I'm not sure I'm getting 85fps (I run 1600x1200@85Hz) in most of my games,
but I would find it hard to tell anything above maybe 30.
Um, no. That's out of context of what actually happens.

I remember doing all sorts of theoretical calculations in VLSI design
back in college years ago (I'm an electronics graduate, but from the
1990s, so that qualification may be redundant now ;) and the upshot was
that memory speed increases do little to increase overall performance.

...that's entirely application dependent. I could write you a program that
is completely CPU limited, or one that is memory bandwidth limited.
Obviously "normal" applications are somewhere in between.
Localized CPU caching and predictive pipelining mechanisms more or less
make really fast main memory redundant.

In terms of running many programs, yes, I completely agree. The general
rule is that 90% of the time is spent in 10% of the code. This 10% often
consists of fairly tight loops - you can cache that loop and not have to
access memory until you jump out of the loop. However, if that loop is
basically just shifting data from x to y, then the loop will likely be
memory bandwidth limited, not CPU limited.

Take my CPU for example: FSB of 166MHz, multiplier of 11. So assuming I can
saturate the FSB, that's 64 bits * 2 transfers per clock * 167MHz, which is
about 667 million 32-bit words per second. According to Sandra I can do
7000 MIPS, so unless I need to do more than about 10 "instructions" to each
word, I'm gonna be bandwidth limited. If you take Sandra's memory bandwidth
bench, it says I'm a little under 90% of my theoretical limit, giving around
11 instructions.

So there you have it: if you need to do more instructions to each piece of
data than your multiplier, you're likely CPU limited; otherwise you're
bandwidth limited. Of course, that's horribly oversimplified, as any useful
application will usually depend on more than just one simple loop.
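
Spelled out (same assumptions as above: a double-pumped 166MHz, 64-bit FSB, Sandra's ~7000 MIPS, and counting 32-bit words), the arithmetic goes something like:

# The back-of-the-envelope above, written out: how many instructions can the
# CPU spend per 32-bit word before the FSB becomes the bottleneck?
FSB_MHZ           = 166.7      # Athlon XP front-side bus clock
TRANSFERS_PER_CLK = 2          # double-pumped ("FSB 333")
BUS_BYTES         = 8          # 64-bit data bus
WORD_BYTES        = 4          # counting 32-bit words, as in the post
SANDRA_MIPS       = 7000       # quoted instruction throughput, millions/s

peak_mwords = FSB_MHZ * TRANSFERS_PER_CLK * BUS_BYTES / WORD_BYTES   # million words/s
achieved_mwords = 0.9 * peak_mwords                                  # ~90% of theoretical

print(f"peak bandwidth       : ~{peak_mwords:.0f} million words/s")
print(f"break-even (peak)    : {SANDRA_MIPS / peak_mwords:.1f} instructions per word")
print(f"break-even (achieved): {SANDRA_MIPS / achieved_mwords:.1f} instructions per word")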
I'm guessing, but it looks like
modern PC designs chuck the only exception (gfx texture memory, noting
that vertex memory is relatively memory-light and suited to pipelining in
a cache) at the gfx card and away from the CPU (apart from loading data
into the gfx card), so it's not an issue for almost any of the time a game
is running, assuming adequate gfx memory size.

The only time textures are handled on the mobo is when they are loaded
into the video card memory (as noted above). This can cause a pause in
the graphics, but you still get the pause whatever you do. Doubling the
speed of the memory doesn't get rid of it, and I suspect that even PCI
Express won't really affect it (it will simply double or treble the
transfer rate, which is incremental rather than anything to write home
about - you will still see slowdowns, and will certainly not be able to
use main RAM as gfx memory until we get up to orders of magnitude faster
than AGP 8x... and early PCI Express specs point to less than orders of
magnitude; x64 in a year if we are lucky).

How do you expect to get from one speed to another without getting there
"incrementally"? Damn, nobody upgrade any part of their computer; it won't
make any more than an "incremental" difference, which is pointless. Yeah,
OK.
I also believe that my increase in memory allowed the system to store
texture data in RAM rather than use virtual memory (i.e. the hard drive),
and this eases reloading the gfx memory. The change from HD to RAM is
much bigger than any incremental increase in memory speed.

Yes, but it's still "incremental", only now it's worthwhile. It's all about
where your bottlenecks are. I'm not saying that "fast" RAM (i.e., CAS2) is
nearly two orders of magnitude faster than "slow" RAM, but there is a
difference, and it can be useful in certain tasks.
And anyway, technology in general seems to have been a bit of a con.

Yes, a computer is the worst investment you can make.
They go on about Moore's law and all that crap, but it's past 2000 and I
still haven't got that personal fusion-powered jetpack and cyborg sex
slave.


I don't think either a jet pack or a cyborg sex slave require masses of
compute power... :-p

Ben
 