bxf said:
I keep on seeing posts with comments that imply that such-and-such
(monitor, card, whatever) is better because it can display at a higher
resolution, etc.
I can accept that there are situations where the higher resolutions,
which result in smaller content on the screen, are an advantage in that
one is then able to fit more on the screen, and this is definitely
useful at times (but this has nothing to do with quality). But other
than that, what is the big deal about higher resolutions?
You've hit on a very good point, but to cover it adequately I'm
first going to have to (once again) clarify exactly what we mean
by the often-misused word "resolution."
In the proper usage of the word (and, by the way, how you
most often see it used with respect to such things as printers
and scanners), "resolution" is that spec which tells you how
much detail you can resolve per unit distance - in other
words, if we're really talking about "resolution," you should
be seeing numbers like "dots per inch" or "pixels per visual
degree" or some such. Simply having more pixels is not always
a good thing - you have to first be able to actually resolve them
on the display in question (not generally a problem for fixed-
format displays such as LCDs, if run in their native mode) AND
you need to be able to resolve them visually. That last bit
means that the number of pixels you really need depends on
how big the display (or more correctly, the image itself) will
be, and how far away you'll be when you're viewing it.
The human eye can resolve up to about 50 or 60 cycles
per visual degree - meaning for each degree of angular
distance as measured from the viewing point, you can't
distinguish more than about 100-120 pixels (assuming
those pixels are being used to present parallel black-and-white
lines, which would make for 50-60 line pairs or
"cycles"). Actually, human vision isn't quite this good
under many circumstances (and is definitely not this good
in terms of color, as opposed to just black-and-white
details), but assuming that you can see details down to
a level of about one cycle per minute of angle is often used
as a rule-of-thumb limit.
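(Put as a single figure: one cycle per minute of arc is 60 cycles
per degree, and at two pixels per cycle - one dark, one light -
that works out to the roughly 120 pixels per degree mentioned above.)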
This says that to see how much resolution you need, and
therefore how many pixels in the image, you take the display
size, work out what visual angle it subtends at the desired
viewing distance, and apply this limit. Let's say you have a
27" TV that you're watching
from 8 feet away. A 27" TV presents an image that's
about 15.5" tall, and if you're 8 feet (96 inches) away,
then the visual angle this represents is:
2 x arctan(7.75 / 96) ≈ 9.2 degrees
At the 60 cycles/degree limit, you can therefore visually
resolve not more than about 550 line pairs, or roughly 1100
pixels. Anything more than this would be wasted, and
even this, again, should be viewed as an upper limit -
your "color resolution" (the spatial acuity of the eye in
terms of color differences) is nowhere near this good.
In terms of pixel formats, then, an image using
the standard 1280 x 1024 format would be just about as
good as you'd ever need at this size and distance.
Note that a 15.5" image height is also what you get from
roughly a 32" 16:9 screen, so the HDTV standard
1920 x 1080 format is just about ideal for that size and
distance (and an 8' distance may be a little close for
a lot of TV viewing).
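If it helps to see the arithmetic in one place, here's the same
calculation as a small Python sketch (the function names are mine,
purely for illustration):

    import math

    def visual_angle_deg(image_height_in, distance_in):
        # Visual angle subtended by the image height, in degrees
        return 2 * math.degrees(math.atan((image_height_in / 2) / distance_in))

    def max_resolvable_pixels(image_height_in, distance_in, cycles_per_degree=60):
        # Upper limit on vertically resolvable pixels, at 2 pixels per cycle
        return 2 * cycles_per_degree * visual_angle_deg(image_height_in, distance_in)

    # 27" TV (~15.5" image height) viewed from 8 feet (96 inches):
    print(round(visual_angle_deg(15.5, 96), 1))    # 9.2 (degrees)
    print(round(max_resolvable_pixels(15.5, 96)))  # 1108, i.e. about 1100 pixels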
However, this again is the absolute upper limit imposed by
vision. A more reasonable, practical goal, in terms of
creating an image that appears to be "high resolution" (and
beyond which we start to see diminishing returns in terms of
added pixels) is about half the 60 cycles/degree figure, or
somewhere around 30. This means that for the above-mentioned
27" TV at 8', the standard 480- or 576-line TV formats,
IF fully resolved (which many TV sets do not do), are actually
pretty good matches to the "practical" goal, and the higher-
resolution HDTV formats probably don't make a lot of
sense until you're dealing with larger screens.
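(Working that out with the same numbers: 9.2 degrees x 30
cycles/degree is about 275 line pairs, or roughly 550 pixels of
image height - right in the neighborhood of those 480- and
576-line formats.)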
At typical desktop monitor sizes and distances, of course,
you can resolve a much greater number of pixels; at perhaps
2' or so from the screen, you might want up to about 300
pixels per inch before you'd say that you really couldn't use
any more. That's comfortably beyond the capability of most
current displays (which are right around 100-120 ppi), but
again, this is the absolute upper limit. Shooting for around
150-200 ppi is probably a very reasonable goal in terms of
how much resolution we could actually use in practice on
most desktop displays. More than this, and it simply won't
be worth the cost and complexity of adding the extra pixels.
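(That ~300 ppi figure follows from the same acuity limit: at 24
inches, one degree of visual angle spans about 24 x tan(1 degree)
≈ 0.42 inch, and 120 pixels per degree divided by 0.42 inch per
degree comes out to roughly 290 pixels per inch.)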
This leads me to wonder about the following: is there any difference
between viewing an image/DVD at a resolution of a x b, and viewing the
same image at a higher resolution and magnifying it using the
application's zoom feature so that the displayed size is the
same as it was at a x b?
No, no difference. In terms of resolution (in the proper sense
per the above, pixels per inch) the two are absolutely identical.
Bob M.