There are a number of effects to consider before selecting a
high resolution over S-Video.
First, the NTSC and PAL baseband standards have limited bandwidth.
When a TV station transmits an analog signal, the whole channel
is about 6MHz wide. The video (luma) content is set up to fit
within roughly 4.2MHz of that, with the vestigial sideband and
the audio carrier taking up the rest of the channel.
This affects the video resolution. By comparison, the DAC on a
video card may have a stated bandwidth of 400MHz or so (when
driving a signal over the VGA connector), and the resolution
that goes with that could be 2048x1536 at 60Hz or some other
rather large resolution. That gives a rough idea of how channel
bandwidth constrains resolution and frame rate. A 4MHz path
simply can't be expected to pass a resolution like that. If the
TV-type signal is band-limited to fit within about 4MHz (even
when it is only going to computer-type equipment), then you
can't expect the equivalent resolution to be very high.
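As a rough back-of-the-envelope check, here is the arithmetic in
Python. The nominal figures (4.2MHz of luma bandwidth, a 52.7
microsecond active line, and the Nyquist factor of two) are my
assumptions, not anything measured:

    # How many pixels fit across one NTSC line?  One cycle of
    # analog bandwidth resolves at most two alternating pixels
    # (Nyquist), and the active (visible) part of each scan
    # line lasts about 52.7 microseconds.

    bandwidth_hz  = 4.2e6     # nominal NTSC luma bandwidth
    active_line_s = 52.7e-6   # active portion of one scan line

    pixels_per_line = 2 * bandwidth_hz * active_line_s
    print(round(pixels_per_line))   # ~443 -- nowhere near 1024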
The TV signal is interlaced. 262.5 lines are painted on the
screen for the odd field, then the other 262.5 lines are painted
on the screen for the even field. The actual vertical resolution
can't really be higher than that, as the scanning process (the
familiar 15.7KHz whine the TV makes) is fixed. So no matter how
you set the resolution on the video card (640x480, 800x600,
1024x768), the interlaced scanning pattern doesn't change. A TV
set is not multisync like a computer monitor is. It only works
at that approx. 15.7KHz line rate.
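Those fixed numbers fall straight out of the standard (NTSC
values assumed here):

    # NTSC interlace arithmetic: the line rate and field rate
    # are fixed, so the vertical resolution is fixed too.

    line_rate_hz  = 15734.0   # the ~15.7KHz "whine"
    field_rate_hz = 59.94     # two fields make one 29.97Hz frame

    lines_per_field = line_rate_hz / field_rate_hz
    print(lines_per_field)              # 262.5
    print(round(2 * lines_per_field))   # 525 total, ~480 visible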
And then this brings up the issue of aliasing. The video card
design knows it's taking a higher-resolution image and stuffing
it into a display format with much lower limits. Scan converters
use convolution (a mathematical process similar to a moving
average) over 3, 5, or 7 adjacent input lines, and use that
information to compute what should show up on each output scan
line. The video card has to do something similar (and the video
card may be fixed in terms of its convolution steps - I don't
know the details of where in the video card that is done, or how).
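A minimal sketch of that kind of vertical filtering, using a
made-up 3-tap kernel (real hardware would use more taps and
different weights):

    # Each output line is a weighted average of a few adjacent
    # input lines.  Illustrative only -- not any card's actual
    # filter.

    def filter_lines(lines, kernel=(0.25, 0.5, 0.25)):
        """Convolve a list of scan lines vertically."""
        half = len(kernel) // 2
        out = []
        for y in range(len(lines)):
            acc = [0.0] * len(lines[0])
            for k, w in enumerate(kernel):
                src = min(max(y + k - half, 0), len(lines) - 1)
                for x, v in enumerate(lines[src]):
                    acc[x] += w * v
            out.append(acc)
        return out

    # A single bright horizontal line smears across three output
    # lines -- which is why line art looks "thin" and washed out.
    img = [[0]*8, [0]*8, [255]*8, [0]*8, [0]*8]
    for row in filter_lines(img):
        print([round(v) for v in row])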
If you watch the output of an actual scan converter (we had one
at work), you'll find that horizontal lines look "thin", and the
image may look washed out. You can blame the image processing,
including the convolution, for that result. A scan converter may
support input resolutions as high as 1600x1200, but that's
really too high for line art (computer output) to show properly.
So now, why do we use the higher settings?
First, depending on the TV set, the TV has the notions of
"overscan" and "underscan". A TV from 1950 might scan perhaps
30% past the edge of the screen. Only the center portion of the
scan process on the TV set was "linear", and the image at the
edges of the screen is thrown away: the signal splatters past
the edge of the screen. An LCD TV set, by contrast, is perfectly
linear and doesn't need overscan to work. On the video card,
there should be a control for setting the overscan, so that the
image is scaled properly for the type of TV set. If it's a very
old CRT TV set, you would want the computer S-Video signal to
have the overscan enabled.
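The effect on usable area is easy to estimate. Treating the
overscan percentage as a symmetric loss across the whole frame
is my simplification; real sets vary:

    # How many pixels survive a given amount of overscan?

    def safe_area(width, height, overscan_pct):
        keep = 1.0 - overscan_pct / 100.0
        return round(width * keep), round(height * keep)

    print(safe_area(800, 600, 30))  # (560, 420), 1950s-style set
    print(safe_area(800, 600, 5))   # (760, 570), more typical CRT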
You select 1024x768 to attempt to squeeze more information onto
the screen. You do that when viewing computer program output,
line art, and the like. Setting the scaling of the screen output
("DPI" in the Display control panel) might also affect that,
though. My screen might be set to 120 DPI, in an attempt to make
print larger on my LCD computer monitor. If I cranked down the
DPI setting, it might cause more text characters to be visible
on the TV screen at 800x600.
With the resolution limitations, it's almost impossible to read
text on the TV screen. I've tried to use a TV set as a console
for a Linux box, and it was a miserable experience; I don't
consider it a practical solution for that purpose. However, the
human eye is quite tolerant of that sort of thing when
displaying movie content. Even at 800x600, you can view a movie
and get most of the information content.
Now, my experience with cheap TV sets is that I can get
better-looking movie output if I actually connect the computer
output to a channel 3 RF modulator, then go through the RF path
on the TV set. Don't ask how that works, or how it can possibly
look better. My TV sets don't seem to handle the color well over
composite or S-Video, and yet if I go through the antenna input
instead, I get a usable image for movie viewing. An RF modulator
takes a TV-type video signal and sends it on channel 3 or 4, and
it seems to have better processing of an incoming TV signal than
the TV set does. At $29, you could try this and see if it looks
better. Maybe your TV sets work better than mine do, and you
won't need this path to get good results.
RF Modulator, $29:
http://www.radioshack.com/product/index.jsp?productId=2103095
So the only reason for switching to 1024x768 is that it "makes
more computer output, icons and text, appear on the screen".
For movie viewing, you might do just as well at 800x600, given
that the scanning process uses a relatively low resolution, and
the "sharpness" of the signal is even poorer than that.
On TV sets, people would use an image like the one below to
measure the actual sharpness. The idea is to determine the
"just-resolvable lines", and that number is quite poor on a TV
set. But for movie viewing, you aren't resolving lines, and the
human eye makes up for the actual equivalent display resolution.
http://www.cdr-zone.com/forum/files/sharpness_134.jpg
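If you want to roll your own test pattern, a sketch like this
writes one as a plain PGM file (the band pitches and image size
are arbitrary choices on my part):

    # Bands of alternating black/white horizontal lines that get
    # progressively finer.  The finest band you can still see as
    # separate lines on the TV is your effective resolution.

    def write_pattern(path, width=640, height=480,
                      pitches=(8, 4, 2, 1)):
        band_h = height // len(pitches)
        rows = []
        for pitch in pitches:
            for y in range(band_h):
                val = 255 if (y // pitch) % 2 == 0 else 0
                rows.append(bytes([val]) * width)
        with open(path, "wb") as f:
            f.write(b"P5\n%d %d\n255\n" % (width, len(rows)))
            f.write(b"".join(rows))

    write_pattern("sharpness_test.pgm")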
Try adjusting the overscan/underscan. That should be enough to
get a useful image on the set. And don't expect text and line
art to render well on the TV, because the 6MHz channel spacing
determined a lot of the characteristics of such transmission
paths.
If you could use "component" output to the TV, such as YPbPr,
that has a much higher bandwidth, and doesn't have the TV-type
limitations. Newer TV sets have connectors like that on the
back. The connectors may be colored like RGB jacks, but the
signal format is YPbPr. I doubt an MX400 has component output,
but some newer cards do. The very latest video cards no longer
bother with the DIN connector on the faceplate, and put more
useless connectors on there instead (like DisplayPort).
http://en.wikipedia.org/wiki/YPbPr
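For reference, the relationship between RGB and YPbPr is just a
matrix. The Rec. 601 coefficients below describe the signal
format, not any particular card's circuitry:

    # Convert RGB (0..1 range) to YPbPr per Rec. 601.

    def rgb_to_ypbpr(r, g, b):
        y  = 0.299 * r + 0.587 * g + 0.114 * b
        pb = (b - y) / 1.772   # scaled so Pb spans -0.5..0.5
        pr = (r - y) / 1.402   # likewise for Pr
        return y, pb, pr

    print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white: Y~1, Pb~0, Pr~0
    print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # pure red: Pr = 0.5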
Paul