Re: "I agree with you, if a camcorder, or a $50 dollar DVD player can
output perfect signals to ordinary TV set, so should a $200 video card
be able to."
Not true, and this response suggests a lack of understanding of the
processes and technologies involved.
The camcorder IS A TV DEVICE. All of the signals that it uses ARE TV
signals: TV resolution, TV aspect ratio, TV scan (refresh) frequencies.
It was designed to produce and capture a TV picture, and that's all
that it does.
Conversely, a computer display is NOT a TV device, and you are
converting a signal into something totally, and I mean totally,
different (and, in this case, inferior).
Consider, for example:
Computer horizontal resolution = 800 to 1280 pixels
TV horizontal resolution = 240 to 450 pixels
Computer vertical resolution = 600 to 1024 pixels
TV vertical resolution = 480 INTERLACED pixels
(note the implied total pixel counts, worked out in the short sketch
after this list:
TV = 115,200 to 216,000 pixels
Computer = 480,000 to 1,310,720 pixels)
Computer horizontal scan rate = 38 to 70 kHz
TV horizontal scan rate = 15,734 Hz (15.734 kHz)
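To put rough numbers on the mismatch, here is a quick back-of-the-envelope
calculation in Python (using the approximate ranges quoted above; exact
figures depend on the TV standard and the video mode):

    # Rough pixel-count comparison using the ranges quoted above.
    tv_low  = 240 * 480      # worst-case TV: 240 H x 480 V (interlaced)
    tv_high = 450 * 480      # best-case TV:  450 H x 480 V (interlaced)
    pc_low  = 800 * 600      # low-end computer mode
    pc_high = 1280 * 1024    # high-end computer mode

    print(f"TV:       {tv_low:>9,} to {tv_high:>9,} pixels")
    print(f"Computer: {pc_low:>9,} to {pc_high:>9,} pixels")

    # How hard the best computer mode gets squeezed onto the worst TV case:
    print(f"Roughly {pc_high / tv_low:.1f} computer pixels per TV pixel")

That reproduces the 115,200-to-216,000 and 480,000-to-1,310,720 totals noted
above, and shows that in the worst case you are squeezing more than eleven
computer pixels into every TV pixel.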
When you try to display a computer screen on a TV set, you are trying to
stuff "10 pounds of xxxx into a 2 pound bag". It should be obvious that
trying to display a 1,310,720 pixel image on a 115,200 pixel display
isn't going to look very good. But even that's not the entire
"picture". You are trying to do it via (typically) a composite or
S-Video interface, that "merges" red, blue, green, luminance, horizontal
sync and vertical sync into a single signal (or two for S-video), in a
way totally foreign to all computer interfaces, both analog and digital,
and using a technology that seriously degrades and compromises all of
the individual components.
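To make the "merging" concrete, here is a simplified sketch in Python of
what each kind of connection actually carries (the breakdown is deliberately
coarse, and the labels are mine, not part of any standard):

    # Simplified view of the signals carried by each connection type.
    # Labels are illustrative; the real standards carry more detail.
    signals_per_connection = {
        # Computer analog (VGA-style): every component on its own pin
        "VGA":       ["Red", "Green", "Blue", "H-sync", "V-sync"],
        # S-Video: brightness (with sync) and encoded color on two wires
        "S-Video":   ["Luminance + sync", "Chrominance (encoded color)"],
        # Composite: everything modulated onto a single wire
        "Composite": ["Luminance + chrominance + sync, all on one wire"],
    }

    for name, parts in signals_per_connection.items():
        print(f"{name:>9}: {len(parts)} separate signal(s): {'; '.join(parts)}")

The fewer separate signals the connection carries, the more has to be crammed
together by the video card's TV encoder and pulled apart again inside the set,
and each of those steps degrades the picture a little further.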