David Maynard
Conor said: It's what I found on Google.
Ah. Well, that's the encoding resolution of a DVD, but they're sampling above
what NTSC can actually do, and then there's the matter of what a TV can do.
NTSC channel bandwidth is 6 MHz and the duration of a horizontal line is
63.556 microsec. However, that includes sync with the potentially visible
portion being 52.66 microsec (I say potentially because TVs overscan to
mask alignment errors so you don't see black bands on the screen edge).
Now, a "Hertz" is a swing from one polarity to the other and back so, if we
make a loose translation to 'pixels', that corresponds to 2 pixels, one
light and one dark, then the next cycle could be light and dark again, etc.
(in TV terms they talk about "lines" [one being light and the other dark]
of resolution because that takes into account horizontal scan jitter [it
won't be a vertical line if they don't line up all the way down the screen]).
Dividing it out: roughly 12 million pixels per second, max, over the 52.66
microsec visible portion comes to about 632 pixels per line, and the number
usually gets rounded to 640.
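If you want to check that arithmetic, it's a couple of lines of Python (the
variable names are just mine):

    bandwidth_hz    = 6e6         # NTSC channel bandwidth
    active_line_s   = 52.66e-6    # potentially visible part of a scan line
    # one cycle ~= one light pixel plus one dark pixel, i.e. 2 'pixels' per Hz
    pixels_per_sec  = 2 * bandwidth_hz
    pixels_per_line = pixels_per_sec * active_line_s
    print(pixels_per_line)        # ~631.9, which gets rounded to 640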
A similar consideration of the vertical blanking interval gives a vertical
resolution of 480-486 lines (pixels) out of the 525 total.
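The vertical figure works the same way; the blanking allowance below is the
usual one, not something spelled out above:

    total_lines   = 525
    vbi_per_field = 21            # roughly 19-22 lines per field in practice
    active_lines  = total_lines - 2 * vbi_per_field
    print(active_lines)           # 483, hence the 480-486 range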
That's interlaced, however, and TVs have a difficult time getting every
line to interlace 'perfectly' in-between the previous field's lines so the
visual effect is usually less than the 480. But, for the moment, let's say
it's perfect so the theoretical capability is 640x480, in MONOCHROME, I.E.
black and white.
Color (difference) information is quadrature modulated on a 3.58 MHz
subcarrier that's placed inside the monochrome signal, and the strange
frequencies, such as the 'vertical 60 Hz' actually being 59.94 Hz, were
chosen so that the color subcarrier information looks like perpetually
shifting 'random noise' in the B&W picture and, as a result, it becomes
'invisible' to the eye because it doesn't persist in any location long
enough for the eye to notice it.
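For the curious, the 'strange' numbers all hang together. Using the standard
figures (only 3.58 MHz and 59.94 Hz are actually quoted above):

    sound_offset = 4.5e6                # sound carrier offset, Hz
    line_rate    = sound_offset / 286   # ~15,734.27 Hz (15,750 in B&W days)
    field_rate   = line_rate / 262.5    # ~59.94 Hz, not an even 60
    subcarrier   = line_rate * 455 / 2  # ~3.579545 MHz color subcarrier
    # 455/2 makes the subcarrier an odd multiple of half the line rate, so
    # its phase flips every line and every frame -- which is what makes it
    # look like shifting noise on a B&W set.
    print(line_rate, field_rate, subcarrier)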
However, the bandwidth of the subcarrier color information is extremely
limited and nowhere even close to 640x480. It only appears reasonable
because the human eye is much more sensitive to intensity variations than
it is to color change and if you get up close to a TV picture you will see
that color 'smears' all over the place. It just looks decent, at normal
viewing distances, because your eye keys off the B&W content. I.E. if Billy
Bob is wearing a red jacket against a blue sky your eye tells you the red
of his jacket ends where the blue sky begins because it can 'see' the B&W
intensity change between his dark jacket and the bright sky, but if you put
your eyeball up to the screen you'll see the red and blue terribly smeared
across the line of demarcation.
So, saying 640x480 resolution is wholly inappropriate when talking about a
COLOR picture.
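To put a rough number on 'nowhere even close': using the usual spec chroma
bandwidths, which aren't quoted above (I channel about 1.3 MHz, Q about
0.5 MHz, and many sets limit both to far less):

    active_line_s = 52.66e-6
    i_pixels = 2 * 1.3e6 * active_line_s   # ~137 color 'pixels' per line
    q_pixels = 2 * 0.5e6 * active_line_s   # ~53 color 'pixels' per line
    print(i_pixels, q_pixels)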
Now, to get to what a TV can do we need to note one other thing: the sound
information is placed on a 4.5 MHz carrier within the 6 MHz video bandwidth. The
point being, we've got what the NTSC signal is *made* of but the TV has to
untangle all that stuff back out of it.
Unfortunately, the idea of the color chroma subcarrier looking like 'noise'
on a B&W set doesn't work on a color one because that 'noise' IS the color
information and would interact directly with the decoded version of it on
the screen. The 'simple' way, which early TV receivers used because complex
electronics was expensive, is to simply say to hell with it and roll off
the monochrome signal at 3 MHz, under the 3.58 MHz subcarrier, and feed the
upper part to the color decode circuitry. That, however, cuts the
monochrome bandwidth in half to 320 pixels per line.
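Same arithmetic as before, just with the luma rolled off at 3 MHz:

    print(2 * 3e6 * 52.66e-6)    # ~316 pixels per line, i.e. the '320'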
The fancy way is to use some kind of complex filtering, now that
electronics is 'cheap' and we can pop complex ICs all over the place, to
'comb' out the 3.58 MHz subcarrier, leaving in the rest. And that's what's
typically used nowadays, a "comb filter." It's not perfect either but it's
a hell of a lot better than rolling it off at 3 MHz. And then we've got to
get the 4.5 MHz sound out of it too.
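If you're curious what the core of a comb filter looks like, here's a
bare-bones 1-H (one line delay) sketch. The array layout is my own
assumption and real comb filters are adaptive and 2-D/3-D, so treat it as
the idea only:

    import numpy as np

    def comb_1h(composite):
        # 'composite' is a 2-D array of shape (scan_lines, samples_per_line).
        # The subcarrier runs 227.5 cycles per line, so it inverts phase from
        # one line to the next: summing adjacent lines cancels it (leaving
        # luma), differencing cancels the correlated luma (leaving chroma
        # still on the subcarrier, ready to be demodulated).
        prev   = np.roll(composite, 1, axis=0)   # the previous scan line
        luma   = (composite + prev) / 2.0
        chroma = (composite - prev) / 2.0
        return luma, chroma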
The point is, you're going to lose some resolution off the theoretical
maximum, which isn't all that great to begin with, no matter what mechanism
you use to decode it.
I.E. NTSC, and TVs, stink as computer monitors.
So what can one 'gain' with a PC and a tuner card? Not much in the way of
the tuner because it's got to decode that lousy broadcast NTSC signal. A
'better' signal just ain't there.
What you *can* gain gets back to your 720x480 DVD encode. Assuming it's
encoded better than 'broadcast' you have better video than could come
through an NTSC signal. So now the problem is, what to display it on?
A computer monitor would be great as it has much better resolution than an
NTSC TV, and inside the PC the signal can go straight to it uncorrupted.
The other alternative is to feed it to a TV/monitor with better resolution
than a 'normal' TV using an interface that is better than NTSC, like
component video. (Of course, a standalone DVD player could do the same thing)
You will even gain a slight amount, for DVD/VCD, using S-video, or even
composite video, because it doesn't have to go through the RF
modulation/demodulation phases, but composite still suffers from the color
chroma subcarrier modulation and stripping just like broadcast video,
because that's how NTSC works (S-video at least carries luma and chroma on
separate wires, though the chroma is still bandwidth limited). Where you
can potentially gain is if they don't strictly limit it to a 6 MHz bandwidth,
because you're on a wire and don't have to meet FCC channel assignments.
But there's not much you can do with a tuner card because it's dealing with
the same signal a TV tuner is. Unless you get into post-processing filters
but, even then, you can't get more resolution than is in the signal to
begin with.