Ok. What I was asking is how many bits would be required to allow
for one pixel to store all possibilities of color. You said that RGB
amplitude info is all there is. So I'm just wondering how many colors
are possible. (Well, actually I should probably find out how many colors
are necessary first. That is, how many different colors can be
delineated by the human eye?)
That turns out to be a much bigger question than you may
realize. How human vision, and especially color vision, actually
works is a subject that can fill a book (and has, many times).
Even a simplified, cursory examination of the subject is a
fair-sized chapter for a book. (One example of such is a
chapter in my own book, "Display Interfaces: Fundamentals
and Standards," if you don't mind the quick plug. Actually,
I wouldn't recommend that anyone buy the book just to get
the answer to this question - it's not worth that. But you
may be able to find a copy on the shelves of your local
college's engineering library. Both the copies that actually
sold had to wind up SOMEWHERE...)
But without going too far into color theory, let me try to
give you a few answers here. The last question you asked,
"how many colors can be delineated by the human eye," is
a complex one to answer, and pretty much everything
I'd wind up saying in a short-form answer would end with
"but it's really not this simple." Let's just say that the eye
can discern millions of different shades - possibly into the
low tens of millions, at least - and leave it at that for now.
So I'm just going to throw out several hopefully relevant
comments. If anyone wants to go into further detail on any
of them, we certainly can do that later.
1. It is impossible for any practical electronic imaging device,
whether it's a display or a printer, to cover the entire
range of colors that the eye can see. So we're really not
even going to worry about that basic question. The
question is really how many bits of information you need to
be able to describe an image (or rather, to allocate to each
pixel of an image) in order to make that image sufficiently
realistic, and sufficiently free of artifacts, that the eye will accept
it as "realistic". Another word for this might be "photographic"
- we generally accept quality color photos as "looking real,"
so another relevant question is "how many bits per pixel
do I need to make an image look as real as a photograph?"
2. The short form of the answer to the question above is
somewhere around 8-10 bits for each of the primaries red,
green, and blue, for electronic displays. That answer actually
is one of those that deserves an "it isn't really that simple"
after it, because for one thing it assumes that all three colors
are equally important (and they're not), and for another
people thinking in terms of "bits per color" will usually make
the assumption that the values use linear encoding, which
generally isn't the best choice here. But we'll just note that
"24-bit color" (RGB at 8 each) is generally assumed to
be "photorealistic" for most casual display work; 10 bits
each will satisfy most of what's left, and 12 bits per primary
will be overkill for all but the most demanding work.
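Just to make the arithmetic concrete, here's a quick sketch (Python, purely
for illustration) of how many distinct code values those bit depths give
you - keeping in mind, per the caveats above, that "number of code values"
is not the same thing as "number of colors the eye can tell apart":

# Distinct RGB code values at various bit depths per primary.
for bits_per_primary in (8, 10, 12):
    total_bits = 3 * bits_per_primary        # R + G + B
    values = 2 ** total_bits                 # code values, not "visible colors"
    print(f"{bits_per_primary} bits/primary -> {total_bits}-bit color, {values:,} values")

# 8 bits/primary  -> 24-bit color, 16,777,216 values
# 10 bits/primary -> 30-bit color, 1,073,741,824 values
# 12 bits/primary -> 36-bit color, 68,719,476,736 values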
3. Since the original question really had nothing to do with
representing "all the colors we could see" but instead had
to do with storing a digital representation of standard video,
it's probably more important to ask "just how much color
information, in terms of bits/pixel, is actually available in the
best video signal we're ever going to see?" The above
answer turns out to apply pretty well here, too. RGB stored
at 8-10 bits each will generally suffice for the storage of
pretty high quality video (and in fact a lot of digital video
starts out as a 24 bit/pixel RGB representation). However,
video generally is NOT stored this way, but instead takes
advantage of another quirk of human vision to permit essentially
the same perceived quality without storing so many bits.
Human eyes are much better at discerning differences in
"brightness" (luminance) over a small distance than they are
at discerning differences in colors (of the same perceived
"brightness") over that same distance. In more technical terms,
our spatial acuity is better for luminance changes than for
"chrominance" (color) changes. Most digital video systems
take advantage of this by storing image information not in
RGB form, but as separate luminance and color information,
then storing fewer samples (pixels) of the color information
than of the luminance. In other words, they essentially convert
the color image into a luminance-only ("black and white") image
of the same "resolution" (number of pixels), and then add to
that samples of the "chrominance" information which are
effectively shared by a number of adjacent pixels. One popular
digital storage encoding, for example, is to store one
sample of color information (these are the signals you will
see referred to as "U" and "V," or in digital terms "Cb" and
"Cr") for every FOUR samples (or pixels) of luminance (Y)
information. If you started out with 24-bit RGB, and used
this sort of encoding for your digital video storage (still at eight
bits each for Y, Cb, and Cr), you'd wind up with
32 bits of Y,
8 bits of Cb, and
8 bits of Cr (for a total of 48 bits)
for every four pixels in the original image. That's a 50%
savings in storage space (or transmission "bandwidth") over
the original 24 bits/pixel version, with very little loss of image
quality for the typical "video" sorts of images.
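To put numbers on that last example, here's the same bookkeeping written out
as a little Python sketch (the 8-bit samples and the one-chroma-pair-per-four-
luma-samples scheme are just the case described above; real encoders differ in
exactly how the shared samples are arranged):

BITS_PER_SAMPLE = 8       # 8 bits each for Y, Cb and Cr
PIXELS_PER_GROUP = 4      # four luma samples share one Cb and one Cr sample

rgb_bits = PIXELS_PER_GROUP * 3 * BITS_PER_SAMPLE     # 4 pixels of 24-bit RGB
ycbcr_bits = (PIXELS_PER_GROUP * BITS_PER_SAMPLE      # 32 bits of Y
              + BITS_PER_SAMPLE                       # 8 bits of Cb
              + BITS_PER_SAMPLE)                      # 8 bits of Cr

print(f"RGB:   {rgb_bits} bits per 4 pixels ({rgb_bits // PIXELS_PER_GROUP} bits/pixel)")
print(f"YCbCr: {ycbcr_bits} bits per 4 pixels ({ycbcr_bits // PIXELS_PER_GROUP} bits/pixel)")
print(f"Savings: {100 * (1 - ycbcr_bits / rgb_bits):.0f}%")

# RGB:   96 bits per 4 pixels (24 bits/pixel)
# YCbCr: 48 bits per 4 pixels (12 bits/pixel)
# Savings: 50%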
66:1. It sounds as though that means I should multiply that 22
minutes of video by 66, which gives me 1452 minutes (24.2 hours) of
compressed video on a 73G drive. (What am I doing wrong?)
Nothing. If you could fit 22 minutes of uncompressed video in
whatever your original format was onto said 73 gig drive, then yes,
you could fit over 24 hours onto that same drive, using the exact same
sort of compression (and compression ratio) as is used in the example
shown for broadcast HDTV. This is exactly what makes the download
of full-length movies from web sites possible without having to have
God's own REALLY-high-speed network connection. Remember,
though, that we have been making approximations all along here,
so that "24.2 hours" should not be seen as a precise or guaranteed
figure, but rather a ballpark estimate of what you might get given
your original raw data calculation.
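For what it's worth, here's that estimate written out (a Python sketch using
exactly the rough numbers above - 22 minutes of uncompressed video on the
drive, and a 66:1 compression ratio):

uncompressed_minutes = 22    # uncompressed video that fits on the 73G drive
compression_ratio = 66       # from the broadcast-HDTV example

compressed_minutes = uncompressed_minutes * compression_ratio
print(f"{compressed_minutes} minutes, or about {compressed_minutes / 60:.1f} hours")
# 1452 minutes, or about 24.2 hours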
Perhaps I should have just asked how much "movie quality video" can fit
on a 73G drive, while exhibiting no noticeable degradation (as a result
of whatever compression scheme is used) when shown on a 19-21 inch
display.
If you consider HDTV to be "movie quality video," and given that
broadcast HDTV absolutely CANNOT require more than about
19.2 Mbits/sec (call it 2.4 Mbytes/sec), then we get:
(73 Gbytes)/(2.4 Mbytes/sec) = roughly 30,000 seconds, or a bit
over eight hours of video, just storing what comes over the air
exactly as it's transmitted.
But that's HD; DVD-quality video requires considerably less
space, due to the far smaller number of pixels per frame, and to match
the quality of what you typically get over the air with standard analog
TV (roughly equal to perhaps 450 x 340 pixels per frame, rather
than the 704 x 480 of DVDs) the requirements would be even
lower. So yes, tens of hours of video on a 73-gig drive is not
unrealistic.
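If you want to play with these estimates yourself, here's the sort of
back-of-the-envelope calculation I'm doing (Python; the 19.2 Mbit/sec figure
is the rounded broadcast HDTV rate used above, and the 6 Mbit/sec "DVD-ish"
rate is just an assumed typical value - real DVD bit rates vary):

DRIVE_GBYTES = 73
RATES_MBITS_PER_SEC = {
    "broadcast HDTV (~19.2 Mbit/s)": 19.2,
    "DVD-ish quality (assumed ~6 Mbit/s)": 6.0,   # assumption; actual rates vary
}

drive_bits = DRIVE_GBYTES * 1e9 * 8               # decimal gigabytes -> bits
for label, mbits in RATES_MBITS_PER_SEC.items():
    hours = drive_bits / (mbits * 1e6) / 3600
    print(f"{label}: roughly {hours:.1f} hours")

# broadcast HDTV (~19.2 Mbit/s): roughly 8.4 hours
# DVD-ish quality (assumed ~6 Mbit/s): roughly 27.0 hours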
As a further example - consider how much space is available on
a standard DVD, then realize that yes, they really DO store
full-length movies on those!
Bob M.