Bart van der Wolf said:
> "Nothing" seems a bit too strong. Besides, why is there a different gamma
> needed for Mac display versus Windows? The LUTs for video display have a
> different gamma encoding that needs to be compensated for.
No, it's not too strong. The image encoding has nothing to do with the
display transfer curve/function.
And yes, Macintosh and Windows have different default gamma values -
but that's just for display. The Macintosh default compensated so that
images appeared correct on screen and printed with minimal change on
certain printing devices. Thus the default Macintosh gamma factor is
1.8, but you can change it. And you can have each display on the
system using a different tone response curve. It's up to the OS or the
application to correct the image for each display.
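As a minimal sketch (Python; the 1.8 and 2.2 values are just examples,
and real systems use calibrated LUTs rather than pure power functions),
the per-display correction amounts to decoding with the image's gamma
and re-encoding with the display's:

    def compensate(pixel, image_gamma=1.8, display_gamma=2.2):
        """Map a stored pixel (0.0-1.0, encoded with image_gamma) to the
        value sent to a display whose native response is display_gamma,
        so the light leaving the screen matches the intended level."""
        linear = pixel ** image_gamma            # decode to linear light
        return linear ** (1.0 / display_gamma)   # re-encode for display

    print(compensate(0.5))   # ~0.567: the same image needs a different
                             # code value on a gamma 2.2 display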
> The question is: is the more efficient usage of bits the result, or the
> goal of the exercise?
The goal - it seems to have been created for television (the first real
image encoding system other than paint) to maximize the use of the
available bandwidth.
They also lucked out that the encoding came close to compensating for
the physics of CRTs (reducing the circuitry necessary for early TV).
> The reciprocal gamma adjustment applied to the scan data is reversed by
> the native gamma of the display. Where has the perceptual encoding gone?
After going through the whole system, photons back to photons, it almost
disappears, as it should (it can't disappear completely, because the
original scene has a higher dynamic range than the viewing system, and
there has to be some tonal compression).
But if you stored the image in gamma 1.0 (linear light), you'd need 2
to 4 more bits per channel to get the same visual quality.
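To make that concrete, here's a rough sketch (plain Python, illustrative
luminance levels) comparing the relative luminance error of a single
8-bit code step under linear versus gamma 2.2 encoding:

    def rel_step(L, gamma):
        """Relative luminance error of one 8-bit code step at luminance L,
        when the file stores L ** (1/gamma)."""
        code = round(255 * L ** (1.0 / gamma))   # nearest stored code
        lo = (code / 255.0) ** gamma             # decoded luminance
        hi = ((code + 1) / 255.0) ** gamma       # one code higher
        return (hi - lo) / L

    for L in (0.01, 0.05, 0.2):
        print(L, rel_step(L, 1.0), rel_step(L, 2.2))
    # At L = 0.01 a linear step is ~39% of the luminance, the gamma 2.2
    # step only ~7%: the gamma encoding spends its codes in the shadows,
    # where vision is most sensitive.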
> It only adds some accuracy, but has no effect on human perception (which
> also looks at the real world's linear-gamma photon flux).
Flux and perception are very different things (which many people have
confused).
> Based on that photon flux, the human visual system does respond to
> luminance differences in a somewhat logarithmic sense.
Yes - and over a small range, a power function (gamma) is similar to
the logarithmic response.
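A quick numeric check of that similarity (Python; the two-stop range and
the 1/2.2 exponent are just for illustration):

    import math

    # Over a small range, a power function is close to an affine function
    # of the logarithm: x**a = exp(a * ln(x)) ~ 1 + a * ln(x) while
    # a * ln(x) stays small.
    a = 1 / 2.2
    for x in (0.5, 0.7, 1.0, 1.4, 2.0):   # about two stops around x = 1
        print(x, x ** a, 1 + a * math.log(x))
    # The two columns stay close over this range and drift apart as the
    # range widens.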
> What's more, Photoshop also does some of its (blending) processing in
> linear gamma space, which requires temporarily reversing the gamma
> encoding of the file (= loss of accuracy).
Yes. And there is no loss of accuracy if you use a higher precision
intermediate value.
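As a sketch of that idea (generic Python, not Photoshop's actual
pipeline): decode to a float intermediate, blend in linear light, and
re-encode once at the end, so the only quantization is the final one.

    GAMMA = 2.2

    def blend_50_50(a_code, b_code):
        """Average two 8-bit gamma-encoded values in linear light, using
        floats throughout so no precision is lost mid-calculation."""
        a_lin = (a_code / 255.0) ** GAMMA
        b_lin = (b_code / 255.0) ** GAMMA
        mixed = 0.5 * (a_lin + b_lin)        # physically correct average
        return round(255.0 * mixed ** (1.0 / GAMMA))

    # Blending black and white in linear light gives a brighter gray than
    # the naive code-value average of 128:
    print(blend_50_50(0, 255))   # 186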
> Major rendering packages often use linear gamma because it benefits the
> calculations, but storage can be done in a number of different encodings
> (= coding efficiency/accuracy).
Yes.
> Finally, after all those calculations, the data must be displayed/printed
> on a device that has a native gamma, and a compensating correction is
> needed.
Yes, but it doesn't have to be stored in the file using the same
compensating factor as the display.
Do you really think your LCD display has a gamma similar to a CRT?
Even with a linear display, you're better off using a gamma encoding in
the file format to get the best visual quality from the available bits.
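The sRGB transfer function is the standard example: a file encoding
defined for coding efficiency, not to match any particular display's
native response (formulas from IEC 61966-2-1):

    def srgb_encode(linear):
        """Linear light (0.0-1.0) to the sRGB-encoded value."""
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    def srgb_decode(encoded):
        """sRGB-encoded value back to linear light."""
        if encoded <= 0.04045:
            return encoded / 12.92
        return ((encoded + 0.055) / 1.055) ** 2.4

    print(srgb_encode(0.5))               # ~0.735
    print(srgb_decode(srgb_encode(0.5)))  # round-trips to 0.5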
See
http://chriscox.org/gamma/
Chris