Don said:
Ah! OK, now it makes sense!
But it does make VueScan look even worse if actually (and, I'm hoping,
inadvertently) it *amplifies* this *residual* noise which under normal
circumstances shouldn't even be visible!
Not really. Vuescan implements a scanner-*independent* calibration,
which should be valid with all scanners, storing the results offline in
a separate calibration file. However, if you go into the mathematics of
calibration processes (and I have to admit to over 25 years of
experience of doing exactly this in my professional field!) every
calibration process is a compromise between degradation and improvement.
If you have a perfect calibration to start with then any attempt to
improve on that will degrade the result. The best that you can hope to
achieve is that the degradation is negligible. This effectively comes
down to noise and its spectral characteristics (i.e. in the temporal
frequency sense, not colour).
For example, suppose you have a perfect CCD with no dark current
variation between the cells at all - there may still be dark current,
but all the cells have the same amount. The best results would be
produced without *any* calibration at all. As soon as you attempt to
implement a calibration you freeze the random temporal noise which is
present on the device and impose that as a pseudo-residual dark current
noise on the calibrated output. The best you can hope to do is to
reduce the effect of that temporal noise - through averaging many
samples - to a sufficiently low level that its imposition on the output
becomes insignificant. In a real CCD that noise may not need to be
averaged very much to make a significant improvement on the dark current
variability, but on this ideal device calibration can only degrade the
output - and the issue then is how much averaging is needed to ensure
that the degradation is imperceptible.
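To put some numbers on that, here is a quick Python/numpy sketch of the
ideal-CCD case (the dark level, noise sigma and frame counts are all
made up purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
n_pixels = 100_000
dark_level = 100.0  # identical dark current in every cell (the ideal CCD)
read_noise = 2.0    # temporal noise, a fresh realisation on every exposure

def dark_frame():
    # each exposure = constant dark level + fresh temporal noise
    return dark_level + rng.normal(0.0, read_noise, n_pixels)

signal = dark_frame()  # the frame we want to correct

# no calibration: output noise is just the temporal read noise
print("no calibration:  ", signal.std())

# calibrate with a single dark frame: its frozen noise is *added* to
# the output, so the result is worse by a factor of sqrt(2)
print("1-frame dark sub:", (signal - dark_frame()).std())

# average many dark frames: the frozen noise shrinks as 1/sqrt(N) and
# the degradation becomes negligible
N = 100
dark_ref = np.mean([dark_frame() for _ in range(N)], axis=0)
print(f"{N}-frame average:", (signal - dark_ref).std())

On this ideal device the uncalibrated figure is the best you can do;
the other two show how averaging claws the degradation back down
towards it.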
In another example, you may have a CCD with a very low median noise for
all of the elements - say, a factor of 10 better than typical CCDs.
However, due to the limitations of the process, the outliers, which are
inevitably present in all devices, are just as poor as those of typical
CCDs. In this case, you need to be 10x more precise in your handling of
noise on the high-performance CCD than you are with the typical CCD,
although intuitively you might believe that it would be easier because
it produces more accurate data to begin with.
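The averaging requirement scales as the square of the precision you
need, which is why that factor of 10 matters so much. A back-of-envelope
sketch (figures assumed, not measured from any real device):

# frozen noise after averaging N dark frames = temporal_noise / sqrt(N);
# to bring it down to the device's own noise floor we need
#   N >= (temporal_noise / noise_floor) ** 2
def frames_needed(temporal_noise, noise_floor):
    return (temporal_noise / noise_floor) ** 2

print(frames_needed(2.0, 2.0))  # typical CCD: 1 frame matches the floor
print(frames_needed(2.0, 0.2))  # 10x quieter CCD, same temporal noise: 100

So a device with a 10x lower noise floor needs on the order of 100x more
averaging before the calibration stops being the dominant noise source.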
Looking at Fernando's results, it is clear that he has eliminated the
lines with the process he implemented. However, he has also increased
the noise - something he noticed as an apparent increase in black level
and loss of colour saturation. In fact, he got rid of the lines because
the subtracted dark current value differed (by the temporal noise on
the CCD) at every pixel in the image - even though each image pixel
resulted from the same CCD cell, a different dark compensation value was
subtracted from it. If he had used the same data, with the same noise
amplitude (e.g. by taking one row of that dark scan), for removing the
residual dark variation on every pixel in the image, he would have ended
up with a worse result than he started with. However, that is exactly
what a conventional calibration, whether internal on the Nikon/Minolta
hardware or in the Vuescan software, would do - use the same data to
subtract the dark level on each pixel produced by a particular CCD cell.
That is why it is important to filter the dark scan, to reduce this
temporal noise adequately.
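A small simulation makes the three options concrete. The sensor model
below (a line CCD with a fixed per-cell dark offset plus temporal noise)
and all its numbers are assumptions for illustration - not Fernando's
actual data, and not Vuescan's actual algorithm:

import numpy as np

rng = np.random.default_rng(1)
rows, cols = 2000, 512
fixed_pattern = rng.normal(0.0, 0.5, cols)  # per-cell offset -> the "lines"
read_noise = 2.0                            # temporal noise on every sample

def scan():
    # every scan row reuses the same CCD line, so the fixed pattern
    # repeats down the columns; the temporal noise is fresh per sample
    return fixed_pattern + rng.normal(0.0, read_noise, (rows, cols))

image = scan()
dark = scan()  # a full dark scan: same pattern, different temporal noise

def lines(img):  # residual column structure
    return img.mean(axis=0).std()

def noise(img):  # pixel noise with the column structure removed
    return (img - img.mean(axis=0)).std()

a = image - dark               # (a) full dark scan, pixel by pixel
b = image - dark[0]            # (b) one dark row subtracted from every row
c = image - dark.mean(axis=0)  # (c) dark scan filtered (column-averaged)

for name, img in [("raw", image), ("(a) full dark", a),
                  ("(b) one row", b), ("(c) averaged", c)]:
    print(f"{name:14s} lines={lines(img):.2f}  noise={noise(img):.2f}")

Option (a) is Fernando's result: the lines go, but the pixel noise rises
by sqrt(2). Option (b) is the conventional same-data-per-cell subtraction
with unfiltered data: the frozen temporal noise of that one row becomes
a new, stronger line pattern. Option (c) - filtering the dark scan
first - removes the lines with negligible added noise.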
So, summarising, the algorithm that Ed has developed for typical
scanners, including the Nikons, may simply have met its match in the
performance of the Minolta CCD. It was, for example, the first true
16-bit scanner - which means that more than 16-bit precision is
required to accumulate the dark reference scan in order to reduce the
spatial manifestation of frozen temporal noise below the quantisation
noise threshold of the raw device.
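For what that means in practice, here is a minimal sketch of the
accumulation arithmetic (the dark level, noise and frame count are
invented for illustration):

import numpy as np

rng = np.random.default_rng(2)
true_dark = 1000.25  # true dark level in 16-bit counts (sub-LSB fraction)
noise = 2.0          # temporal noise in counts: dithers the ADC quantiser
N = 4096

# each raw sample is quantised to 16 bits by the ADC
samples = np.round(true_dark + rng.normal(0.0, noise, N)).astype(np.uint16)

# summing 4096 16-bit samples needs ~28 bits, so the accumulator itself
# must be wider than 16 bits; the noise-dithered average then resolves
# the dark level to well below one 16-bit quantisation step
acc = samples.astype(np.uint64).sum()
print(acc / N)         # ~1000.25
print(round(acc / N))  # rounding back to 16 bits throws that away again

Keep the accumulated reference at the higher precision and the frozen
noise it imposes can genuinely sit below the quantisation noise of the
raw device.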