We know 'exposure' means letting a specific amount of light hit the
film (or paper, or CCD), but what does 'exposure' mean for a scanner?
I think I have to understand what exposure is before I can understand
'auto exposure' in a scanner.
If 'auto exposure' in a scanner is really similar to 'auto exposure'
on a digital camera, then the scanner would have to adjust the amount
of light emitted from its light source. Is that true, or is 'auto
exposure' only a 'level adjustment' game?
If I try to get technical I'll just make lots of big mistakes, but I'm
pretty sure CSM1 is the one to listen to in this thread. It's fully
possible for a scanner to make hardware adjustments to light strength,
sensor gain, and exposure time. How exactly this is done is no doubt
different between models, but the goal is the same as with cameras:
getting a good midrange exposure with as little highlight blowout as
possible. My sense is that consumer-oriented scanners tend toward
fully auto hardware exposure, just like point-and-shoot cameras. My
Epson 4870 has no obvious analog gain controls in its software. It's
been said in this group that there *are* discrete light-source
steppings in place, and that Vuescan can be used to invoke them
manually. My film scanner has 3 gain controls in its driver, one per
RGB channel. They have a concrete effect on the exposure quality of
the initial output even if no software adjustments are applied.
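The practical difference between hardware gain and after-the-fact software gain can be sketched in a few lines of Python. This is a toy model, not any real scanner's pipeline, and the signal values and gain factor are invented: the point is that amplifying before the ADC quantizes to 8 bits preserves shadow distinctions that a post-quantization multiply cannot recover.

```python
# Toy model: analog (pre-quantization) gain vs digital (post-quantization)
# gain. All values here are made up for illustration.

def quantize_8bit(signal):
    """Clip to [0, 1] and quantize to 8-bit integers, like a simple ADC."""
    return [min(255, max(0, round(v * 255))) for v in signal]

# A dim shadow region of a hypothetical scan, as sensor levels in [0, 1].
shadows = [0.002, 0.004, 0.006, 0.008]

gain = 4.0

# Hardware path: amplify the analog signal, then quantize.
analog = quantize_8bit([v * gain for v in shadows])

# Software path: quantize first, then multiply the 8-bit values.
digital = [min(255, q * 4) for q in quantize_8bit(shadows)]

print("analog gain :", analog)   # four distinct shadow levels survive
print("digital gain:", digital)  # quantization has already merged pairs
```

Running it shows the analog path keeping four distinct shadow values where the digital path, working from already-quantized data, can only produce two.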
It's also common to have an "auto" button in the driver that applies
purely software adjustments to the initial scanner output in a "level
adjustment" game, as you put it. It's even possible to have a mix;
someone wrote here that adjusting the Epson driver's white and black
points in prescan has the potential to bump the scanner to a more
appropriate light strength during the final scan. How exactly this is
implemented again depends on the scanner in question. If there's no
discrete gain control or "lock exposure" option in the driver, you'll
have to try Vuescan for hardware control or accept some amount of
software adjustment in your driver's output.
You should be able to test your driver's hardware behavior by scanning
different-toned images. Try something that's mostly white, then try
an all-black page. Use a high-res setting to see if there's a
measurable difference in scan times. See whether the data *prior* to
level adjustment is distributed as far to one side of the histogram as
you'd expect with a fixed exposure. You can also try blocking your
scanner's calibration area with something darkish prior to powerup to
see if it causes long scan times and overexposures. Or try other
things I can't think of.
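The histogram part of that test is easy to automate once you can export the raw, pre-adjustment pixels. A rough sketch, with invented sample data standing in for two real test scans:

```python
def mean_level(pixels):
    """Average 8-bit level; a crude stand-in for eyeballing a histogram."""
    return sum(pixels) / len(pixels)

# Invented raw data from two test scans. With a *fixed* exposure, a
# mostly-white page should pile up far right and an all-black page far
# left; hardware auto exposure would pull both toward the midrange.
white_page = [235, 240, 245, 250, 252, 248]
black_page = [5, 8, 10, 12, 6, 9]

print("white page mean:", mean_level(white_page))
print("black page mean:", mean_level(black_page))

# A large spread between the two means suggests the exposure was fixed;
# the threshold of 150 here is an arbitrary illustrative cutoff.
if abs(mean_level(white_page) - mean_level(black_page)) > 150:
    print("looks like a fixed exposure")
```

With real scans you would read the pixel data from the saved files instead of hard-coding lists, but the comparison is the same.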
Whatever else, if you turn off everything "auto" and limit your driver
interaction to setting an appropriate white point (note that I mean
setting the "brightest" WP in the luminance histogram, not sampling a
WP to balance colors), you should get as close to an unadjusted scan
as you're likely to find useful.
false_dmitrii