Firmware hacks for the LS-4000?


fotoobscura

I am looking for information on people that have done firmware hacks on
their Nikon scanners (specifically the 4000 and 8000) in order to
"remove" some built-in "features" that don't impress me.

flames >> /dev/null

Cheers,
sd
 

I'm also looking for similar information in order to turn *on*
features that do impress me but are turned off for marketing reasons
e.g., multiscan on the LS50... ;o)

Unfortunately, not a lot of people work at such a low level.

Since this information is highly proprietary and is not available
outside of Nikon, the only option is reverse-engineering, which,
perversely, may be illegal in some places.

Don.
 
Don said:
I'm also looking for similar information in order to turn *on*
features that do impress me but are turned off for marketing reasons
e.g., multiscan on the LS50... ;o)
In the case of the LS40, NikonScan does not allow multi scan. Vuescan,
however, does.
Jim
 
Jim said:
In the case of the LS40, NikonScan does not allow multi scan. Vuescan,
however, does.
Jim
If it isn't single pass multiscanning it generally isn't worth having.
 
Kennedy McEwen said:
If it isn't single pass multiscanning it generally isn't worth having.
Vuescan implements multipass scanning. You are right, adding more passes
raises registration issues.
Jim
 
In the case of the LS40, NikonScan does not allow multi scan. Vuescan,
however, does.

No, it doesn't!

When single-pass multiscanning is not available VueScan *multi*-pass
"multiscans" which is a very time consuming way to blur an image...

Don.
 
fotoobscura said:
agreed. as opposed to "sampling"

Of course that also would depend on whether you are going to
down-sample the image after capture, which by itself is already
beneficial to the S/N ratio ...

If the (film)scanner has a somewhat decent mechanical construction,
the misregistration may be about one pixel. Flatbed scanners usually
have larger tolerances. Multiple samples will produce a Gaussian type
of blur, which can be reconstructed up to a point by deconvolution,
something that will even work better after downsampling because the
point spread is smaller. You'd also be surprised how much the MTF
benefits from down-sampling.
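
To make this concrete, here is a rough sketch (plain numpy/scipy, just
an illustration of the principle, not anything a scanner driver
actually does): averaging misregistered passes lowers the noise but
smears a sharp edge, and after downsampling the same smear spans fewer
of the (larger) pixels.

import numpy as np
from scipy.ndimage import shift

rng = np.random.default_rng(0)

# A sharp synthetic "scene": a vertical step edge.
scene = np.zeros((200, 200))
scene[:, 100:] = 1.0

# Sixteen "passes", each repositioned with a random sub-pixel error
# and carrying its own noise.
passes = []
for _ in range(16):
    dy, dx = rng.uniform(-0.5, 0.5, size=2)
    frame = shift(scene, (dy, dx), order=1)    # bilinear re-positioning
    frame += rng.normal(0, 0.05, scene.shape)
    passes.append(frame)
avg = np.mean(passes, axis=0)

# Noise drops by roughly sqrt(16), but the edge is smeared over the
# spread of the misregistrations (a roughly Gaussian point spread).
print("noise, one pass :", round(passes[0][:, :50].std(), 4))
print("noise, averaged :", round(avg[:, :50].std(), 4))
print("edge profile    :", np.round(avg[100, 97:104], 3))

# After a 2x reduction the same physical spread covers half as many
# (larger) pixels, so the residual blur is less conspicuous.
small = avg.reshape(100, 2, 100, 2).mean(axis=(1, 3))
print("downsampled edge:", np.round(small[50, 47:53], 3))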

Bart
 
Bart van der Wolf said:
Of course that also would depend on whether you are going to
down-sample the image after capture, which by itself is already
beneficial to the S/N ratio ...

If the (film)scanner has a somewhat decent mechanical construction, the
misregistration may be about one pixel. Flatbed scanners usually have
larger tolerances. Multiple samples will produce a Gaussian type of
blur, which can be reconstructed up to a point by deconvolution,
something that will even work better after downsampling because the
point spread is smaller.

It can't work better after downsampling - but since the target objective
is reduced, it may be possible to get closer to that objective than the
original.

Think about it this way - downsampling is throwing out information. It
can be argued that, given sufficient misaligned samples, the
information lost is small, but it is never zero. Consequently, you are
arguing that a better result can be obtained from less information.

Also, if deconvolution were to work reasonably, USM would also work
pretty well too.
You'd also be surprised how much the MTF benefits from down-sampling.
The MTF doesn't benefit from downsampling Bart, it just has a lower
spatial frequency limit, at which it still has the same (possibly
reduced by the downsampling method) amplitude.
 
Maybe I'm a little confused but I recalled that sampling and doing
"passes" are two different things. Am I wrong here? I seem to recall
having "passes" available on Vuescan with my LS-40 and sampling was
available on my LS-4000 and 8000.

I do 8 pass, sometimes 16 pass sampling with Vuescan and I never see
any blurring..? Even with Long Exposure on, which I'd presume would
exacerbate that effect. Perhaps it's some post-processing (by the
scanner?) you mention that repairs the Gaussian blur.

On another topic I haven't done it yet but am wondering if most people
are outputting straight to raw via Vuescan in order to better control
the image post-processing.

Cheers,
sd
 
Don said:
No, it doesn't!

When single-pass multiscanning is not available VueScan *multi*-pass
"multiscans" which is a very time consuming way to blur an image...

Not always. There are some scanners which do not offer single-pass
multiscanning using the manufacturer's software but they do in VueScan.
I'm absolutely sure my old Minolta Scan Speed was one of those scanners.
As far as I know, this is also true for the Scan Dual II and scanners
from some other brands.
 
fotoobscura said:
Maybe I'm a little confused but I recalled that sampling
and doing "passes" are two different things. Am I wrong
here?

Correct, they are two different things.
There are basically two methods of obtaining multiple samples per film
position. One takes multiple samples of the film with a stationary
scanhead assembly (or stationary film stage), another takes sequential
full image scans (passes) and averages the different scans. The latter
requires accurate re-positioning on exactly the same position, which
will partly fail due to play in the positioning mechanism.
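
As a quick illustration of why either method is worth the effort (a
sketch; the sqrt(N) figure is just the usual statistics of averaging
independent noise, with no repositioning error included):

import numpy as np

rng = np.random.default_rng(1)

# Many repeated samples of the same (noisy) pixel value, as in
# single-pass multi-sampling where nothing is repositioned.
samples = 0.5 + rng.normal(0, 0.05, size=(16, 100_000))

print("one sample :", round(samples[0].std(), 4))            # ~0.05
print("mean of 16 :", round(samples.mean(axis=0).std(), 4))  # ~0.05/4
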
I seem to recall having "passes" available on Vuescan
with my LS-40 and sampling was available on my LS-4000
and 8000.

The correct terminology is multi-sampling (stationary film/lens)
versus multi-scanning (sequential scans/sweeps/passes).
I do 8 pass, sometimes 16 pass sampling with Vuescan
and I never see any blurring..?

That would indicate either multi-sampling, or mechanically very good
multi-scanning. IIRC the LS-40 does the latter. Wear and tear may
cause gradually lower quality registration over time.
Even with Long Exposure on, which I'd presume would
exacerbate that effect.

Yes. Long exposure in VueScan is implemented as two sequential
scans/passes with different exposure.
Perhaps it's some post-processing (by the scanner?) you
mention that repairs the Gaussian blur.

I don't think VueScan does much more than blend the two scans with a
threshold.
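
Something along these lines, I would guess (purely hypothetical code;
the names blend_exposures, gain and threshold are made up, and
VueScan's real logic is not public):

import numpy as np

def blend_exposures(normal, long_exp, gain, threshold=0.05):
    """Hypothetical threshold blend: use the long exposure, rescaled
    by its extra gain, only in the shadow areas where the normal pass
    is too dark to trust."""
    out = normal.astype(float)
    dark = normal < threshold
    out[dark] = long_exp[dark] / gain
    return out
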
On another topic I haven't done it yet but am wondering if
most people are outputting straight to raw via Vuescan in
order to better control the image post-processing.

I think only a small fraction of the huge number of users actively do
post-processing based on Raw / linear-gamma output. That is usually
driven by specific needs that cannot be met otherwise.

Bart
 
Maybe I'm a little confused but I recalled that sampling and doing
"passes" are two different things. Am I wrong here?

It is multiscanning using two distinct processes - obtaining multiple
samples in a single pass, where the scanner has that capability, and
obtaining multiple samples in separate passes where it doesn't. The
intention is the same - to reduce the noise (not necessarily to
increase the scan density) by averaging several samples of each point.
Obviously, if the samples do not perfectly align, whether in a single or
multiple pass, then the image will blur to the extent of the
misalignment. That misalignment can be zero on a single pass scanner,
but relies on the mechanical precision and stability of the unit on a
multiple pass system.
I seem to recall
having "passes" available on Vuescan with my LS-40 and sampling was
available on my LS-4000 and 8000.
Yes.

I do 8 pass, sometimes 16 pass sampling with Vuescan and I never see
any blurring..?

Whether you see it and whether it is there are two completely different
things.

The LS-40 will misalign separate passes, but usually the amplitude of
the misalignment is less than a single pixel. It can be a lot worse
than this though, and depends on many variables, including the starting
temperature of the unit which can, for example, cause expansion of the
head drive worm over several optical resolution cycles.

With a good multipass scan, with sub-pixel misregistration, you will
only see the effect if you are scanning at the full optical resolution
of the unit - 2900ppi. Even then, you will need to have detail of that
resolution present on the original film for its resolution to be
degraded in the first place. Then there is the question of whether you
can see that and distinguish it from the improved noise level - in other
words, understand the effect that you are observing. Grain loss from
misalignment can look very deceptively like reduced noise - and it is
only when you notice the lost resolution that the price of that
improvement comes into perspective.
Even with Long Exposure on, which I'd presume would
exacerbate that effect.

Not necessarily.
Perhaps it's some post-processing (by the scanner?)
you mention that repairs the Gaussian blur.
No, there is no post processing performed in any of the multiscanning
operations. Bart was alluding to processing that you could apply to
reduce the effect. The mathematics of deconvolution works against you
since you would actually end up with a worse noise level after
deconvolution than you had in the single non-multipass scan if you were
to completely compensate for the gaussian blur. If you are lucky - and
that is a big IF - there might be some interim stage where some
advantage of both processes result in an overall improvement.
 
Kennedy McEwen said:
It can't work better after downsampling - but since the target
objective is reduced, it may be possible to get closer to that
objective than the original.

That's what I meant with 'better'. Deconvolving at the original size
will only emphasize the lack of alignment. With the additional loss of
accuracy introduced by down-sampling (note the "up to a point"), there
is a larger chance of restoring a 'meaningful' signal (with less
registration error), albeit at a reduced resolution (which would be
a non-issue if the image were to be reduced anyway, e.g. for
web-publishing).
Think about it this way - downsampling is throwing out information.
It can be argued that, given sufficient misaligned samples, the
information lost is small, but it is never zero. Consequently, you
are arguing that a better result can be obtained from less
information.

No, that is not what I meant, not in absolute terms anyway, because
resolution is sacrificed. The improvement is in a better looking
image, better in less obvious registration errors.

As a side note, there are solutions where several dithered samples can
deliver higher resolution than the individual undersampled images can
provide. You may be familiar with a process called "drizzling", used
in astronomy (http://xxx.lanl.gov/abs/astro-ph/9808087).
Also, if deconvolution were to work reasonably, USM would also work
pretty well too.

Yes, but again after losing absolute resolution in the downsampling
process, of which I'm fully aware.
There are also more advanced methods than USM (which tends to enhance
noise).
Adaptive Richardson-Lucy restoration
(http://www.mathworks.com/access/helpdesk/help/toolbox/images/deblurr7.html)
is quite effective in my experience, but it's not a miracle cure (you
can't restore what's *really* lost).
There are also other methods being developed
(http://vela.astro.ulg.ac.be/themes/dataproc/deconv/articles/deconv/deconv.html).
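
A minimal sketch of the idea with scikit-image's richardson_lucy
(assuming the blur can be approximated by a small Gaussian PSF, which
in practice has to be estimated from the image):

import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import restoration

rng = np.random.default_rng(2)

# Synthetic "scan": a sharp edge, blurred by a 0.5 px Gaussian, plus noise.
scene = np.zeros((200, 200))
scene[:, 100:] = 1.0
blurred = gaussian_filter(scene, 0.5) + rng.normal(0, 0.01, scene.shape)

# A small Gaussian PSF with the same sigma. In practice the PSF is not
# known exactly, which is where the tweaking comes in.
y, x = np.mgrid[-3:4, -3:4]
psf = np.exp(-(x**2 + y**2) / (2 * 0.5**2))
psf /= psf.sum()

# 30 Richardson-Lucy iterations; more iterations sharpen further but
# also amplify the noise.
restored = restoration.richardson_lucy(np.clip(blurred, 0, 1), psf, 30)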

The MTF doesn't benefit from downsampling Bart, it just has a lower
spatial frequency limit,

Lower spatial frequency limit, of course.
at which it still has the same (possibly reduced by the downsampling
method) amplitude.

Time for a little experiment.
First image is the MTF of a (0.5 pixel radius) Gaussian blurred
slanted edge with 4% (PS uniform) noise added,
<http://www.xs4all.nl/~bvdwolf/temp/GB05_WhiteNoise4pct_Y_cpp.png>.
Second image is the MTF of the same target image, downsampled with
ImageMagick's Sinc filter, to 50% in both dimensions,
<http://www.xs4all.nl/~bvdwolf/temp/GB05_WhiteNoise4pct_50pctIM_Y_cpp.png>.
The MTF50 (the black line), for example, improved from 0.246 cy/px to
0.393 cy/px. So despite the loss of resolution, the modulation
improved significantly. The aliasing tendency for the highest spatial
frequencies also reduced in this case, although it will increase with
more significant downsampling.
The benefit for the S/N ratio should also be obvious (2.4x better
S/N):
http://www.xs4all.nl/~bvdwolf/temp/Noise_before_and_after_Downsampling.png
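
The noise part is easy to reproduce without ImageMagick; a plain 2x2
block average (a box filter, so the figure comes out nearer 2x than
the 2.4x of the Sinc filter) already shows the effect:

import numpy as np

rng = np.random.default_rng(3)

# A flat grey patch with 4% noise, standing in for a uniform film area.
patch = 0.5 + rng.normal(0, 0.04, (1000, 1000))
print("noise before:", round(patch.std(), 4))

# 50% reduction by 2x2 block averaging.
half = patch.reshape(500, 2, 500, 2).mean(axis=(1, 3))
print("noise after :", round(half.std(), 4))   # about half, i.e. ~2x S/N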

This is all under the assumption that we don't want to enlarge the
image, and a 50% reduction will still allow a decent output size.
Scanning at 5400 ppi and downsampling to 50% will still allow 8.5x12.8
inch (21.6x32.4 cm) output at 300ppi without interpolation.

Bart
 
fotoobscura said:
Maybe I'm a little confused but I recalled that sampling and doing
"passes" are two different things. Am I wrong here? I seem to recall
having "passes" available on Vuescan with my LS-40 and sampling was
available on my LS-4000 and 8000.

I do 8 pass, sometimes 16 pass sampling with Vuescan and I never see
any blurring..? Even with Long Exposure on, which I'd presume would
exacerbate that effect. Perhaps it's some post-processing (by the
scanner?) you mention that repairs the Gaussian blur.

On another topic I haven't done it yet but am wondering if most people
are outputting straight to raw via Vuescan in order to better control
the image post-processing.

Cheers,
sd

Hello

If anyone is interested in downloading a large file, I have a demo .PSD
that shows how good the LS30, 40 and 50 family of scanners is in terms
of scan accuracy. The scans were done in NikonScan without closing the
TWAIN interface.
You can also see what effect 9 samples has in terms of noise.
You can turn off the samples by ALT-clicking the base image. Double-click
a sample to see the blending and opacity.
The area is a crop from a 2700 dpi scan.

http://www.btinternet.com/~mike.engles/mike/Multiscan.zip

Mike Engles
 
Bart van der Wolf said:
That's what I meant with 'better'.

A somewhat misleading term - akin to claiming better service by reducing
the completion time for jobs by not doing them as well. ;-)

If you follow that logic then you can downsample the image to 1x1 pixels
and get a perfect result because it is impossible to get closer to the
limit for that resolution!
Deconvolving at the original size will only emphasize the lack of
alignment.

Not at all. As you stated, when sufficient randomly misaligned images
are averaged the effect is the same as a gaussian blur. Deconvolution
with the appropriate gaussian will, in theory, perfectly recover the
original image free from any blur. In practice, unless a huge number of
frames is so averaged, the noise introduced by the deconvolution process
will exceed the noise reduction of the averaging in the first place, so
you don't actually win anything - which is what I understood by your
comment "up to a point". However, deconvolution at full resolution
certainly doesn't emphasise the lack of alignment, it eliminates it. In
fact, there is some argument for applying the process after up-sampling,
so that the deconvolution filter of the sub-pixel misalignment can be
more accurately estimated.
With the additional loss of accuracy introduced by down-sampling (note
the "up to a point"), there is a larger chance of restoring a
'meaningful' signal (with less registration error)

Only because you have reduced the sampling density to the point where
the error is insignificant in the first place!
No, that is not what I meant, not in absolute terms anyway, because
resolution is sacrificed. The improvement is in a better looking image,
better in less obvious registration errors.
The improvement *is* that the registration errors are less significant
at the lower resolution because the information that describes that
misregistration is thrown away in the downsampling.
As a side note, there are solutions where several dithered samples can
deliver higher resolution than the individual undersampled images can
provide. You may be familiar with a process called "drizzling", used in
astronomy (http://xxx.lanl.gov/abs/astro-ph/9808087).
After a brief overview of it, I am surprised that it managed to get
past peer review! The technique described appears to be no more than a
variation of a very common technique, known as microscan, which has
been practised in the imaging industry for many years and actually
implemented on the HST that they reference.

In many systems small but controlled misregistrations are deliberately
introduced into the subsampled image so that a supersampled image can be
composed from a sequence of frames. In other systems, the
misregistration is permitted to occur randomly but is measured, and thus
the supersampled composite can be produced. Production of the
supersampled image in such cases has used the variable reconstruction
pixel in every case I have ever seen where the image displacement is
uncontrolled.
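
For the controlled case the idea is easy to sketch (a toy example that
ignores the detector aperture and noise): frames taken with known
sub-pixel offsets interleave directly into a denser grid.

import numpy as np

rng = np.random.default_rng(4)

# A fine-detail "scene" at the grid density we want to recover.
scene = rng.random((100, 100))

# Four undersampled frames, each displaced by a known half-pixel step
# of the coarse grid (one pixel of the fine grid).
frames = {(dy, dx): scene[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)}

# With the displacements known, the frames interleave directly into a
# composite at the full grid density.
composite = np.empty_like(scene)
for (dy, dx), frame in frames.items():
    composite[dy::2, dx::2] = frame

print("exact recovery:", np.allclose(composite, scene))  # True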

Either way, there is nothing relevant in that paper nor in the process
to what is going on here with misregistered multisampling, because there
is no method of determining the misregistration *prior* to the averaging
being computed in the case of multipass multisampling scans.
Yes, but again after losing absolute resolution in the downsampling
process, of which I'm fully aware.

Not necessarily. You don't *need* to downsample - all that is doing is
throwing resolution information away for the benefit of SNR. In fact,
if deconvolution worked, and it would if you had enough passes, then you
would actually benefit from upsampling for the reasons provided above.
There are also more advanced methods than USM (which tends to enhance
noise).

There are lots, and they would all work just as well, and in some
cases better, than deconvolving with a gaussian. Deconvolution is an
exceptionally noisy process, primarily due to the inevitable presence of
zeros, or near zeros, in the filter matrix.
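
A quick numerical illustration of that noise amplification (a sketch
using a naive inverse filter; Wiener and Richardson-Lucy methods exist
precisely to tame this):

import numpy as np

rng = np.random.default_rng(5)

# A sharp edge, blurred by a Gaussian PSF, with a little noise added.
scene = np.zeros((256, 256))
scene[:, 128:] = 1.0

y, x = np.mgrid[-128:128, -128:128]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()
otf = np.fft.fft2(np.fft.ifftshift(psf))   # transfer function of the blur

blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * otf))
blurred += rng.normal(0, 0.01, scene.shape)

# Naive inverse filter: divide the spectrum by the transfer function.
# Where that function is nearly zero the flat noise spectrum is divided
# by almost nothing, and the amplified noise swamps the result.
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) / otf))
print("noise in blurred image    :", round(blurred[:, 20:100].std(), 4))
print("noise after inverse filter:", round(float(restored[:, 20:100].std()), 2))
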
Lower spatial frequency limit, of course.


Time for a little experiment.

As with sharpening filters there are many downsampling algorithms each
producing a different MTF WITHIN THE PASS BAND OF THE DOWNSAMPLED
OUTPUT! Of course, some will have the effect of boosting MTF within
that pass band compared to the original - even simple filters can do
that. NONE, and certainly none of the examples you have provided, can
retain let alone enhance data which exceeds the pass band of the output.
Consequently, if useful information lies in that region - and it should
because otherwise why bother multisampling a high resolution image in
the first place? - it is lost. Nothing can recover that. Yes, you will
be closer to the limit of what can practically be contained in the
downsampled resolution, but so what? The whole point of multisampling
is to increase the signal to noise ratio without losing resolution. If
you are prepared to downsample then use a low pass filtering algorithm
on a single pass and get the benefit of noise reduction from that.
Halve the image size and double the SNR with the appropriate algorithm.
This is all under the assumption that we don't want to enlarge the
image, and a 50% reduction will still allow a decent output size.

But what if it won't? I don't think we have stated anywhere that we are
working to such assumptions. If we were, then there are far easier ways
of getting the required results without multisampling. For example, a
lower noise 800x600 image perfectly suitable for web display can be
achieved from a single pass scan at 4000ppi by suitable downsampling
than can be achieved by 16x multisampling if the full resolution is
retained. In fact, you would need about 40x multisampling to achieve
the same SNR in the full resolution image.
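
A back-of-envelope check of that figure (assuming a 24x36 mm frame)
lands in the same region:

# 24x36 mm frame scanned at 4000 ppi.
width_px = round(36 / 25.4 * 4000)    # ~5669
height_px = round(24 / 25.4 * 4000)   # ~3780

# Downsampling to 800x600 averages this many source pixels into each
# output pixel; matching that S/N gain at full resolution needs about
# the same number of multisampled passes.
ratio = (width_px * height_px) / (800 * 600)
print(round(ratio))             # ~45 passes
print(round(ratio ** 0.5, 1))   # ~6.7x S/N improvement either way
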
Scanning at 5400 ppi and downsampling to 50% will still allow 8.5x12.8
inch (21.6x32.4 cm) output at 300ppi without interpolation.
Bart, have you read the title of the thread? Perhaps you have
information of a soon to be re-released LS-4000 that samples at 5400ppi?
You may have overstepped the terms of your NDA with Nikon if you have
such information. ;-)
 
SNIP
This is all under the assumption that we don't want to enlarge the
image, and a 50% reduction will still allow a decent output size.

And even if we consider a direct comparison at equal size, the loss
is less than 50%, thanks to the better S/N ratio and sharpening
potential:
<http://www.xs4all.nl/~bvdwolf/temp/50pct_IPRL.png>
If I were to tweak the PSF a bit more, the result after restoration
could be slightly improved, and the S/N better still.

Bart
 