Stupid "Warming up the lamp" message and delay

  • Thread starter: richardsfault

richardsfault

A major annoyance with using the otherwise generally agreeable HP-3970
is this useless message and delay.

Does a lamp really have to warm up? We're not talking mercury vapor
lighting here!
------------------------------------------------------------------------------

Some people claim that there's a woman to blame, but I think it's all...

Richard's fault!

Visit the Sounds of the cul-de-sac at www.richardsfault.com
 
richardsfault said:
A major annoyance with using the otherwise generally agreeable HP-3970
is this useless message and delay.
It isn't useless - it is there for a reason.
Does a lamp really have to warm up?
Yes.

We're not talking mercury vapor
lighting here!

Correct, you are usually talking about a gas-discharge lamp, typically
xenon. More importantly, though, you are also talking about accurate
and consistent colour and luminance output from the lamp during the scan,
and even from one scan to the next. The first thing the scanner will
usually do is perform a black and white point calibration - but there
wouldn't be much point in the white calibration if the plasma in
the lamp hadn't formed uniformly across its length and stabilised. If
you have ever watched a standard fluorescent lamp start up, you will
know the sort of thing that happens. Well, even though the lamp in the
scanner runs from a higher-frequency supply than plain AC mains,
the output still changes for some time after power is first applied. It
might not be noticeable to your eyes, which are themselves still adjusting
to the change of brightness, but to the CCD in the scanner the
change can be enormous, and it increases as the lamp ages. That delay
ensures that the lamp will have settled to a uniform light output level,
even at its end of life.
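The sequence described here - wait for the light output to settle, then calibrate - can be sketched as a stabilisation loop. Everything below is hypothetical (no public HP-3970 driver API is being quoted); the simulated lamp merely stands in for real CCD line readings:

```python
import statistics

def wait_for_lamp_stable(read_line, tolerance=0.005, window=5,
                         max_samples=500):
    """Poll the sensor until the last `window` readings agree within
    `tolerance` (relative spread); only then is white calibration
    meaningful. Returns the number of samples taken, or None on timeout."""
    readings = []
    for _ in range(max_samples):
        readings.append(read_line())
        if len(readings) >= window:
            recent = readings[-window:]
            spread = (max(recent) - min(recent)) / statistics.fmean(recent)
            if spread < tolerance:
                return len(readings)  # output has settled; safe to calibrate
    return None  # never settled - lamp or supply fault

def make_simulated_lamp():
    """Hypothetical lamp model: brightness rises toward 1.0 like a
    warming tube (exponential approach to full output)."""
    t = 0
    def read_line():
        nonlocal t
        t += 1
        return 1.0 - 0.5 * (0.8 ** t)
    return read_line
```

Real firmware would poll the CCD itself, but the shape is the same: calibration is gated on the spread of recent readings rather than a fixed delay - although a fixed worst-case delay is cheaper to implement, which is roughly what the "Warming up the lamp" pause amounts to.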
------------------------------------------------------------------------------

Some people claim that there's a woman to blame, but I think it's all...

Richard's fault!

Visit the Sounds of the cul-de-sac at www.richardsfault.com

Richard, your signature delimiter is broken. The standard is two dashes
and a space, not a line of dashes - that way, standard-compliant software
recognises your signature and cuts it from the response. If you force
everyone to manually edit your posts just to reply, you will get
fewer responses.
 
Kennedy said:
It isn't useless - it is there for a reason.


Yes.

Actually: not really. The usual cathode fluorescent lamps have a more
or less constant light output - as long as the power input is constant.
The 'warming up' during the first minutes does not really make a
difference.
but there
wouldn't really be much point in the white calibration if the plasma in
the lamp hadn't uniformly formed across its length and stabilised.

Which happens in the first milliseconds after switching on. Xenon is a
gas and (assuming normal temperatures) already uniformly distributed.
If
you have ever watched a standard fluorescent lamp start up, you will
know the sort of thing that happens. Well, even though the lamp in the
scanner is running from a higher frequency supply than just AC mains,
the output still changes for some time after power is first applied.

Indeed. And the reason is not the lamp itself, but the power supply.
Usually, it provides a few milliamperes at a high voltage and high
frequency. The light emitted by the lamp depends strongly on the voltage
and the current. And after the power generator for the lamp is switched
on, its output changes for a certain period of time until all parts have
reached a stable temperature.

This _could_ be handled by using a more complex design and better
components, but why double the cost if the problem can be solved
without investing anything? (The user's time doesn't count for the
manufacturer - just like Microsoft.)

If you don't believe me (I'm sure you won't, as usual), just cool down a
lamp and you'll not see any change, but cool down its power supply and
you'll see a drastic change in the lamp output.

For the original poster: the delay is useful, and the message is not
useless, it is just misleading. But 'lamp warmup' is easier to understand
than 'lamp high-voltage power supply warmup'.

Grossibaer
 
Jens-Michael Gross said:
Actually: not really. The usual cathode fluorescent lamps have a more
or less constant light output - as long as the power input is constant.
The 'warming up' during the first minutes does not really make a
difference.

As you say, as long as the power has stabilised - and some other things
too. But "warming up" is a general term for stabilising, and the issue
here is that the light output has to stabilise before the calibration
can even begin, let alone the scan proper.
Which happens in the first milliseconds after switching on. Xenon is a
gas and (assuming normal temperatures) already uniformly distributed.
It certainly is evenly distributed but, unfortunately, that also means
it is very convective. It takes several seconds for the ionised gas to
stabilise, which can mean some parts of the lamp's length flickering for
a few seconds until it does so. This is why an older lamp, in which the
low-pressure xenon has deteriorated through either vacuum leakage or
outgassing of the cathode etc., takes longer to stabilise - nothing
whatsoever to do with the power supply, which can be brand new and
working perfectly with another lamp.
Indeed. And the reason is not the lamp itself, but the power supply.

Not in the normal fluorescent lamp I was using for comparison.
Usually, it provides a few milliamperes at a high voltage and high
frequency.
The light emitted by the lamp highly depends on the voltage and the
current.
And after switching on the power generator for the lamp, it changes its
output for a certain period of time until all parts have reached a
stable temperature.
That is certainly true as well - the lamp is merely one part of the
illumination system.
This _could_ be handled by using a more complex design and better
components, but why double the cost if the problem can be solved
without investing anything? (The user's time doesn't count for the
manufacturer - just like Microsoft.)
It's a feature - they are giving the user the opportunity to do something
else, such as put the film on the scanner bed or sip some coffee or even
something a little stronger, while the scanner gets ready to do its
thing. The wait is nothing compared to the scan time.
If you don't believe me (I'm sure you won't, as usual), just cool down a
lamp and you'll not see any change, but cool down its power supply and
you'll see a drastic change in the lamp output.
You'll see a dramatic change in the stabilisation time for the lamp if
you just cool that down and leave the power supply at ambient. You will
also find the lamp life increases significantly.
 
Kennedy said:
It certainly is evenly distributed but, unfortunately, that also means
it is very convective. It takes several seconds for the ionised gas to
stabilise, which can mean some parts of the lamp's length flickering for
a few seconds until it does so. This is why an older lamp, in which the
low-pressure xenon has deteriorated through either vacuum leakage or
outgassing of the cathode etc., takes longer to stabilise - nothing
whatsoever to do with the power supply, which can be brand new and
working perfectly with another lamp.

The flickering in the first few seconds (and the heavy flicker with
dropouts with worn-out lamps) is usually a result of an insufficient
supply voltage. To have proper ionisation, you'll need a few kilovolts.
To make things cheaper (e.g. to run normal neon lamps without a
high-voltage converter at 110V), the 'starter' injects free
electrons by heating a wire inside one end of the lamp. This reduces
the voltage required to start the ionisation process. If the wire is
worn out, it will not emit enough electrons to ensure proper
functioning, and the light will go out again after a short flash. Then
the starter will try again and the next flash appears. Even if it's
enough to light the lamp, it might not be enough to allow constant
emission. 110V is AC; this means the voltage drops to zero every 1/120
second, then changes polarity (and therefore the direction of the
current flow and of the ionisation), then rises again to 155V, etc.
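The figures above are consistent: 110V is an RMS value, so the peak of the sine wave is 110 x sqrt(2), roughly 155.6V, and at 60Hz mains the voltage passes through zero twice per cycle, i.e. every 1/120 second:

```python
import math

RMS_VOLTAGE = 110.0  # US mains voltage (RMS)
MAINS_HZ = 60.0      # US mains frequency

# The peak of a sine wave is sqrt(2) times its RMS value.
peak = RMS_VOLTAGE * math.sqrt(2)

# A sine wave crosses zero twice per cycle.
zero_crossing_interval = 1.0 / (2 * MAINS_HZ)

print(f"peak = {peak:.1f} V")                    # peak = 155.6 V
print(f"zero crossing every {zero_crossing_interval:.6f} s")  # 1/120 s
```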

If you have a proper power supply with a stable current at a high-enough
DC voltage, you'll have a uniformly distributed light emission from the
first moment on. If the lamp loses gas because of ageing, it still doesn't
need to warm up: the gas is still uniformly distributed in the tube, just
less dense, and requires a higher voltage to provide the same light
emission.

This behaviour is obvious if you know the inner physics of a Xenon or
Neon tube.

Anyway, since the power supplies are built as cheaply as possible, warm-up
is required. For the user it doesn't make a difference what the reason
for it is.
Not in the normal fluorescent lamp I was using for comparison.

Right - because of the need for electron injection to make it work
with only 155V peak voltage.
It's a feature - they are giving the user the opportunity to do something
else, such as put the film on the scanner bed or sip some coffee or even
something a little stronger, while the scanner gets ready to do its
thing. The wait is nothing compared to the scan time.

Well, the coffee is the better argument (or not, since I don't drink
coffee ;) )

Usually, one puts the source image into the scanner, clicks on 'scan'
and then has to wait for warmup (unless one has just switched the
scanner on - if the scanner has a switch at all, as most USB scanners
are switched on the moment the scanner software wants to start scanning).
And, indeed, on most of today's scanners (parallel or USB), scanning is
SLOOOOOOOW compared with some old but quick SCSI scanners.
On my old Agfa the warmup takes much more time than the preview scan (or
even the actual scan).

You'll see a dramatic change in the stabilisation time for the lamp if
you just cool that down and leave the power supply at ambient. You will
also find the lamp life increases significantly.

Well, if you cool it down by a significant amount, of course you're
lowering the mobility of the gas (which is the definition of
temperature) and therefore counteracting the ionisation and reducing
the loss of gas at the same time.
But how many degrees does the lamp warm up in the warmup phase? It's
called a 'cold light tube' for a reason ;)
The temperature of the tube will increase by 10 or 20 degrees after a
longer period of use (roughly a 5% change in absolute temperature), but
not in the warmup phase. And this is handled by a repeated calibration,
done after some time, in every good scanner.
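That "repeated calibration done after some time" can be as simple as an elapsed-time check. The interval and the `CalibrationScheduler` name below are purely illustrative, not taken from any real scanner firmware:

```python
import time

# Assumed figure: recalibrate every 10 minutes of continuous use.
RECAL_INTERVAL_S = 600.0

class CalibrationScheduler:
    """Decide when to re-run black/white point calibration, so slow
    drift (e.g. tube temperature rising over a long session) gets
    corrected without a full warmup pause each time."""

    def __init__(self, interval=RECAL_INTERVAL_S, clock=time.monotonic):
        self.interval = interval
        self.clock = clock      # injectable for testing
        self.last_cal = None    # None means we have never calibrated

    def due(self):
        # Calibration is due on first use, or once the interval elapses.
        return (self.last_cal is None
                or (self.clock() - self.last_cal) >= self.interval)

    def mark_calibrated(self):
        self.last_cal = self.clock()
```

A driver would check `due()` before each scan and recalibrate only when it returns True, which matches the behaviour described above: one warmup at power-on, then cheap periodic touch-ups.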

Grossibaer
 