Kennedy said:
It certainly is evenly distributed but, unfortunately, that also means
it is very convective. It takes several seconds for the ionised gas to
stabilise, which can mean some parts of the lamp length flickering for a
few seconds until it does so. This is why an older lamp, in which the
low pressure Xenon has deteriorated through either vacuum leakage or
outgassing of the cathode etc., takes longer to stabilise - nothing
whatsoever to do with the power supply, which can be brand new and
working perfectly with another lamp.
The flickering in the first few seconds (and the heavy flicker with
dropouts on worn-out lamps) is usually a result of an insufficient
supply voltage. For proper ionisation, you'll need a few kilovolts.
To make things cheaper (e.g. to run normal neon lamps at 110V without a
high-voltage converter), the 'starter' injects free electrons by heating
a wire inside one side of the lamp. This reduces the voltage required to
start the ionisation process. If the wire is worn out, it will not emit
enough electrons to ensure proper functioning and the light will go out
again after a short flash. Then the starter will try again and the next
flash appears. Even if it's enough to light the lamp, it might not be
enough to allow a constant emission.
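Just to illustrate that flash-and-retry behaviour, here's a toy sketch
in Python (all numbers and thresholds are invented; it's not a model of
any real starter circuit):

emission_to_strike = 1.0    # emission needed to light the lamp at all (arbitrary units)
emission_to_sustain = 1.5   # emission needed to keep the discharge going

def filament_emission(wear):
    # Electron emission of the heated wire; drops as the wire wears out.
    return 2.0 * (1.0 - wear)

wear = 0.4                  # 0 = new wire, 1 = completely worn out
for attempt in range(1, 6):
    e = filament_emission(wear)
    if e >= emission_to_sustain:
        print(f"attempt {attempt}: lamp strikes and stays lit")
        break
    elif e >= emission_to_strike:
        print(f"attempt {attempt}: short flash, then the light goes out again")
    else:
        print(f"attempt {attempt}: not even a flash")

With a worn wire the emission lands between the two thresholds, so you
get flash after flash but never a constant emission - exactly the
symptom described above.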
110V is AC, which means the voltage drops to zero every 1/120 second,
then changes polarity (and therefore the direction of the current flow
and of the ionisation), then rises again to the 155V peak, and so on.
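Just to show where the 1/120 second and the 155V come from (a quick
back-of-the-envelope check in Python, assuming 60 Hz mains):

import math

v_rms = 110.0                      # nominal mains voltage (RMS)
freq = 60.0                        # mains frequency in Hz

v_peak = v_rms * math.sqrt(2)      # peak of the sine wave
half_cycle = 1.0 / (2.0 * freq)    # the voltage passes through zero twice per cycle

print(f"peak voltage: {v_peak:.1f} V")            # about 155.6 V
print(f"zero crossing every {half_cycle:.4f} s")  # = 1/120 s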
If you have a proper power supply delivering a stable current at a
high-enough DC voltage, you'll have a uniformly distributed light
emission from the first moment on. If the lamp loses gas because of
ageing, it still doesn't need to warm up. The gas is still uniformly
distributed in the tube, just at a lower density, and it requires a
higher voltage to provide the same light emission.
This behaviour is obvious if you know the inner physics of a Xenon or
Neon tube.
Anyway, since the power supplies are built as cheaply as possible,
warm-up is required. For the user it doesn't make a difference what the
reason for it is.
Not in the normal fluorescent lamp I was using for comparison.
Right - because of the need for electron injection to make it work with
only 155V peak voltage.
It's a feature - they are giving the user the opportunity to do something
else, such as put the film on the scanner bed or sip some coffee or even
something a little stronger, while the scanner gets ready to do its
thing. The wait is nothing compared to the scan time.
Well, the coffee is the better argument (or not, since I don't drink
coffee).
Usually, one puts the source image into the scanner, clicks on 'scan'
and then has to wait for warmup (unless one has just switched the
scanner on - if the scanner has a switch at all, since most USB scanners
are switched on the moment the scanner software wants to start scanning).
And, indeed, on most of today's scanners (parallel or USB), scanning is
SLOOOOOOOW compared with some old but quick SCSI scanners.
On my old Agfa the warmup takes much more time than the preview scan (or
even the actual scan).
You'll see a dramatic change in the stabilisation time for the lamp if
you just cool it down and leave the power supply at ambient. You will
also find the lamp life increases significantly.
Well, if you cool it down by a significant amount, of course you're
lowering the mobility of the gas (which is the definition of
temperature) and therefore counteracting the ionisation and reducing the
loss of gas at the same time.
But how many degrees does the lamp warm up in the warmup phase? It's
called a 'cold light tube' for a reason.
The temperature of the tube will increase by 10 or 20 degrees after a
longer period of usage (which is about a 5% change in absolute
temperature - see the quick calculation below), but not in the warmup
phase. And this is handled by a repeated calibration done after some
time in every good scanner.
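For the curious, the 5% figure refers to the absolute (Kelvin) scale - a
quick sanity check, assuming the tube starts at roughly room temperature:

t_ambient_k = 293.0   # roughly 20 degrees C, expressed in kelvin
delta_k = 15.0        # middle of the 10-20 degree increase mentioned above

print(f"relative change: {delta_k / t_ambient_k:.1%}")   # about 5%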
Grossibaer