Rob said:
Pardon the simple question...
Is there any simple rule to relate dpi specs on a scanner (i.e., on a
35mm negative) to "megapixels" on today's digital cameras?
Not really.
The dpi rating of a scanner tells you how many samples are collected per
inch in each direction. It should really be called ppi for pixels per
inch. If you know the ppi rating and the dimensions of the source, you
can compare that with the pixel dimensions of a digital camera image,
but it would be like comparing apples and oranges.
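To put rough numbers on it, assume a full 35mm frame of 36 x 24 mm (about
1.42 x 0.94 inches); the frame size and ppi figures below are just
illustrative assumptions. A quick Python sketch:

MM_PER_INCH = 25.4

def scan_pixels(ppi, width_mm=36.0, height_mm=24.0):
    # Convert a scanning resolution (ppi) into pixel dimensions and megapixels.
    w = round(ppi * width_mm / MM_PER_INCH)
    h = round(ppi * height_mm / MM_PER_INCH)
    return w, h, w * h / 1e6

for ppi in (3200, 4000):
    w, h, mp = scan_pixels(ppi)
    print(f"{ppi} ppi -> {w} x {h} pixels, about {mp:.1f} MP")

# 3200 ppi -> 4535 x 3024 pixels, about 13.7 MP
# 4000 ppi -> 5669 x 3780 pixels, about 21.4 MP

Those megapixel figures only describe how many samples were taken, not how
much real detail the film and the scanner optics actually resolved.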
A scanner scans separately for each color, so you get three independent
records for each pixel. Also, newer scanners tend to scan in 16 bits
per channel, so each pixel gives you 48 bits of information.
With one exception, each sensor in a digital camera records only one
color. From that information, the camera's firmware constructs three
separate records, one for each color. But that involves an interpolation
algorithm, and it can't create information where none existed to start.
Also, I believe that most current digital cameras still operate in 8 bits
per channel. Each sensor gives you just 8 (or possibly 16) bits.
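To make the information-per-pixel point concrete, here is a small sketch.
It takes 16 bits per channel for the scanner and 8 bits per photosite for
the camera, the typical figures mentioned above rather than specs of any
particular model, and the pixel count is just illustrative:

def measured_bits(pixels, channels_measured, bits_per_channel):
    # Bits actually sampled by the hardware, before any interpolation.
    return pixels * channels_measured * bits_per_channel

pixels = 13_700_000  # roughly a 3200 ppi 35mm scan, or a 13.7 MP camera image
scanner_bits = measured_bits(pixels, channels_measured=3, bits_per_channel=16)
camera_bits = measured_bits(pixels, channels_measured=1, bits_per_channel=8)
print(scanner_bits / camera_bits)  # 6.0 -- six times as many measured bits

The camera's final file still contains three channels per pixel, but two of
the three were interpolated rather than measured.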
Also, what size files result (jpg compressed at high quality) from a
3200 dpi or 4000 dpi 35mm scan?
It depends on the quality setting, the compression algorithm, and the
specific nature of the image. In any case, you shouldn't really edit JPEG
files; they should be converted to TIFF or some other lossless format
first. Generally, however, a scan and a camera image with the same pixel
dimensions will produce JPEG files of roughly the same size.
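For a feel for the absolute numbers, here is a rough sketch. The 8:1
compression ratio is only an assumption about typical high-quality JPEG
behaviour; real results vary a lot with image content:

def file_size_mb(pixels, bytes_per_pixel, compression_ratio=1.0):
    # Approximate size in megabytes after a fixed-ratio compression.
    return pixels * bytes_per_pixel / compression_ratio / 1e6

pixels_4000 = 5669 * 3780  # roughly 21.4 MP from a 4000 ppi 35mm scan
tiff_16bit = file_size_mb(pixels_4000, bytes_per_pixel=6)                       # ~129 MB uncompressed
jpeg_high  = file_size_mb(pixels_4000, bytes_per_pixel=3, compression_ratio=8)  # very roughly 8 MB
print(round(tiff_16bit), round(jpeg_high))

So under these assumptions a high-quality JPEG from a 4000 dpi scan lands
somewhere in the single-digit to low-double-digit megabyte range, while the
lossless 16-bit TIFF you would actually edit is an order of magnitude larger.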