Scanning Film

  • Thread starter: measekite
Wayne Fulton said:

Jeez Wayne, didn't I just tell you that the advertising data is often
wrong? If you really want to check, download the darned user manual and
read what it says there at the top of Page 93 (under Specifications) and
throughout the book.

For that matter, download a copy of the archaic NS-1 software (still
available on the Nikon site as v1.63) and see what that used in both
application and manual. It isn't new usage - it has been that way since
I first used Nikon products.
That is simply how the real world actually is (I'd guess 90%). Whether you
like it or not is not the issue.

As I have said before, it is not a matter of whether I like it or not,
it is a matter of whether it is correct. Your argument is completely
circular along the lines of
"99% of the population drink contaminated water and that causes disease.
We should teach them they should only drink clean water - and explain
that those less fortunate still drink from contaminated supplies.
But 99% of the population drink contaminated water so we must continue
to use contaminated supplies."

In this case, the problem is regular confusion in users between dots and
pixels, and one of the causes is the common misuse of the term dpi
instead of ppi. Nobody has disputed that dpi is a commonly used term,
and it is often used correctly. However your circular argument boils
down to encouraging its misuse on the justification that the misuse
exists. If you (in particular!) made an effort to use the correct terms
in your publications and we (as a group) encouraged their use whilst
discouraging misuse then the problem would soon be eradicated and the
percentages you quote would inevitably change to reflect the situation.

Needless to say, if Public Health Authorities used your logic, our water
supplies would not have improved at all in the past 2 millennia and
cholera would regularly break out in all major population centres.
Beginners certainly need to have this usage
explained, because they are going to see it everywhere.

As I said, a better method of doing this is to use the correct
terminology whilst explaining that it is often misrepresented by
numpties using the wrong terms.
Telling them that it
can only mean something it can't possibly mean (in such a context) is
counterproductive to their understanding. That's not good.
I don't think I have suggested at any time in this thread telling anyone
that a term "can only mean something that it cannot possibly mean".
Quite the contrary - I suggest the confusion would be greatly reduced if
people used the terms that meant exactly what they do mean, rather than
expecting others to pick up what is actually meant by the misuse of
another term, often from knowledge that they do not yet possess!
 
Wayne Fulton said:

Kennedy, check out
http://www.nikonusa.com/template.php?cat=1&grp=98&productNr=9238
for just one of many. I could post a jillion links, but you already
know it. This is pretty much universal usage. The definition of
dpi in such usage is "pixels per inch", if about images. Always has
been, years and years.

On the "Key features" they (erroneously) mention "4000 dpi true
optical resolution" while at the "Tech specs" tab they mention
"Optical resolution: Up
to 4,000 pixels per inch". So it rather proofs Kennedy's point; The
industry gives off (inexcusable) mixed messages, often wrong ...

There's no excuse for adding to the spread of misinformation, so let's
call things by their (proper) name. Dots are only appropriate when
describing ink spots/droplets on an output medium like in print, and
there are usually multiple dots placed per pixel in an attempt to
create intermediate (ink) colors.
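
To put rough numbers on that dots-versus-pixels point, here is a minimal
Python sketch; the 1440 dpi and 300 ppi figures are illustrative
assumptions, not any particular printer's specification:

# Rough arithmetic: how many inkjet droplet positions fall within one
# image pixel (assumed, illustrative figures).
printer_dpi = 1440   # assumed droplet placement resolution, dots per inch
image_ppi = 300      # assumed image resolution sent to the printer, pixels per inch

dots_per_pixel_axis = printer_dpi / image_ppi    # droplet positions across one pixel
dots_per_pixel_area = dots_per_pixel_axis ** 2   # droplet positions within one pixel's area

print(f"{dots_per_pixel_axis:.1f} dot positions per pixel along each axis")
print(f"roughly {dots_per_pixel_area:.0f} dot positions available per pixel")
# ~4.8 per axis, ~23 per pixel: those droplets are dithered to approximate
# the pixel's colour, which is why a dot is not a pixel.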

Bart
 
If that's all they know, they're still in trouble. What is actually needed is
a few more words to give a little understanding about how things work.

No. What is needed is to teach them the correct terminology and why it
is wrong to conflate dpi with ppi.
We are
still more likely to see image resolution referred to as dpi than ppi, so we
need to know to expect that, and what it means. If we understand, then
there's no problem.

Just because people do it wrong, doesn't mean we should replicate
their mistakes.

What happens when you mix up terminology? Well, look for example at
the Mars Climate Orbiter, the NASA space probe that was lost because
one team worked in imperial units while the other assumed metric....
 
Sorry about my bad threading here Kennedy, it's the best I can do this time.
My ISP's news server has stopped receiving new messages again, even my own, but
it still seems to post. This is in reply to your next following message in
this thread, the one with the bizarre contaminated water "analogy". If you
like analogies, Esperanto perhaps had advantages too, but it didn't really
change anything. I think it also suffered from resistance to change.

It is simply indisputable that dpi is the general term in widest use, for
example 90% of the 2+ million Google links that also mention the word image.
Just look, it seems obvious proof of the real world. Some photo editor
software is a recent exception (but some of it says dpi too). Same with
scanning software, some exists both ways. Those users with only one program
ought to get out more. :) All the published scanner specifications prefer to
state their ratings using the long-accepted dpi term (OK, except for the current
Nikon brand models). Even you yourself always spoke in terms of scanning at
X dpi, up until you changed your mind a couple of years ago (according to
Google groups). That's your privilege, but it didn't affect the rest of the
world as much as you might presume.

So while ppi is sometimes used recently, the obvious real world is that the
term dpi has an order of magnitude greater general usage than ppi, simply
because that is the long-established definition, time honored. I don't know if
that will last, but it is true today, and for all of history. I know you
know this, so I assume you just enjoy getting in for the fight. It gets
wearisome to me. Perhaps you just envision a better world, what ought to be, but
it is also good to recognize what is now. Anyway, we are only repeating
ourselves now; there seems little point to this.

Your argument incorrectly assumes I advocate use of dpi. But I really don't
care which is used, and it is not my place to decide it anyway. Dpi does have
the right sound to me, just because it has always been the correct term, but
I mentioned that my book tries to use ppi throughout, as ppi does have
advantages. That seems unbiased to me. But either is fine to me personally.

Instead my concern is about beginners encountering the rest of the world's
usage. Since two terms do exist with the same meaning, with dpi greatly more
prevalent than ppi (like it or not), and since dpi has a second different
usage too, then it is in fact a serious issue, and I do think beginners ought
to be advised about what the real world is like, and about what they can
expect to see, and most importantly, will be expected to understand. We may
use any term we prefer, but we must understand it both ways. Beginners really
need to know about dpi, because it has such a wide lead in usage. It is easy to
understand according to context, but if we are told wrongly that dpi can only
mean ink dots, that really leaves the real world as a bewildering place.
Hiding our heads in the sand and denying it won't fix any problem. We simply need to
explain the real world, as it actually exists, accurately.
 
For someone who puts out "scantips" (and charges for a book), don't you
think you bear the responsibility to clarify the terminology instead of
propagating the confusion? Speaking of propagating confusion, shouldn't
you try to explain the difference between a scanner's hardware and software, and
what raw scanning is?

It seems like your "scantips" are intended for the masses who are
willing to accept the terminology confusions and you are willing to lead
them down the wrong path.
 
Wayne Fulton wrote:
....
Instead my concern is about beginners encountering the rest of the world's
usage. Since two terms do exist with the same meaning, with dpi greatly more
prevalent than ppi (like it or not), and since dpi has a second different
usage too, then it is in fact a serious issue, and I do think beginners ought
to be advised about what the real world is like, and about what they can
expect to see, and most importantly, will be expected to understand. We may
use any term we prefer, but we must understand it both ways. Beginners really
need to know about dpi, because it has such a wide lead in usage. It is easy to
understand according to context, but if we are told wrongly that dpi can only
mean ink dots, that really leaves the real world as a bewildering place.
Hiding our heads in the sand and denying it won't fix any problem. We simply need to
explain the real world, as it actually exists, accurately.

But what happens when these beginners start meeting the correct
terminology? Why can't they be taught what is correct, and that many
people get it wrong? The fact that 99% get it wrong does not make it right.

A quote from Epson:
"Image resolution refers to the number of pixels per unit of measure in
the digital image, commonly expressed in pixels per inch (ppi). This
should not be confused with dots per inch (dpi), which is a measurement
of output resolution on a printer."

Now why can't they all learn that?
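
A short worked example of the Epson definition above (the 4000 ppi scan
and 300 ppi print figures are assumptions for illustration only):

# Image resolution (ppi) relates pixel dimensions to physical size.
scan_ppi = 4000                 # assumed scanner setting, pixels per inch
frame_w_in = 36 / 25.4          # a 35mm frame is roughly 36 x 24 mm
frame_h_in = 24 / 25.4

pixels_w = round(frame_w_in * scan_ppi)   # about 5669 pixels
pixels_h = round(frame_h_in * scan_ppi)   # about 3780 pixels

print_ppi = 300                 # assumed target print resolution
print(f"scan: {pixels_w} x {pixels_h} pixels")
print(f"print size at {print_ppi} ppi: "
      f"{pixels_w / print_ppi:.1f} x {pixels_h / print_ppi:.1f} inches")
# about 18.9 x 12.6 inches; the printer's dpi rating never enters this calculation.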

Steve
 
Wayne Fulton said:
Sorry about my bad threading here Kennedy, it's the best I can do this time.
My ISP's news server has stopped receiving new messages again, even my own, but
it still seems to post. This is in reply to your next following message in
this thread, the one with the bizarre contaminated water "analogy". If you
like analogies, Esperanto perhaps had advantages too, but it didn't really
change anything. I think it also suffered from resistance to change.
Esperanto meant learning something that was unnecessary. Learning the
difference between dots and pixels *is* necessary.
It is simply indisputable that dpi is the general term in widest use, for
example 90% of the 2+ million Google links that also mention the word image.
Just look, it seems obvious proof of the real world.

I can't believe that you don't understand that this statistic is
irrelevant to your case and recycles your argument up its own back end,
but your continued repetition of it is slowly convincing me that this
may indeed be the only explanation.
Even you yourself always spoke in terms of scanning at
X dpi, up until you changed your mind a couple of years ago (according to
Google groups).

As I mentioned earlier, you'll find even more recent archives where I
have used the term in that context - I don't recall claiming that I was
perfect. However I regularly used the terms pixels per inch and samples
per inch for many years before I ever ventured onto Usenet. I won't
presume to say that I used them before Usenet existed, but certainly
long before I ever had the facilities to access it - and that is a long
time ago.
I know you
know this, so I assume you just enjoy getting in for the fight. It gets
wearisome to me. Perhaps you just envision a better world, what ought to be, but
it is also good to recognize what is now.

I am not in it just for the fight, but it is true that perhaps I do
envisage a better world - and one way of achieving it, albeit slowly, is
not to encourage the use of the wrong terms. Whilst searching Google
archives you might have noted how long ago it was that I suggested that
scanner manufacturers stop claiming any figure for resolution
(whether ppi or dpi) that was clearly well beyond the true resolution
capability of the scanner. How many scanners on the market today
actually achieve what they advertise? You can count the number of
flatbeds in that category on the fingers of one hand - and that could
have had a serious accident with a machete!
Your argument incorrectly assumes I advocate use of dpi. But I really don't
care which is used, and it is not my place to decide it anyway.

That is exactly the problem - it is your place, just as it is mine (and
everyone else who is aware of the distinction) to use and encourage the
use of the correct terminology to avoid confusion in new users (and some
who are not so new). That way you won't be repeating the same old
replies to confused users and more people will truly understand the
issues involved in both scanning and printing.
Dpi does have
the right sound to me

Fresh salmon has the right sound to me - until someone tags an "ella" on
the end. Dpi, in its place, is perfectly correct but not when it has
"scanner" tagged to the front of it. ;-)
 
Wayne said:
Dpi does have
the right sound to me, just because it has always been the correct term, but
I mentioned that my book tries to use ppi throughout, as ppi does have
advantages. That seems unbiased to me. But either is fine to me personally.

Instead my concern is about beginners encountering the rest of the world's
usage. Since two terms do exist with the same meaning, with dpi greatly more
prevalent than ppi (like it or not), and since dpi has a second different
usage too, then it is in fact a serious issue, and I do think beginners ought
to be advised about what the real world is like

Not if the real world is confusing, IMO. Just hoping you will give in
because of the large number of protests, here in this newsgroup, against
your decision to support and spread the use of the term dpi where ppi
would explain a lot (especially to newbies), I'm adding my protest to
this long list of replies:-)

I think that even for a novice it is very easy to understand the
difference between a dot and a pixel. Of course you always have to add
that many salesmen use the term dpi where ppi should be used, but that's
not difficult to grasp either. If we had to adapt ourselves to
the language salesmen use, then we might just as well compare
scanners by referring to their interpolated resolutions;-)
Hey, I think I should have bought one of those cheap 19,200 dpi flatbed
scanners. The specs are far better than any dedicated filmscanner, and
it can also replace my inkjet printer!

Anyway, Wayne, since all over the world people are referring to your web
pages and your book, I think you should have second thoughts on what to
teach ...
 
Toby said:
AFAIK it's common knowledge that dpi and ppi are two different animals, and
that using dpi when you mean ppi is a mistake, albeit a common one. The term
dpi originated when halftone screens produced discrete dots on the output
media.

Often true, particularly perhaps for scanners where "dots" may not
make much sense. However, even for printers, they *may* mean the
same thing such as with a dyesub printer where dots are also image
pixels. I still have my dyesub (although replaced in usage last
year by a new Canon i9900).

Dots per inch is also often used for screen resolutions, where in
common usage (especially for monochrome screens) the two mean the
same thing (or equivalent things) as well.

Mike
 
AFAIK it's common knowledge that dpi and ppi are two different animals, and
that using dpi when you mean ppi is a mistake, albeit a common one. The term
dpi originated when halftone screens produced discrete dots on the output
media. In inkjet printers it has come to mean those tiny little squirts of
ink. As Kennedy says they do not correlate with any exact pixel, and there
may be quite a few dots per given pixel, which is why you would not be very
happy with a printer with 300 dpi resolution, but you are quite happy with a
300 ppi scan.
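
One way to see why those two figures of 300 are not comparable is the
usual halftone-cell approximation; a minimal sketch with assumed numbers
(the 8x8 dither cell is an illustrative choice, not any real printer's
screening):

# Why a 300 dpi binary printer is coarse while a 300 ppi scan is fine
# (assumed, illustrative figures).
printer_dpi = 300    # binary dots per inch the printer can place
cell = 8             # assumed 8x8 dither cell used to build each tone patch

tone_levels = cell * cell + 1          # about 65 grey levels per cell
effective_ppi = printer_dpi / cell     # continuous-tone patches per inch

print(f"{tone_levels} tone levels, but only about {effective_ppi:.0f} tone patches per inch")
# A 300 ppi scan, by contrast, delivers 300 full-colour pixels per inch.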

I used the search term "ppi vs dpi" and got 3510 hits...

Toby
 
Just hoping you will give in
because of the large number of protests, here in this newsgroup, against
your decision to support and spread the use of the term dpi where ppi
would explain a lot (especially to newbies), I'm adding my protest to
this long list of replies:-)


Sorry guys, I really can't solve your problem. It is much larger than me.

As to giving in, I did update my book 2.5 years ago to change dpi to ppi
everywhere appropriate. Seemed like a good thing to do, there are advantages.
But I'm not going to also lie about it, and deny what is, because we must
understand current usage too. The more we know and understand, the better.
 