Quote from Don:
You haven't provided *any* objective data (measurements, analysis, etc)
to support this assertion. Therefore, it's a subjective feeling. You're
entitled to that, of course, but that has no relevance whatsoever to an
objective evaluation regarding quality.
I think what you're still failing to grasp is the difference between a
subjective preference ("I like it" or "looks good to me") and
objective fact (statistics, measurements, data analysis, etc.)
{end quote}
And Don hasn't provided any evidence to suggest that using objective
criteria is preferable to using subjective comparisons when it comes to
evaluating the quality of prints! Do you use a ruler to compare
paintings? Microscopes?
What objective tests have you done of VS against other software
programs in a controlled setting?
The most important thing is whether the subjective conclusions drawn
are informed by facts, evidence and testing, by the quality of the
criteria used, and by the willingness of the person making them to
evaluate the evidence fairly. My conclusions are based on my own
testing. I am not an expert in the technical side of photography, but
I can appreciate differences in the quality of images.
I have extensively tested Vuescan with my slides, negatives and B&W
negs, with and without a scanhancer, to evaluate graininess. I've also
compared prints made from VS scans against prints from my digital
mini-lab, and see no quality improvement in letting them do the
scanning and processing. The same goes for my friend's 20D - it's good
to run tests like these from time to time to keep pushing for the best
possible scans and prints.
I participated in this year's scanner bake-off and my results came out
fairly well for this model of hardware. The test used a mixture of
subjective and objective criteria, with an identical slide scanned by
each participant and judged by a third party. After looking at the
other entries and getting a better monitor, I have since improved my
post-processing of the scan.
http://www.jamesphotography.ca/bakeoff2005/results.html
The prior Scanner Bake-off (which I did not participate in) was a
weaker test in that it was limited to objective criteria (MTF, CA...)
which only give you a partial understanding of scan quality if the goal
is to figure out which scanner/software combos can produce the best
*results*.
http://www.jamesphotography.ca/bakeoff2004/scanner_test_results.html
As you'll see, Vuescan-driven scanners did perfectly well.
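For anyone curious what a number like MTF actually boils down to,
here's a rough numpy sketch of the usual edge method. Everything in it
is my own assumption - it is not how the bake-off was scored, and the
filename is made up. You crop a straight, near-vertical dark/light
edge from a scan, average it into an edge profile, differentiate, and
take the FFT; the whole scan collapses into one curve, which is
exactly why I call measurements like this a partial picture.

import numpy as np
from PIL import Image

def rough_mtf(edge_crop_path, dpi=4000):
    # Grayscale crop containing one straight, near-vertical edge.
    img = np.asarray(Image.open(edge_crop_path).convert("L"),
                     dtype=float)
    esf = img.mean(axis=0)          # edge spread function (row average)
    lsf = np.diff(esf)              # line spread function (ESF derivative)
    lsf /= lsf.sum() or 1.0         # normalize so zero-frequency response is 1
    mtf = np.abs(np.fft.rfft(lsf))  # modulation transfer function (|FFT| of LSF)
    freqs = np.fft.rfftfreq(lsf.size) * dpi / 25.4   # cycles per mm
    return freqs, mtf

freqs, mtf = rough_mtf("edge_crop.tif")     # hypothetical file
mtf50 = freqs[np.argmax(mtf < 0.5)]         # frequency where contrast hits 50%
print("MTF50 roughly %.1f cycles/mm" % mtf50)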
It is clear to me that if the question is "can I get good prints using
Vuescan?", the more meaningful test is a *subjective* one which
compares results from different platforms, not simply objective
measurements.
So for recent testing:
This weekend I went back and tested Filmget's cleaning (Canon TWAIN
driver) on a problem negative that wasn't dusty but seemed to have
pits in it. While Filmget's cleaning generally looked better than VS's
(no VS-esque blurry blobs, and more spots were cleaned), it produced
some very strange, overly smooth artifacts with repeating patterns
near the corners of the picture which looked like digital camera noise
reduction. I'm happy to post crops if you are curious.
This *subjective* comparison used the same frame under identical
conditions, with Filmget run with IR cleaning both off and on to
verify that the cleaning was the cause. I was evaluating the picture
at 100% of a 4000dpi scan. VS didn't clean many of the pits in the
negative, but it didn't leave behind any odd smoothed areas either; in
my judgment that is preferable, since I can clone out white spots but
can't do much if the apparent grain structure is changed throughout
the image. Someone else might subjectively prefer a clean center even
given artifacts in the corners. Also, Filmget set an overly aggressive
black point which had to be corrected in Photoshop.
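For the curious, here is roughly the kind of check behind that
judgment, sketched with numpy/PIL. The filenames and the threshold are
made up, and the actual judging was done by eye in Photoshop - the
idea is just to diff the IR-cleaning-off and IR-cleaning-on scans of
the same frame to see where the cleaning touched pixels, and to look
at the shadow end for black-point clipping.

import numpy as np
from PIL import Image

def load_gray(path):
    return np.asarray(Image.open(path).convert("L"), dtype=float)

raw   = load_gray("filmget_ir_off.tif")  # same frame, identical settings
clean = load_gray("filmget_ir_on.tif")

# Where did the cleaning change pixels, and by how much?
diff = np.abs(clean - raw)
changed = diff > 8                       # arbitrary "visibly edited" cutoff
print("pixels altered by cleaning: %.2f%%" % (100.0 * changed.mean()))

# Are the edits clustered near a corner rather than spread evenly?
h, w = changed.shape
th, tw = h // 4, w // 4
corner = changed[:th, :tw].mean()
center = changed[(h - th) // 2:(h + th) // 2,
                 (w - tw) // 2:(w + tw) // 2].mean()
print("altered in corner tile: %.2f%%, center tile: %.2f%%"
      % (100.0 * corner, 100.0 * center))

# Black-point check: how much of the cleaned scan is crushed to black?
print("pixels at level 0: %.2f%%" % (100.0 * (clean == 0.0).mean()))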
This is my subjective opinion, informed by evidence and analysis. It
is an opinion, which I'd argue is preferable to an unanalyzed pile of
"objective" data. Data analysis is itself subjective, not objective:
expert interpretations of the same data can and will lead to different
conclusions. See the disagreement between NOAA and Virginia Tech
scientists (among others) over hurricanes and global warming, for
example.
http://www.usatoday.com/weather/climate/2005-09-15-globalwarming-hurricanes_x.htm
Anyway, results with Vuescan are not universal, since some scanners
are more (and less) compatible with it than others, so any blanket
assertion about its scan quality is unsupportable.