Yes, it does.
There are a number of issues, but from what I've seen, #1 is that most
analog cables -- even those that come with the monitor -- are not
impedance matched and introduce ringing ("ghosts") around sharp
transitions. It's most noticeable on small text; it's not noticeable
at all (unless it's really bad) on TV, movies, or games. It's very
subtle and many people won't notice it, but I'm in the display
industry and I see it quite often.
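To put rough numbers on it -- this is just my own back-of-the-envelope
sketch, not a measurement; the impedances, cable length, and video mode
below are assumptions -- a ghost needs a mismatch at both ends of the
cable: the edge bounces off the monitor input, then off the card
output, and arrives back one cable round-trip late:

    # Illustrative only: assumed impedances, cable length, and dot clock.
    Z0 = 75.0            # ohms: nominal impedance of VGA coax
    Z_MONITOR = 60.0     # ohms: assumed mismatched termination at the monitor
    Z_CARD = 90.0        # ohms: assumed mismatched source at the video card
    CABLE_M = 2.0        # metres of cable (assumed)
    VELOCITY = 2.0e8     # m/s: roughly 0.66c signal speed in typical coax
    DOT_CLOCK_HZ = 108e6 # VESA 1280x1024 @ 60 Hz pixel clock

    def gamma(z_term: float, z_line: float) -> float:
        """Reflection coefficient at a termination."""
        return (z_term - z_line) / (z_term + z_line)

    # Ghost amplitude: one bounce at each end of the cable.
    ghost_amplitude = gamma(Z_MONITOR, Z0) * gamma(Z_CARD, Z0)
    # Ghost position: one cable round-trip, measured in pixel periods.
    round_trip_s = 2 * CABLE_M / VELOCITY
    offset_pixels = round_trip_s * DOT_CLOCK_HZ

    print(f"ghost amplitude: {ghost_amplitude * 100:.1f}% of the edge")
    print(f"ghost offset:    {offset_pixels:.1f} pixels after the edge")

With those assumed numbers the ghost lands a couple of pixels after
the edge at roughly 1% of its amplitude -- faint, but exactly the kind
of thing that smears one-pixel-wide text strokes.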
The #2 problem is time base accuracy and stability -- the analog
monitor doesn't sample the pixel at exactly the moment that the video
card "sends" it. [This used to be the #1 problem, but between monitors
getting better and increased display resolution (more 1280x1024 than
1024x768), I'd say it's now #2.] You can diagnose this very quickly
(and often adjust to eliminate it) by putting up a test pattern of
alternating black and white vertical bars one single pixel wide; you
should see a perfect reproduction, with zero moire present. The key is
perfect adjustment of the dot clock frequency and phase. Note, however,
that on the majority of monitors, the "Auto" function doesn't produce
an exactly correct adjustment. Close, in many cases, but not exact.
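If you want to generate that test pattern yourself, something like the
following works (a quick Python/Pillow sketch of my own; the 1280x1024
resolution is just an assumption -- use your display's native mode and
show the image full-screen, unscaled):

    from PIL import Image

    WIDTH, HEIGHT = 1280, 1024   # assumed native resolution

    # Alternating black and white vertical bars, each one pixel wide.
    img = Image.new("RGB", (WIDTH, HEIGHT))  # starts all black
    pixels = img.load()
    for x in range(1, WIDTH, 2):             # paint every other column white
        for y in range(HEIGHT):
            pixels[x, y] = (255, 255, 255)

    img.save("dotclock_test.png")

On a correctly adjusted analog connection (or any working DVI
connection) every bar is crisp and uniform; any moire, shimmer, or
soft columns means the dot clock or phase is off.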
DVI simply eliminates the issues that cause both forms of distortion.
With an analog monitor, maybe it's OK, maybe it's not (and it's
usually not without test pattern adjustment). With DVI, assuming that
the display works, there is no distortion from either cable issues or
dot clock mismatch with the video card. In those regards, it's perfect.