Benjamin Gawert said:
That would come as a surprise to the people (including myself)
who worked on the DVI spec. DVI does NOT include a parity
bit, checksum, or other error-detection/correction functionality
within the data stream, nor is there any provision for error handling
on the link. In short, should the signal degrade to the point where
errors are being introduced into the data (but the receivers can still
maintain synchronization and data decoding), there is no way for
the receiver to detect this.
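To make that concrete, a hypothetical sketch (the `parity` helper below is my illustration of what an error-detecting link *could* carry; it is not anything in the DVI spec): a link with even a single parity bit per byte could flag single-bit errors, but a bare pixel stream gives the receiver nothing to check.

```python
def parity(byte):
    """Even-parity bit: the kind of check an error-detecting link could append."""
    return bin(byte).count("1") % 2

sent = 0b10110100
received = sent ^ 0b00000100          # one bit flipped in transit

# With a parity bit, the receiver could notice the flip...
assert parity(received) != parity(sent)

# ...but a bare pixel stream has no such check: `received` is simply
# another legal 8-bit value, delivered as if it were correct.
assert 0 <= received <= 255
```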
However, it should be noted that display applications are generally
pretty forgiving when it comes to the occasional error, whether that
comes in via analog OR digital encoding of the video.
> "Bit flipping," as you call it, doesn't happen over cables. What
> happens, though, is that there are various types of influences
> (crosstalk, wave effects, reflections, radiated interference, etc.)
> that act on cables (and any other kind of signal transmission). One
> of the advantages of digital transmission over analog transmission,
> however, is that digital transmission is far more robust against
> these influences than analog transmission.
Yes and no. As I've said here before, there is really NO difference
between "analog" and "digital" when it comes to robustness in the
presence of noise IF THESE ARE COMPARED AT COMPARABLE DATA RATES.
There's simply no way to get around the Gospel According To St.
Shannon, which sets an absolute and unavoidable limit on the amount
of useful information you can get through any physical channel in the
presence of a given amount of noise. What most people mean by a
"digital" transmission - wherein a given physical line carries a
serial stream of amplitude-encoded binary - is simply an example of
trading off data rate for noise margin, and that's all it is.
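Shannon's channel-capacity theorem puts that trade-off in numbers: C = B log2(1 + S/N). A minimal sketch (the bandwidth and SNR figures below are illustrative only, not actual DVI link parameters):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Maximum error-free data rate (bits/s) of an ideal channel:
    C = B * log2(1 + S/N).  No encoding scheme, 'digital' or
    otherwise, can exceed this for a given bandwidth and noise level.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: the same 1 GHz channel supports a high rate
# at high SNR (short, clean cable)...
hi = shannon_capacity(1e9, 1000)   # ~30 dB SNR
# ...and a much lower rate once noise grows (longer run, crosstalk):
lo = shannon_capacity(1e9, 10)     # ~10 dB SNR
```

Binary signalling sits well below the curve on a clean link; that unused margin is exactly what buys the constant-quality region the next paragraph describes.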
> Since, unlike with analog transmission, the data integrity remains
> constant over a certain range of noise levels, the image quality
> also remains constant, no matter whether the cable is, say, 1 m or
> 5 m, or whether there is a KVM switch in the line or not.
However, additional "noise" (meaning anything in the signal which
is not "signal," including all forms of noise, distortion, etc.) is
unavoidable over longer cable runs. All else being equal, you WILL
get a higher error rate with longer cables, and that's unavoidable.
> Your very basic notion of degradation through "flipping bits"
> doesn't really fit DVI, because the TMDS signalling used in
> PanelLink communications (the technology used in DVI) is more than
> the plain transmission of a few bits. PanelLink uses quite complex
> data words with ECC schemes, which makes it even more robust.
Nope. PanelLink(TM) uses a proprietary 8-to-10-bit encoding
scheme whose primary functions are to DC balance the line and
minimize the number of transitions (and therefore signal-induced
noise) in the resulting data stream. It most definitely does NOT
add anything in the way of robustness in terms of the error rate.
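For reference, here is a sketch of that 8-to-10-bit encoding as described in the DVI 1.0 spec (the Python rendering and bit-list representation are mine). Note what it does NOT do: all 256 input bytes map to distinct symbols with no redundancy left over for error checking, so a flipped wire bit simply decodes to a different, equally "valid" byte.

```python
def tmds_encode(d, cnt):
    """Encode one byte into a 10-bit TMDS symbol (per DVI 1.0, sec. 3).

    Stage 1 picks an XOR or XNOR chain to minimize transitions;
    stage 2 optionally inverts the symbol to keep the running
    disparity `cnt` near zero (DC balance).  Returns (symbol, new
    cnt); symbol is a list of bits, LSB first, with bit 8 flagging
    XOR vs. XNOR and bit 9 flagging inversion.
    """
    D = [(d >> i) & 1 for i in range(8)]
    n1 = sum(D)
    q_m = [D[0]]
    if n1 > 4 or (n1 == 4 and D[0] == 0):
        for i in range(1, 8):
            q_m.append(1 - (q_m[i - 1] ^ D[i]))   # XNOR chain
        q_m.append(0)
    else:
        for i in range(1, 8):
            q_m.append(q_m[i - 1] ^ D[i])         # XOR chain
        q_m.append(1)
    n1m = sum(q_m[:8])
    n0m = 8 - n1m
    if cnt == 0 or n1m == n0m:
        q_out = ([1 - b for b in q_m[:8]] if q_m[8] == 0 else q_m[:8])
        q_out += [q_m[8], 1 - q_m[8]]
        cnt += (n0m - n1m) if q_m[8] == 0 else (n1m - n0m)
    elif (cnt > 0 and n1m > n0m) or (cnt < 0 and n0m > n1m):
        q_out = [1 - b for b in q_m[:8]] + [q_m[8], 1]
        cnt += 2 * q_m[8] + (n0m - n1m)
    else:
        q_out = q_m[:8] + [q_m[8], 0]
        cnt += -2 * (1 - q_m[8]) + (n1m - n0m)
    return q_out, cnt


def tmds_decode(q):
    """Invert the encoding: recover the original byte from a symbol.

    There is nothing here resembling a parity or ECC check - every
    10-bit input the chain produces decodes to *some* byte.
    """
    bits = [1 - b for b in q[:8]] if q[9] else list(q[:8])
    D = [bits[0]]
    for i in range(1, 8):
        x = bits[i] ^ bits[i - 1]
        D.append(x if q[8] else 1 - x)
    return sum(b << i for i, b in enumerate(D))
```

Every byte round-trips exactly, and the running disparity stays bounded, which is the whole job of the code: fewer transitions and DC balance, not error resistance.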
Bob M.