titus12 said:
I am about to buy an LCD monitor. Is a 5ms video response time OK for
today's games? Can you still get the same HD video quality on a CRT
monitor with one of the latest video cards with dual DVI connectors,
by using an adapter?
Thank you;
David
There is a primer here on LCD monitors.
http://www.xbitlabs.com/articles/other/display/lcd-parameters.html
"You could have read in old reviews of LCD monitors (and in my reviews,
too, as I have to confess) that as soon as their response time (the real
response time as opposed to the specified value which, if measured
according to ISO 13406-2, is not in fact indicative of the real speed)
was lowered to 2-4 milliseconds, we would forget about it altogether
just because its further decrease wouldn’t make anything better –
we wouldn’t see the fuzziness anyway."
What he is saying is that the speed measurement technique currently used
for LCD monitors is not indicative of their real speed. The real speed
is probably a lot slower than the quoted number (perhaps around 25ms).
Your options for judging a monitor are to visit a big box electronics
store and look at the model you want to buy, or, if the monitor is sold
on Newegg.com, to read the customer reviews; if the monitor is deficient
in design, someone will soon point it out.
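To put the response-time numbers in perspective, here is a rough
arithmetic sketch comparing response time to the time one frame stays on
screen. The 25ms figure is the guessed "real" response mentioned above,
not a measured value:

```python
# Rough comparison of LCD pixel response time vs. frame time.
# frame_time_ms() is just the reciprocal of the refresh rate.

def frame_time_ms(refresh_hz):
    """Time each frame is displayed, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 85):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")

# A quoted 5 ms response fits easily inside the 16.7 ms frame at 60 Hz,
# but a real response nearer 25 ms would smear across more than one frame:
print(25 / frame_time_ms(60))  # frames a 25 ms transition spans at 60 Hz
```

That is why the quoted spec alone doesn't tell you whether you will see
ghosting in games; only the real transition time does.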
There is some background here on DVI.
http://en.wikipedia.org/wiki/Dvi
If a video card has DVI-I connectors (and most of them are of that type),
there are both analog and digital signals on the connector. By using a
"VGA dongle", it is possible to extract the analog signals and put them
into the familiar 15 pin VGA connector format. In the picture here,
for example, two "VGA dongles" are included with this dual DVI-I video
card. That would allow two VGA monitors to be connected (CRT or LCD).
[Note that not all LCD monitors have VGA inputs any more, which I
consider a risky trend. If an LCD has only a DVI-D input, then you
cannot use it with an old, VGA-only computer.]
http://images10.newegg.com/NeweggImage/productimage/14-130-092-06.jpg
Modern video cards have a DAC output bandwidth of 400MHz, and a claimed
maximum analog output resolution of 2048x1536 or so (quoting from memory).
You can get the same quality if:
1) The DAC really has the stated bandwidth (i.e. the chip design was not
secretly defective, and no manufacturing problem cropped up).
2) The interference filters placed on the video card, near the connector,
have not compromised the bandwidth and made the picture fuzzy. In past
years, some people used to fix this by removing the interference
filters.
3) The connector scheme (dongles, PCB connector design) is not causing
reflections in the 75 ohm signal transmission environment. In other
words, for perfect analog signal transmission, the wires and connectors
have to be well designed. The best way to achieve this is with video
devices that have coaxial interfaces (like five BNC connectors). VGA
connectors are a compromise in that respect, at the best of times.
4) Length and quality of cable. You won't be able to recognize a
2048x1536 picture at the end of 100 feet of cable, even if the
cable is "low loss". At that resolution, the cable has to be short.
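A back-of-the-envelope pixel clock calculation shows why 2048x1536 sits
near the limit of a 400MHz DAC. The 25% blanking overhead below is a
typical GTF-style assumption, not an exact timing from any standard:

```python
# Approximate analog pixel clock for a video mode, to compare against
# the 400 MHz DAC bandwidth figure quoted above.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock in MHz, including blanking intervals."""
    active_pixels_per_sec = width * height * refresh_hz
    return active_pixels_per_sec * (1 + blanking_overhead) / 1e6

clk = pixel_clock_mhz(2048, 1536, 60)
print(f"2048x1536 @ 60 Hz needs roughly {clk:.0f} MHz")
# Comfortably under 400 MHz, which is consistent with 2048x1536 being
# about the top of the claimed analog resolution range.
```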
DVI has the advantage that cable imperfections are not all immediately
visible. Up to the point where the signal fails to be recognized, the
picture will be perfect. When the signal is about to drop out due to
transmission quality issues (a bad DVI cable, too long a DVI cable, or
for certain cards, bad DVI driver design), you will see colored snow on
the screen. So in that sense DVI is superior, as in most situations the
available signal is enough for a perfect picture.
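The digital side has a hard ceiling rather than gradual fuzziness: the
DVI spec gives single-link DVI a 165MHz maximum pixel clock, and a mode
either fits or it doesn't. This sketch checks a mode against that limit,
reusing the same rough 25% blanking assumption as above:

```python
# Does a given video mode fit within single-link DVI's pixel clock limit?
# 165 MHz is the single-link maximum from the DVI 1.0 spec; the blanking
# overhead is an approximation, not an exact modeline.

SINGLE_LINK_MAX_MHZ = 165

def fits_single_link(width, height, refresh_hz, blanking_overhead=0.25):
    """True if the mode's approximate pixel clock fits single-link DVI."""
    clk_mhz = width * height * refresh_hz * (1 + blanking_overhead) / 1e6
    return clk_mhz <= SINGLE_LINK_MAX_MHZ

print(fits_single_link(1600, 1200, 60))  # -> True  (~144 MHz)
print(fits_single_link(2048, 1536, 60))  # -> False (~236 MHz, dual link)
```

Modes above the limit need dual-link DVI, which adds a second set of
TMDS data pairs on the same connector.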
So, I would say that "analog works as well as it ever did". The
fact that it comes through a DVI-I connector and needs a DVI-I to VGA
dongle will not affect the issues that have always plagued VGA.
You'd have had the same problems using a 2048x1536 CRT five years ago
as you would now. DAC bandwidth of 400MHz has been available for
quite some time.
Paul