J. Clarke
Leythos said: I think this about sums it up for us - what you consider
quality I consider average. What I consider sharp/clear has many levels
of being sharp/clear - meaning that something can be clear and sharp
without being the clearest or sharpest. If you had seen enough
monitors/LCDs you would know that there are acceptable levels of clarity
and sharpness and then there are ideal levels. You seem to think that if
an image is sharp and clear it can't be improved upon, which is wrong.
The level of clarity and sharpness at the highest resolution on an LCD
panel is going to be LESS than that of a CRT of comparable quality at the
same resolution/image.
Repeating this over and over again does not make it so.
This isn't something that's wrong with the LCD, it's the
nature of the technology, but it will get better as the years go by.
It's the nature of CRT technology that it is physically impossible for it
to produce an image as sharp as that of an LCD running at its native
resolution: at native resolution the LCD drives each fixed pixel directly,
while a CRT's beam spot and shadow mask inevitably spread each pixel into
its neighbors. You can claim otherwise all you want, but the fact is that
you are just plain wrong.
I've seen many LCD panels that have sharp and clear text/images at their
highest resolution, but the clarity of most average CRTs exceeds that of
those same LCDs.
I see. So now you pull a new term out of your ass, "clarity". Define
"clarity" in numbers.
Sure, there are exceptions, there are some very nice LCD
monitors with very clear/sharp images, but if you compare them to very
nice CRTs the CRT wins every time.
Wrong. The CRT _loses_ every time on sharpness at the design resolution of
the LCD.
The OP asked about monitor selection for gaming with a $1000 budget. In
my experience playing Counter Strike, Doom III, and Unreal over the
years, and that of others I know whose kids also play those types of
games, a mid-level CRT ($500) would be more than sufficient and give
better image quality than any of the $800-$1000 LCD panels.
You've been playing Doom III "over the years"? Do tell.
For gaming there might be some benefit to the CRT. I've said that before.
It has nothing to do with "better image quality" though, and everything
to do with whether the user is one of those who sees the "ghosting" that
some gamers claim is such a huge problem and others can't see at all,
even when they look for it.
As a second note, if you've got the cash to spend $1000 on a display, it
would be safe to assume that the user is running a very high-end system,
dual CPUs, a very high-end video card, etc... A machine in the $4500
range if built properly
That wouldn't be "built properly", that would be gold-plated. A very
thoroughly loaded game machine can be put together for under $4000 using
"best of everything" components and hitting all the buzzwords.
- why else would anyone waste $1000 on a display
for gaming? A cheap 19" monitor would be more than enough and would allow
the extra savings to purchase more memory, faster drives (even SCSI RAID),
So where do you get SCSI RAID for $500 with any real capacity? Used
drives off eBay, maybe. And what leads you to believe that there would be
any performance benefit in a game machine? Games are seldom I/O-bound.
faster or a second CPU,
Where do you get a second CPU with a decent clock speed for $500? Besides,
you already said that he has dual CPUs.
Beyond that, how many games take advantage of dual processors?
more games, higher-speed internet connection...
How fast can one get for a one-time expenditure of $500? And why would a
gamer want a "higher-speed internet connection" assuming that he already
has broadband? In gaming it's latency, not bandwidth, that is the
limiting factor.
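A quick back-of-envelope sketch makes the point; the update rate and
packet size here are assumed, illustrative figures, not measurements from
any particular game:

    # Back-of-envelope: bandwidth a typical online game actually needs.
    # The figures below are assumptions for illustration only.
    updates_per_second = 60   # assumed server update/tick rate
    bytes_per_update = 200    # assumed size of one state-update packet

    kbit_per_second = updates_per_second * bytes_per_update * 8 / 1000
    print(f"approx. bandwidth needed: {kbit_per_second:.0f} kbit/s")  # ~96 kbit/s

    # Even basic broadband (say 1500 kbit/s) covers that many times over, so
    # a faster pipe buys nothing; the 30-80 ms round trip to the server is
    # what the player actually notices.

The exact numbers vary by game, but the conclusion doesn't: the link is
nowhere near saturated long before latency becomes the problem.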
As for "wasting $1000 for a display for gaming", to the OP it's clearly not
"wasting $1000".