Dear Everyone,
I know that this has been discussed here before, but the key
discussions took place some 3-4 years ago. I'm glad to know that other
people got the idea before me, and some of the answers were actually
informative - but I have further questions to ask...
I've been watching the developments in LCD's and other displays for
some time, and I may have to dump my old 17" CRT someday soon. I'm
wondering what to get next. The LCD's seem to be moving to ever higher
resolutions, but largely at the cost of pixel size. If you want bigger
pixel size, there are two practical ways to get that: a consumer-grade
LCD TV, or a "public display" LCD for an even higher price (with
similar parameters to an LCD TV).
The key question perhaps is about the right pixel size.
I know that a number of technology visionaries have been foaming at the
mouth about hi-res displays and "digital paper" for a few years. Maybe
three years ago I first saw a 15" notebook display running 1600x1200. I
know that
Windows can be set to use larger system fonts or icon sizes. So what
am I fussing about?
Firstly, a number of software programs tend to use fixed-size fonts
and icons. Those are based on proportions from the days of 800x600 or
1024x768 - i.e. about 72 dpi, or a metric pixel size of about 0.35mm.
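For the record, the arithmetic behind such figures is trivial - here's
a little Python sketch of my own, assuming square pixels and the
nominal diagonal (a CRT's visible area is a bit smaller than the tube
size, so its real pitch comes out slightly finer):

  import math

  def dot_pitch_mm(diag_inches, res_x, res_y):
      diag_px = math.hypot(res_x, res_y)   # screen diagonal in pixels
      return diag_inches * 25.4 / diag_px  # mm per pixel

  def dpi(diag_inches, res_x, res_y):
      return 25.4 / dot_pitch_mm(diag_inches, res_x, res_y)

  print(dot_pitch_mm(17, 1024, 768))   # ~0.34 mm
  print(dpi(17, 1024, 768))            # ~75 dpi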
Frankly, I don't see much need for higher resolutions - those
approximately 72 dpi were just about right to get all the needed
information on screen and to watch such a screen on your desktop. The
CRT's electron beam was focused for just about this dot size. Your
distance from the display hasn't changed over the years, so making the
pixel size smaller doesn't make much sense - it doesn't help you on
your job, be it databases/accounting, coding, bitmap or vector
graphics. Based on several years of practical experience and use-cases
(with my human colleagues serving as test dummies), all I can say is
this: if you get a higher-resolution display, you end up gazing at it
from a closer distance, to be able to see the details you need to see.
I've seen several geeks glow with childish joy over the packaging when
their new hi-res LCD arrived, then gradually turn their face to a
picture of despair when they needed to get some work done...
Hell, I don't want to be bending forward, to gaze at my screen
from a 30 cm distance! I much prefer to lean back in my office chair
and make myself comfy!
Sometimes I have to work with my office PC for long hours - the eyes
get weary, the body needs maximum possible sitting comfort, still your
"balance maintaining system" tends to get dizzy... The best thing you
can do is get the sharpest possible display that you can watch from
just the right distance.
Now I'm getting to another point that I'm somewhat unsure about.
Nowadays I still have a 17" CRT. I used to have several such monitors
before. At just the right resolution (about 75 dpi), the pixels tend
to get somewhat hazy. I can't see the sharp corners anymore, because
the CRT smears them (the electron beam gets too thick). What's worse,
after a day's work, my eyes tend to give up focusing on the pixel
corners, and just focus enough to see what's left by the CRT. I know
that my eyes can do better. But they just give up. At the end of the
shift you get up and walk out into the street, and find out that you
have trouble getting your eyes to focus fully again... A good night's
sleep helps, a week of vacation helps even more... but it makes me
wonder if a better display could alleviate the problem.
Does anyone have medical-oriented references related to this issue?
What's better: hazy CRT pixels, tiny LCD pixels with bigger fonts, or
bigger but precisely sharp-cornered pixels that you can tightly focus
on? "Too big and visibly pixelated", but sharp? Which is best for your
eyesight?
I don't consider myself "visually handicapped", I don't wear glasses -
maybe an ophthalmologist would tell me otherwise if I paid a visit for
the first time in my life. I don't believe that glasses would help
me anyway. I know that my wife's eyesight used to be worse than mine
(she wore some thin glasses), before she got her eyes trimmed by laser
surgery - nowadays her eyesight is better than mine. The "computer
display weariness" makes my eyesight noticeably worse, as does a runny
nose etc.
Back to technical topics:
How sharp are the LCD pixels anyway? Can you always see the three RGB
segments, at a close look? Or just a sharp square pixel with some
homogeneous color, in a thin separating grid maybe?
I know that LCD's connected via analog VGA (RGB) input tend to have
the pixels slightly smeared (should be within one pixel's dimension),
or even different RGB components slightly shifted against each other.
All this should be avoided by a proper digital interconnect - be it
TTL, LVDS or the DVI-style signaling (including VESA DFP and HDMI).
So far I've seen a single LCD TV connected to a PC - unfortunately the
only option at that time was a VGA interconnect, and that input on the
TV was *horribly* smeared in the horizontal dimension - essentially a
low-pass filter on the scan line's luminance signal. The smear was
maybe 4 pixels wide. Appalling experience. But again, I guess proper
DVI handles all that.
DVI can be converted to HDMI with a mere conversion cable (pinout
conversion), including the VESA DDC channel (I2C for resolution auto-
discovery etc). Correct me if I'm wrong.
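As I understand it, the auto-discovery data carried over DDC is just a
128-byte EDID block. A rough Python sketch of how the preferred mode is
packed in there, going by the EDID 1.3 layout (the sysfs path in the
comment is only an example from a recent Linux box - it varies per
graphics card and connector):

  def preferred_mode(edid):
      # every EDID block starts with this 8-byte magic header
      assert edid[:8] == b'\x00\xff\xff\xff\xff\xff\xff\x00'
      d = edid[54:72]                       # first 18-byte detailed timing
      pixclk_khz = (d[0] | d[1] << 8) * 10  # stored in units of 10 kHz
      h_active = d[2] | (d[4] & 0xf0) << 4  # upper 4 bits live in byte 4
      v_active = d[5] | (d[7] & 0xf0) << 4  # upper 4 bits live in byte 7
      return h_active, v_active, pixclk_khz

  # e.g.:
  # edid = open('/sys/class/drm/card0-DVI-D-1/edid', 'rb').read()
  # print(preferred_mode(edid))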
Another question: I know that the LCD TV's tend to perform various
sorts of filtering on the TV picture, to improve the color space,
suppress motion artifacts, perform proper resolution conversion/
interpolation, suppress noise, increase video sharpness etc. The
proprietary names for those algorithms vary among consumer TV
manufacturers, but they're all just DSP filters, some of them applying
a FIR filter or something across a group of nearby pixels. I've
watched several brands and models of LCD TV's with a regular TV
signal, and even from a relatively large distance you can see pixels
wobbling around the edges - difficult to say what those were, whether
DVB-T (DCT) compression artifacts, sharpening / noise suppression /
motion compensation artifacts, or what the hell... Now my question is:
if I attach a DVI input from a PC, does this get messed up by all
those filters too, or is it just passed through to the display, maybe
with just a bit of color correction? Does the LCD TV warp/smear edges
on my Windows desktop?
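To illustrate the sort of FIR filtering I mean - a toy 1-D "sharpening"
kernel run along a scan line (the kernel values are made up for
illustration, not anything a real TV uses). On a hard luminance step it
undershoots and overshoots, which is exactly the kind of halo you can
spot from across the room:

  kernel = [-0.25, 1.5, -0.25]   # sums to 1.0, so flat areas pass through

  def fir(line, k):
      r = len(k) // 2
      pad = [line[0]] * r + list(line) + [line[-1]] * r  # clamp the ends
      return [sum(k[j] * pad[i + j] for j in range(len(k)))
              for i in range(len(line))]

  edge = [0, 0, 0, 255, 255, 255]          # a sharp black-to-white step
  print(fir(edge, kernel))
  # -> [0.0, 0.0, -63.75, 318.75, 255.0, 255.0]: ringing around the step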
Next, a question about resolution fitting - how difficult is it to fit
the graphics adapter's output resolution to the LCD TV's screen
resolution, in practice? I'd definitely run the display at its native
resolution. Non-integer scaling factors cause smear, and exactly half
the native resolution wastes too much desktop space (pixels too big).
Let's say 1366x768 or 1920x1080, and let's set aside the odd
computer-ish native resolutions in between, such as the 1440x900 found
in some cheap HD-Ready LCD TV's. Those two resolutions, 768p and 1080p,
are not standard PC
display resolutions. To what extent is this handled by the VESA DDC
PnP/auto-discovery, with what success rate on various different VGA
boards? I know that I can tell Windows to ignore the VESA DDC
information and allow me to select from all the resolutions supported
"by the hardware", but these are still only a limited set (hardwired
in the software driver) where those TV resolutions may be missing. I
also know that virtually any VGA board out there can be set to just
about any resolution/screen aspect ratio in bare metal (limited only
by its clock generators, maximum DAC clock and maybe a multiple of 8
pixels), which you can easily verify by rolling your own "modeline" in
the X Window System on Unix (I even had an Excel sheet to automate the
maths for me for XFree86 3.3.6), but in MS Windows you have no generic
way of forcing your VGA to use a specific front porch, back porch, blank rows
etc.
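That Excel sheet of mine boiled down to something like the following
(the porch and sync widths here are rough placeholders of my own, not
tuned values for any particular panel - a real mode line would follow
the GTF/CVT formulas):

  def modeline(name, hx, vx, hz,
               hfp=64, hsync=112, hbp=176, vfp=3, vsync=5, vbp=18):
      htot = hx + hfp + hsync + hbp        # total pixels per scan line
      vtot = vx + vfp + vsync + vbp        # total lines per frame
      clk = htot * vtot * hz / 1e6         # dot clock in MHz
      return ('Modeline "%s" %.2f  %d %d %d %d  %d %d %d %d'
              % (name, clk, hx, hx + hfp, hx + hfp + hsync, htot,
                 vx, vx + vfp, vx + vfp + vsync, vtot))

  print(modeline('1360x768@60', 1360, 768, 60))
  print(modeline('1920x1080@60', 1920, 1080, 60))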
So the question is, what VGA board or on-chip north-bridge-integrated
graphics subsystem supports those two resolutions in Windows? I would
guess that my chances are good especially with more recent hardware.
Likely not with VGA boards from 4 years ago, which used to max out at
1920x1200... What about "pivot mode" portrait
orientation?
Hmm... maybe I'm worried unnecessarily: looking at my current Nvidia
TNT2, I can see that it does support 1920x1080 and 1360x768 (not
1366x768). I could stick with that, if it weren't for the analog-only
VGA output :-)
Still, any recommendations of particular VGA hardware would be
welcome...
One last technical point that I'm slightly worried about is
brightness. The LCD TV's are brighter than computer LCD displays. The
basic brightness of an LCD TV is higher, because it is intended to be
watched from a greater distance. This is probably achieved by a more
powerful backlight: beefier CCFL tubes, or more of them. I can
tweak a CCFL inverter for higher or lower output power, but I'm
reluctant to do this on a shiny new display within warranty. I know
that some inverters have a brightness control input (analog/pwm/
digital). I've seen various LCD displays (not TV's) in notebooks or
stand-alone: some of them have a relatively narrow brightness
adjustment range, some have separate controls for brightness and
contrast that effectively do the same thing, and at the bottom of the
range the display is effectively useless due to low contrast... Most
legacy CRT's were much better at independent brightness
and contrast control.
So the question could be formulated this way: what's the usual
brightness adjustment range of an LCD TV today? Does it allow you to
drop the brightness enough to be any use as a PC monitor, without
losing contrast / breaking the color fidelity or something?
Now finally, what's available on the market:
The best (biggest) pixel size you can get from a common PC LCD display
nowadays is about 0.3 mm at 1280x1024 in a 19" display. Bigger
displays or even some 19" LCD's have higher resolutions and smaller
pixel size. If you want anything bigger than 0.3 mm, you have these
options:
A) get a PC display LCD and run it at sub-optimal resolution.
Non-integer fractional resolutions cause smear (the pixels aren't
sharp anyway), while integer fractions (perhaps exactly half) waste too
much desktop space, and at 0.5 - 0.6 mm the pixels get much too big.
B) get an LCD TV.
1366x768 in a 26" display would give you about 0.42 mm.
1920x1080 in a 32" display would give you about 0.37 mm.
My conclusion is that the "full HD" 32" TV is just about perfect,
but twice as expensive as a 26" "HD-ready" thing (see the quick pitch
arithmetic after this list).
C) get a "public display" PC LCD. Those seem to use TV LCD screens,
and are about twice as expensive as a corresponding LCD TV. So far
I've only ever seen 1366x768, never a full-HD display - but this may
change soon, as the full-HD LCD TV's
for the mass market have only started arriving in our shops last year
or so.
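A quick sanity check of those pitch figures - the same square-pixel,
nominal-diagonal arithmetic as before:

  from math import hypot

  def pitch_mm(diag_inches, x, y):
      return diag_inches * 25.4 / hypot(x, y)

  for diag, x, y in [(19, 1280, 1024), (26, 1366, 768), (32, 1920, 1080)]:
      print('%d" %dx%d: %.2f mm' % (diag, x, y, pitch_mm(diag, x, y)))
  # -> roughly 0.29, 0.42 and 0.37 mm respectively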
In my experience, that 0.3 mm is about the minimum for business use - we're
using a tabular business application running maximized across the full
screen, and every pixel of desktop real estate is valuable. And the
sort of 19" LCD's with this dot pitch seem to be getting old in the
manufacturers' product lines...
One funny point is that a number of business apps would benefit from
a "portrait" orientation of the screen. Paper documents are natively
created in "portrait" orientation, most PDF manuals have this
orientation, and HTML pages have a native top-down flow of text, which
makes them tall - any database-driven app, or even a file manager
that produces a listing of rows, would likely benefit from portrait
orientation (more rows would fit on one screen). Makes me wonder
what it would look like to run a 32" Full-HD LCD pivoted into portrait
mode. Or maybe two of them side by side, in a dual-display setup :-)
The pivoted portrait orientation is nothing new, I guess Apple-based
DTP people have known this for some 20 years now, right? Any practical
notes about this, anyone?
Frank Rysanek