does display dpi size hurt video card or monitor


zmartha

If I move my display, which is meant to be run at 1280 x 1024, down to
1024 x 768, does it hurt the video card or the monitor? I like larger
fonts and stuff.
 
zmartha said:
If I move my display, which is meant to be run at 1280 x 1024, down to
1024 x 768, does it hurt the video card or the monitor? I like larger
fonts and stuff.
Nope.....set it to whatever you want :)

--
Servo
"You gonna do something? Or just stand there and bleed?"
tservo100 at
ameritech dot net
Slow, fiery death to all spammers!!!
 
zmartha said:
If I move my display, which is meant to be run at 1280 x 1024, down to
1024 x 768, does it hurt the video card or the monitor? I like larger
fonts and stuff.

I take it you are referring to an LCD with a native resolution of 1280 x
1024. If you run it at a lower resolution that is not the native resolution
divided evenly by some integer, you will probably see some loss of display
quality, meaning that the display which looks fine at 1280 x 1024 will
probably have somewhat ragged fonts at the proposed 1024 x 768. Note: CRT
displays do not suffer from this effect. But no, you won't physically "hurt"
anything, unless you include your eyes and sensibilities, so if you don't
find the scaling effect objectionable, feel free to give it a shot and see
whether the results are hideous or not.
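
To put a rough number on that scaling effect, here is a minimal sketch in
Python (the resolutions are just the ones from this thread):

    # Why a non-native resolution looks soft on an LCD: each desktop pixel
    # has to be stretched across a fractional number of physical panel
    # pixels, so character edges land "between" pixels.

    native = (1280, 1024)    # physical panel pixels (from the thread)
    desktop = (1024, 768)    # requested desktop resolution

    scale_x = native[0] / desktop[0]    # 1.25 panel pixels per desktop pixel
    scale_y = native[1] / desktop[1]    # ~1.333

    print("horizontal scale: %.3f, vertical scale: %.3f" % (scale_x, scale_y))
    # Neither ratio is a whole number, so the panel's scaler has to
    # interpolate, which is what makes small text look ragged or blurry.
    # A clean integer ratio (e.g. 640 x 512 on a 1280 x 1024 panel) would
    # map each desktop pixel onto an exact 2 x 2 block instead.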

Have you tried simply enlarging the display fonts and such? This is easily
done in Windows and should be equally easy in any modern OS.
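
And as a back-of-the-envelope illustration of why raising the font/DPI
setting helps more than lowering the resolution (96 and 120 DPI are the
classic Windows "normal" and "large fonts" settings, used here only as an
example):

    # A font's on-screen size in pixels depends on its point size and the
    # DPI setting, not on the desktop resolution itself:
    #   pixels = points * dpi / 72      (1 point = 1/72 inch)

    def font_height_px(points, dpi):
        return points * dpi / 72

    for dpi in (96, 120):    # classic "normal" vs "large fonts" settings
        print("10 pt text at %d DPI is about %.1f px tall"
              % (dpi, font_height_px(10, dpi)))

    # 10 pt at  96 DPI -> ~13.3 px
    # 10 pt at 120 DPI -> ~16.7 px
    # Bumping the DPI/font setting enlarges text while the desktop stays at
    # the panel's native 1280 x 1024, avoiding the scaling blur entirely.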
 
You did not say whether you have an LCD or a CRT monitor. But assuming you
are using an LCD monitor, you will NOT "hurt anything" by running at a
lower resolution, but the image quality will not be nearly as good as it
would be running at its native resolution.
 
If I move my display, which is meant to be run at 1280 x 1024, down to
1024 x 768, does it hurt the video card or the monitor?

I don't know about LCDs, but with a CRT, the higher
you set the video resolution, the higher the horizontal
sweep frequency goes. That can really warm up a
flyback transformer and shorten its life quite a bit. I also
believe the higher the H-freq goes, the higher the DC
tube voltage goes, and that tends to "focus" the e-beam
down and heat the phosphor more, shortening its life
too. So lowering the resolution to 1024 x 768 actually
is good for a CRT. You want to set the vertical freq
to 75 Hz to get the most normal picture size and no
blinking. About LCDs, I know nothing except the
resolution really stinks. I've been fixing these things for
20 years, and I have not seen crap like that since my
EGA days.
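
For a rough feel of the horizontal-frequency difference being described
here, a back-of-the-envelope estimate (the ~5% blanking overhead is an
assumption for illustration; real CRT timings vary):

    # Very rough estimate of a CRT's horizontal sweep frequency:
    #   h_freq ~= vertical_refresh * visible_lines * blanking_overhead
    # The 1.05 overhead factor is an assumption; actual VESA timings differ.

    def h_freq_khz(visible_lines, v_refresh_hz, overhead=1.05):
        return visible_lines * overhead * v_refresh_hz / 1000.0

    print("1280x1024 @ 75 Hz: about %.1f kHz" % h_freq_khz(1024, 75))  # ~80.6
    print("1024x768  @ 75 Hz: about %.1f kHz" % h_freq_khz(768, 75))   # ~60.5

    # Dropping from 1024 to 768 visible lines at the same refresh rate cuts
    # the horizontal sweep frequency by roughly a quarter, which is why the
    # lower mode runs the deflection/flyback circuitry cooler.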

johns
 
johns said:
too. So lowering the resolution to 1024 x 768 actually
is good for a CRT. You want to set the vertical freq
to 75 Hz to get the most normal picture size and no
blinking.

Picture size has nothing to do with vertical freq. You set the card to the
freq. you want to run, then use the monitor controls to adjust the picture
size, rotation, pincushion, etc.

75 Hz is a bit low, especially against a white background. 85 Hz is
considered the minimum for a flicker-free image. I run mine at 100 Hz. If
you are worried about monitor life, just remember that eye exams and glasses
cost more than a monitor replacement...

Note, refresh rates only apply to CRTs. The pixels in an LCD panel remain
stationary until a color change is needed. This means for normal office apps
and CAD work, an LCD panel is the most comfortable on the eye. However, when
a color change is needed, the pixel response time is at best 25 ms,
effectively giving you only 40 Hz in a fast-changing image. In a fast game,
you get "ghosting".
About LCDs, I know nothing except the
resolution really stinks. I've been fixing these things for
20 years, and I have not seen crap like that since my
EGA days.

Resolution only stinks on early LCDs. Today's 15"-17" LCD panels easily do
1024x768.
 
A LOT of people see no flicker even at 70 Hz. The 85 Hz myth was started by the industry
in order to sell new gear. Only a small percentage of the population need 85 Hz. I have
yet to see flicker at 70 Hz, though I run my monitor at 1600x1200 @ 75 Hz.
 
How much contrast are you running? If the contrast is set too dark then the
flicker won't be too apparent. Contrast should be set to 100% to get maximum
picture clarity.

Switch back and forth between 75 and 85 Hz. The difference is quite
noticeable.
 
Picture size has nothing to do with vertical freq. You set the card to the
freq. you want to run, then use the monitor controls to adjust the picture
size, rotation, pincushion, etc.

Not directly. The size is a function of the high voltage "tightness",
and you can see it when changing vertical frequency. I just don't
like the idea that 85 Hz vertical is about 30 percent smaller than
75 Hz before any adjustments are made. I feel like the adjustments
to expand the screen are bucking the high voltage. That can't be
good, because it must be a resistor divider trying to split that
high voltage down to "bloom" the screen. At least that is what I think
it is doing. I just fix 'em. I don't invent them.

johns
 
yet to see flicker at 70 Hz, though I run my monitor at 1600x1200 @ 75 Hz.

Lordy! Knob it down to 1024 and see if you can
hear the flyback stop whistling. If you hear a small
squeal go away, then you are cooking your flyback.

johns
 
I set up my NEC MultiSync FE950+ monitor according to the instructions and charts
available here:

http://www.normankoren.com/makingfineprints1A.html

It is matched to the desired 2.2 on the gamma scale. Follow the instructions and your
monitor will also be properly calibrated. Refresh rates over 70 Hz work perfectly with
this Diamondtron monitor.
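
For what "matched to 2.2 on the gamma scale" means in practice, a minimal
sketch (the pure power-law curve is a simplification; real calibration also
involves black level and colour temperature):

    # A gamma-2.2 display maps a normalised pixel value v (0..1) to a
    # relative luminance of roughly v ** 2.2; calibration tries to make the
    # monitor's actual response follow this curve.

    GAMMA = 2.2

    def relative_luminance(pixel_value, gamma=GAMMA):
        v = pixel_value / 255.0       # normalise an 8-bit value to 0..1
        return v ** gamma

    for p in (64, 128, 192, 255):
        print("pixel %3d -> %.3f of full brightness"
              % (p, relative_luminance(p)))

    # Pixel 128 comes out near 0.22, not 0.5: mid-grey values are much
    # darker than half brightness, which is why a monitor with the wrong
    # gamma makes shadows look crushed or washed out.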

My understanding of contrast is that it is merely another tool in calibrating the monitor,
not something to arbitrarily set at 100%. I've heard that compared to redlining your
engine 100% of the time. In fairly short order you will do damage to the device.
 
It's a 19" trinitron style monitor. It runs fine at this resolution and refresh rate and
has for years.
 
John,
I have used the lower resolution on the LCD since I got it last Nov. and
have noticed no distortion. But I am trying a new 64 MB (ATI this time)
video card, as I was having more and more trouble with distortion at
morning startups. So far, so good. But I wondered if the lower resolution
wore the card out. I used it also on my CRT. Thanks for the help.
 
Johns:
The LCD resolution is wonderful, but then my eyes are not like most. I
can see flicker in most people's monitors--mostly CRTs--when they can't see
anything. Of course, I have this LCD's frequency up to 70 Hz, which is within
its range and makes for a clear picture.
 
First said:
How much contrast are you running? If the contrast is set too dark then
the flicker won't be too apparent. Contrast should be set to 100% to get
maximum picture clarity.

Switch back and forth between 75 and 85 Hz. The difference is quite
noticeable.

For some people. I've had students who could see flicker at 85 and others
who couldn't at 60. It all depends on individual response. Damn, I should
have had a segment in which they determined their own flicker limits. NOW
I think of it.
 
johns said:
Not directly. The size is a function of the high voltage "tightness",
and you can see it when changing vertical frequency. I just don't
like the idea that 85 Hz vertical is about 30 percent smaller than
75 Hz before any adjustments are made. I feel like the adjustments
to expand the screen are bucking the high voltage. That can't be
good, because it must be a resistor divider trying to split that
high voltage down to "bloom" the screen. At least that is what I think
it is doing. I just fix 'em. I don't invent them.

Huh? Image size is controlled by the amount of horizontal and vertical
deflection and has nothing to do with the high voltage. Increase the
current in the deflection coils a little and the image gets larger.
 
The directions on the web site are for setting up the monitor for
print-proofing, to "humble" the displayed picture to match the eventual
print output. The guide is good for desktop publishing, but not if you use the
computer for games, videos, and the web. For that, a dedicated monitor
calibration/testing program like Nokia Monitor Test or DisplayMate should be
used. (At least so you could adjust RGB convergence and check for moire...)
In those programs, most monitors invariably end up at 100% contrast...

Remember, the monitor is a luminous light source; paper can only reflect
light. Just because something looks decent on paper under yellowish
incandescent lighting doesn't mean it can't look better onscreen. Hell, even
paper comes with different brightness indices. 96-bright paper is regarded
as a higher grade than 88-bright paper, because brighter paper reflects more
light, *providing better contrast*.

The monitor is not a car engine. Using max contrast is more like running a
variable-speed cooling fan at full speed all the time. Any monitor worth its
desktop footprint can handle 100% contrast with safety margins built-in.
During the development and testing cycle, the test engineer would crank up
the contrast until it fails, then back down several notches and set that as
the max value in the OSD. As power users we do far worse things to hardware,
like cranking up RAM voltage to over 3 V for overclocking...
 
I have used the Nokia Monitor Test, and its test displays and gray scale show my
monitor as perfectly tuned, calibrated from the so-called desktop-publishing site.
It is tuned to 6500 K and is in perfect shape.

It works wonderfully for games or anything else.

But suit yourself.
 