LCD scanning frequency

I was looking at specs for an LCD monitor and noticed it included scanning
frequency and listed the vertical frequency at 56-75 Hz. Now I could see
there being an upper limit perhaps even so low as 75 Hz. But a lower limit
of 56 Hz?? That doesn't make a whole lot of sense to me. Why can't any
LCD handle significantly slower? Isn't it supposed to be all electrical?
 


It is more economical to design it to accept within a
certain range, plus designing for sub-56Hz would be
unnecessary since no video cards go lower than this. The
reason it's not just an electrical issue is that this is a
data signal which must meet certain timing parameters. It's
not just amplified and projected onto a medium, it is
digitized and matrixed onto the LCD arrays at the rate the
controller is compatible with.

Your concern will not affect your use. Generally the system
is set to use 60Hz and all LCDs (AFAIK) accept this
frequency. Unlike with CRT, there is no need to change this
in an attempt to reduce flicker because LCD is a different
technology not having flicker based on too low a refresh
rate.
 
| On 2 Sep 2007 21:53:26 GMT, (e-mail address removed) wrote:
|
|>I was looking at specs for an LCD monitor and noticed it included scanning
|>frequency and listed the vertical frequency at 56-75 Hz. Now I could see
|>there being an upper limit perhaps even so low as 75 Hz. But a lower limit
|>of 56 Hz?? That doesn't make a whole lot of sense to me. Why can't any
|>LCD handle significantly slower? Isn't it supposed to be all electrical?
|
|
| It is more economical to design it to accept within a
| certain range, plus designing for sub-56Hz would be
| unnecessary since no video cards go lower than this. The
| reason it's not just an electrical issue is that this is a
| data signal which must meet certain timing parameters. It's
| not just amplified and projected onto a medium, it is
| digitized and matrixed onto the LCD arrays at the rate the
| controller is compatible with.

Well, you're wrong about "no video cards go lower than this" because I can
get frame rates down to 24 Hz from mine, and I suspect even lower is possible.
I'll have to get an oscilloscope to see just how low I can set it and keep
the signal coming out correctly.

FYI, high definition TV signals range from 60 Hz at the highest down to
a low of just above 23.976 Hz. All LCDs should be required to go down
to as low as 23.976 Hz.


| Your concern will not affect your use. Generally the system
| is set to use 60Hz and all LCDs (AFAIK) accept this
| frequency. Unlike with CRT, there is no need to change this
| in an attempt to reduce flicker because LCD is a different
| technology not having flicker based on too low a refresh
| rate.

This is why a lower frame rate is now usable. On CRT it will flicker so
it is not a usable option. On LCD it will not, so it is possible to use
a lower frame rate.

Why use a lower frame rate?

With larger and larger LCD displays, you have higher and higher native
geometry. Video cards can often handle these higher geometries without
having to be redesigned or replaced to get a higher clock rate. The
clock rate is the first barrier I hit on my video card when trying to
raise the geometry. By using a lower frame rate, the geometry can be
achieved. My CRT actually works (down to about 40 Hz), but the flicker
is terrible. My LCD just says the video is out of range.

Higher rates require better circuits inside the LCD for everything. But
a lower rate only requires a few things like ranging the analog sample
clock to a lower rate (not hard to do considering that it is a synthesized
oscillator).

I've seen an LCD display offered with 2560x1600 native. But it requires
DUAL DVI connections because the pixel clock rate is too high. But with
a lower frame rate that could have been done with a single DVI connection.
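
To put rough numbers on that: single-link DVI tops out at a 165 MHz pixel
clock, and if we assume about 20% of the raster is blanking (a round guess,
not an exact CVT figure), a quick calculation shows where 2560x1600 falls:

    # Rough check of when 2560x1600 fits in single-link DVI.
    # 165 MHz is the single-link pixel clock ceiling; the 20%
    # blanking overhead is an assumed round figure, not CVT-exact.

    DVI_SINGLE_LINK_MHZ = 165.0
    BLANKING_OVERHEAD = 1.20

    def pixel_clock_mhz(width, height, refresh_hz):
        # Active pixels per second, padded for blanking intervals.
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for hz in (60, 30, 24):
        clk = pixel_clock_mhz(2560, 1600, hz)
        verdict = "fits" if clk <= DVI_SINGLE_LINK_MHZ else "needs dual link"
        print(f"2560x1600 @ {hz} Hz: ~{clk:.0f} MHz -> {verdict}")

At 60 Hz it needs roughly 295 MHz (dual link); at 24 or 30 Hz it drops
under the single-link ceiling.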
 
| On 2 Sep 2007 21:53:26 GMT, (e-mail address removed) wrote:
|
|>I was looking at specs for an LCD monitor and noticed it included scanning
|>frequency and listed the vertical frequency at 56-75 Hz. Now I could see
|>there being an upper limit perhaps even so low as 75 Hz. But a lower limit
|>of 56 Hz?? That doesn't make a whole lot of sense to me. Why can't any
|>LCD handle significantly slower? Isn't it supposed to be all electrical?
|
|
| It is more economical to design it to accept within a
| certain range, plus designing for sub-56Hz would be
| unnecessary since no video cards go lower than this. The
| reason it's not just an electrical issue is that this is a
| data signal which must meet certain timing parameters. It's
| not just amplified and projected onto a medium, it is
| digitized and matrixed onto the LCD arrays at the rate the
| controller is compatible with.

|>Well, you're wrong about "no video cards go lower than this" because I can
|>get frame rates down to 24 Hz from mine, and I suspect even lower is possible.
|>I'll have to get an oscilloscope to see just how low I can set it and keep
|>the signal coming out correctly.

Of course lower is technically possible but not a practical
design goal for a PC video card. Thus, video cards really
don't go lower because you'd have to change the default
behavior including the driver settings, which nobody in
their right mind would do to output from a PC to an LCD.


|>FYI, high definition TV signals range from 60 Hz at the highest down to
|>a low of just above 23.976 Hz. All LCDs should be required to go down
|>to as low as 23.976 Hz.

You're talking about frames per second, yes? That's not
what is being referred to previously by 56Hz, it does not
only signal at the rate of frame change and no PC use LCDs
need to support below 60Hz to display HD.



| Your concern will not affect your use. Generally the system
| is set to use 60Hz and all LCDs (AFAIK) accept this
| frequency. Unlike with CRT, there is no need to change this
| in an attempt to reduce flicker because LCD is a different
| technology not having flicker based on too low a refresh
| rate.

|>This is why a lower frame rate is now usable. On CRT it will flicker so
|>it is not a usable option. On LCD it will not, so it is possible to use
|>a lower frame rate.
|>
|>Why use a lower frame rate?
|>
|>With larger and larger LCD displays, you have higher and higher native
|>geometry. Video cards can often handle these higher geometries without
|>having to be redesigned or replaced to get a higher clock rate. The
|>clock rate is the first barrier I hit on my video card when trying to
|>raise the geometry. By using a lower frame rate, the geometry can be
|>achieved.

No, all you need is a data link tech that allows enough
bandwidth. ~ 24FPS looks like flickery crap on an LCD not
because of the flickering seen on a CRT but because many
people can perceive too much difference between successive
frames in motion scenes. You want the FPS higher and the
signaling rate higher still.



|>My CRT actually works (down to about 40 Hz), but the flicker
|>is terrible. My LCD just says the video is out of range.

|>Higher rates require better circuits inside the LCD for everything. But
|>a lower rate only requires a few things like ranging the analog sample
|>clock to a lower rate (not hard to do considering that it is a synthesized
|>oscillator).
|>
|>I've seen an LCD display offered with 2560x1600 native. But it requires
|>DUAL DVI connections because the pixel clock rate is too high. But with
|>a lower frame rate that could have been done with a single DVI connection.

It doesn't mean the rate is "too high" to the extent that
something should be done to lower it; the goal is to retain
image quality, not degrade it just to make it possible with
a single DVI cable.
 
| On 29 Sep 2007 13:10:06 GMT, (e-mail address removed)
| wrote:
|
|>| On 2 Sep 2007 21:53:26 GMT, (e-mail address removed) wrote:
|>|
|>|>I was looking at specs for an LCD monitor and noticed it included scanning
|>|>frequency and listed the vertical frequency at 56-75 Hz. Now I could see
|>|>there being an upper limit perhaps even so low as 75 Hz. But a lower limit
|>|>of 56 Hz?? That doesn't make a whole lot of sense to me. Why can't any
|>|>LCD handle significantly slower? Isn't it supposed to be all electrical?
|>|
|>|
|>| It is more economical to design it to accept within a
|>| certain range, plus designing for sub-56Hz would be
|>| unnecessary since no video cards go lower than this. The
|>| reason it's not just an electrical issue is that this is a
|>| data signal which must meet certain timing parameters. It's
|>| not just amplified and projected onto a medium, it is
|>| digitized and matrixed onto the LCD arrays at the rate the
|>| controller is compatible with.
|>
|>Well, you're wrong about "no video cards go lower than this" because I can
|>get frame rates down to 24 Hz from mine, and I suspect even lower is possible.
|>I'll have to get an oscilloscope to see just how low I can set it and keep
|>the signal coming out correctly.
|
| Of course lower is technically possible but not a practical
| design goal for a PC video card. Thus, video cards really
| don't go lower because you'd have to change the default
| behavior including the driver settings, which nobody in
| their right mind would do to output from a PC to an LCD.

Why not? If the video card has an upper limit on pixel clock AND you
want a larger geometry, then slowing the horizontal (to get more pixels
per line) and the vertical (to get more lines per frame) is simply going
to be what you do.


|>FYI, high definition TV signals range from 60 Hz at the highest down to
|>a low of just above 23.976 Hz. All LCDs should be required to go down
|>to as low as 23.976 Hz.
|
| You're talking about frames per second, yes? That's not
| what is being referred to previously by 56Hz, it does not
| only signal at the rate of frame change and no PC use LCDs
| need to support below 60Hz to display HD.

Then what do you think the 56 Hz refers to? The horizontal? The
rate your eye blinks?

What the TV world refers to as frame rate is what the computer world
refers to as vertical frequency. A "1080p24" HD TV signal has a frame
rate of 24 frames per second (or 1000/1001 of that rate if locked to
legacy NTSC reference clock timing), and that is 24 Hz or 23.976 Hz.

Many video cards have pixel clocks with limits in the low 100's of MHz
and plenty enough memory to run with geometries of 2048x1536 or more.
With the max pixel clock and full utilization of memory to get higher
geometries, you do get lower frame rates (vertical Hz).
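
As a sketch of that arithmetic (the 135 MHz clock and the blanking margins
below are assumed round figures, not any particular card's timing tables):

    # Refresh rate a fixed pixel clock allows at a given geometry.
    # Raster totals include blanking; the margins are illustrative.

    def max_refresh_hz(pixel_clock_hz, width, height,
                       h_blank_frac=0.25, v_blank_frac=0.05):
        h_total = width * (1 + h_blank_frac)    # pixels per scan line
        v_total = height * (1 + v_blank_frac)   # lines per frame
        return pixel_clock_hz / (h_total * v_total)

    for geometry in ((1024, 768), (1280, 1024), (2048, 1536)):
        print(geometry, f"~{max_refresh_hz(135e6, *geometry):.0f} Hz max")

The biggest geometry lands in the low 30s of Hz, well under the 56 Hz floor
most LCD specs quote.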

I have an _old_ Matrox Millennium card that can run up to 1280x1024.
But at that geometry, even with the pixel clock set to the maximum the
card supports, the frame rate (vertical Hz) is so low it flickers like
mad on a CRT. Switch to LCD and it's great ... if the LCD can go that
low.

CRT is on the way out. LCD is on the way in (although OLED may bump
that in a few or so years). Old assumptions about flicker just do not
apply, anymore.

Some people _need_ high frame rates for legitimate purposes, such as
gaming, and watching high action video. HD TV has the 720p60 video mode
for such things. LCDs will need to support that, and even higher. But
LCDs need to support lower as well since this enables higher geometries
like 2560x1600 for other kinds of display usage that doesn't have any
need for 50 to 60 Hz frame rates. Rates down to 20 frames per second,
when there is no flicker involved, are quite suitable for a large class
of computer work.

It would cost very _little_ more for an LCD to work at a lower frame
rate (the cost of thwacking the engineer in the forehead in most cases).
It would cost a lot _more_ to make a video card that goes to quadruple
the clock rate when the geometry is doubled on both dimensions. Video
with 2560x1600 pixels is beyond the limits of even a single DVI stream
at this time, when running at 50 Hz. Cut that to 25 Hz and it could
be done.

Let the gamers have their super high end video cards with 3 GHz clocks
that cost hundreds of dollars. Video cards for the masses that do not
need the high frame rates can be made much lower cost, while still
having nice high geometries (thanks to cheap RAM), by simply having
LCD electronics accept even lower video frequencies/rates, down at least
to 23.976 Hz (so it can at least display NTSC locked 24 fps movies fed
directly to it without requiring overscan features in the player).


| No, all you need is a data link tech that allows enough
| bandwidth. ~ 24FPS looks like flickery crap on an LCD not
| because of the flickering seen on a CRT but because many
| people can perceive too much difference between successive
| frames in motion scenes. You want the FPS higher and the
| signaling rate higher still.

24 fps is the common frame rate movies are shot in. But for any video
content that does involve a lot of motion, like sports (I watch NASCAR
events, for example), then of course a higher frame rate is a must.
But just because fast action video needs a high frame rate should not
mean I have to give up extreme geometry at a low price. Just change
the friggin video mode! 2560x1600 desktop geometry is overkill for a
1280x720 (60 fps) video transmission from Fox Sports. The pixel clocking
rate to get 2560x1600 at 24 fps (98304000 pxps) is
more than enough to get 1280x720 at 60 fps (55296000 pxps) for action
video.
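
That arithmetic checks out (blanking ignored):

    # Active-pixel rates quoted above, blanking ignored.
    desktop = 2560 * 1600 * 24   # 98,304,000 pixels/s
    sports  = 1280 * 720 * 60    # 55,296,000 pixels/s
    print(desktop, sports, desktop >= sports)   # 98304000 55296000 True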

Use the right frame rate for the content. You don't need the frame
rate that's right for watching cars whiz by at nearly 200 mph when
you are writing that report for the boss at work.


|>Higher rates require better circuits inside the LCD for everything. But
|>a lower rate only requires a few things like ranging the analog sample
|>clock to a lower rate (not hard to do considering that it is a synthesized
|>oscillator).
|>
|>I've seen an LCD display offered with 2560x1600 native. But it requires
|>DUAL DVI connections because the pixel clock rate is too high. But with
|>a lower frame rate that could have been done with a single DVI connection.
|
| It doesn't mean the rate is "too high" to the extend that
| something should be done to lower it, the goal is to retain
| image quality, not degrade it just to make it possible with
| a single DVI cable.

A lower frame rate does not degrade the image quality. It degrades MOTION
quality. But if it is not a high motion content, it doesn't matter. Movies
shot at 24 fps are unaffected when displayed at 24 fps (duh). But when you
lower the frame rate, you open the opportunity for an even higher image
quality for the kinds of content that beg for greater geometry (like office
desktop usage).

Use the right frame rate for the content. You don't need the frame
rate that's right for watching cars whiz by at nearly 200 mph when
you are writing that report for the boss at work.
 
| Of course lower is technically possible but not a practical
| design goal for a PC video card. Thus, video cards really
| don't go lower because you'd have to change the default
| behavior including the driver settings, which nobody in
| their right mind would do to output from a PC to an LCD.

|>Why not? If the video card has an upper limit on pixel clock AND you
|>want a larger geometry, then slowing the horizontal (to get more pixels
|>per line) and the vertical (to get more lines per frame) is simply going
|>to be what you do.

No you don't _want_ to do that, because there's no gain in
doing it, only degradation in visual quality. This is until
the data link bandwidth is exceeded, at which point a second
data link is used. It is used for a reason - to preserve
visual quality and/or allow for it, instead of degrading it
with your proposed solution.


|>FYI, high definition TV signals range from 60 Hz at the highest down to
|>a low of just above 23.976 Hz. All LCDs should be required to go down
|>to as low as 23.976 Hz.
|
| You're talking about frames per second, yes? That's not
| what is being referred to previously by 56Hz, it does not
| only signal at the rate of frame change and no PC use LCDs
| need to support below 60Hz to display HD.

|>Then what do you think the 56 Hz refers to? The horizontal? The
|>rate your eye blinks?

Fair enough, it is the framerate but the 24Hz is a lower
limit one doesn't want to cause, it is only due to the
format of the video content. There is no need to cause
this state.

|>What the TV world refers to as frame rate is what the computer world
|>refers to as vertical frequency. A "1080p24" HD TV signal has a frame
|>rate of 24 frames per second (or 1000/1001 of that rate if locked to
|>legacy NTSC reference clock timing), and that is 24 Hz or 23.976 Hz.

This doesn't matter, it is only a lower limit. Slowing down
one end to the slowest the other would ever require is a
backwards concept. On a PC (thus using LCD PC monitor),
both ends can support faster than that.

|>Many video cards have pixel clocks with limits in the low 100's of MHz
|>and plenty enough memory to run with geometries of 2048x1536 or more.
|>With the max pixel clock and full utilization of memory to get higher
|>geometries, you do get lower frame rates (vertical Hz).

Why would you use an old video card that can't support 60Hz
with a newer high resolution LCD? That degrades the quality
regardless of a video/HD content scenario.

|>I have an _old_ Matrox Millennium card that can run up to 1280x1024.
|>But at that geometry, even with the pixel clock set to the maximum the
|>card supports, the frame rate (vertical Hz) is so low it flickers like
|>mad on a CRT. Switch to LCD and it's great ... if the LCD can go that
|>low.

So the problem is not refresh rate, it's trying to use
ancient hardware. Same could be said about trying to use
any very old low performance hardware for a modern more
demanding use.

|>CRT is on the way out. LCD is on the way in (although OLED may bump
|>that in a few or so years). Old assumptions about flicker just do not
|>apply, anymore.

Never said they did, but one could call the jerkiness of
24Hz flicker-like as well. 24 FPS is visually noticeable on
an LCD.

|>Some people _need_ high frame rates for legitimate purposes, such as
|>gaming, and watching high action video.

Most people will benefit from more than 24FPS in typical
motion video uses. Most people don't have the problem you
suggest and attempt to solve by lowering the rate to 24.
 
| On 2 Oct 2007 12:54:39 GMT, (e-mail address removed) wrote:
|
|
|>| Of course lower is technically possible but not a practical
|>| design goal for a PC video card. Thus, video cards really
|>| don't go lower because you'd have to change the default
|>| behavior including the driver settings, which nobody in
|>| their right mind would do to output from a PC to an LCD.
|>
|>Why not? If the video card has an upper limit on pixel clock AND you
|>want a larger geometry, then slowing the horizontal (to get more pixels
|>per line) and the vertical (to get more lines per frame) is simply going
|>to be what you do.
|
| No you don't _want_ to do that, because there's no gain in
| doing it, only degradation in visual quality. This is until
| the data link bandwidth is exceeded, at which point a second
| data link is used. It is used for a reason - to preserve
| visual quality and/or allow for it, instead of degrading it
| with your proposed solution.

There is no second data link. A card with a second data link would cost
more money. The geometry can be achieved with a lower frame rate with
the same video card, with no degradation in quality for a great many
uses of graphical displays, including most business uses.


|>|>FYI, high definition TV signals range from 60 Hz at the highest down to
|>|>a low of just above 23.976 Hz. All LCDs should be required to go down
|>|>to as low as 23.976 Hz.
|>|
|>| You're talking about frames per second, yes? That's not
|>| what is being referred to previously by 56Hz, it does not
|>| only signal at the rate of frame change and no PC use LCDs
|>| need to support below 60Hz to display HD.
|>
|>Then what do you think the 56 Hz refers to? The horizontal? The
|>rate your eye blinks?
|
| Fair enough, it is the framerate but the 24Hz is a lower
| limit one doesn't want to cause, it is only due to the
| format of the video content. There is no need to cause
| this state.

The motion picture industry does not seem to agree with you. Of course
a motion picture projector would produce flicker if run at 24 shutter
openings per second. They figured out ages ago that by opening the
shutter 48 or 72 times a second, timed so that the openings were of the
same timing between them, where 2 or 3 openings would have the same
frame, that this eliminated the flicker (even though it did not change
the total accumulated time the shutter was open) for most eyes.

Were it the case that motion that changed only every 1/24 of second
were annoying, too, they would have changed the system to a higher frame
rate. They did for special cases (action sports), but not for the
average movie.


|>What the TV world refers to as frame rate is what the computer world
|>refers to as vertical frequency. A "1080p24" HD TV signal has a frame
|>rate of 24 frames per second (or 1000/1001 of that rate if locked to
|>legacy NTSC reference clock timing), and that is 24 Hz or 23.976 Hz.
|
| This doesn't matter, it is only a lower limit. Slowing down
| one end to the slowest the other would ever require is a
| backwards concept. On a PC (thus using LCD PC monitor),
| both ends can support faster than that.

Making a video card that has a faster pixel clock rate costs more
money. Since these clock rates are already well into the UHF range,
it's not just a matter of faster oscillators. It requires better
quality components overall. It requires better silicon technology
to shift data bits around faster. It requires faster DACs in the
case of analog output. It requires the second DVI link because doing
such high rates over one link costs even more than two links cost.

Making a monitor capable of running slower (lowering the low end,
while keeping the high end where it is at, of the sampling range)
costs very little. The engineering design changes would require
parameters in the software to know of the slower rate, and change
the synthesizer to handle it by increasing the divide ratio range
by merely adding 1 or 2 bits. It may be the case that the parts
can already do this, and it is the software that imposes the limit.
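
A sketch of that divider idea (the reference frequency and register widths
are assumptions for illustration, not any specific LCD controller):

    # Each extra bit in the divide-ratio register halves the lowest
    # clock the synthesizer can produce (f_out = f_ref / N). For a
    # fixed raster, vertical refresh scales directly with the pixel
    # clock, so the refresh floor halves along with it.

    F_REF = 400e6   # assumed reference oscillator

    def lowest_output_hz(register_bits):
        n_max = 2**register_bits - 1   # largest programmable divisor
        return F_REF / n_max

    for bits in (8, 9, 10):
        print(f"{bits}-bit divider: floor ~{lowest_output_hz(bits)/1e6:.2f} MHz")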


|>Many video cards have pixel clocks with limits in the low 100's of MHz
|>and plenty enough memory to run with geometries of 2048x1536 or more.
|>With the max pixel clock and full utilization of memory to get higher
|>geometries, you do get lower frame rates (vertical Hz).
|
| Why would you use an old video card that can't support 60Hz
| with a newer high resolution LCD? That degrades the quality
| regardless of a video/HD content scenario.

It can support 90 Hz at the geometries of the day it came out. It
also is flexible enough that it can support extremes in horizontal
geometry or extremes in vertical geometry. And it was an open
architecture not encumbered by trying to hide how to use the card
in software other than their own.

Now if I combine extreme horizontal AND extreme vertical ... it works.
That is, the _card_ works. But logically, when you divide the clock
by the extreme horizontal and then again by the extreme vertical, you
get something below 50 Hz (the lower end of most LCD monitors it would
seem).


|>I have an _old_ Matrox Millennium card that can run up to 1280x1024.
|>But at that geometry, even with the pixel clock set to the maximum the
|>card supports, the frame rate (vertical Hz) is so low it flickers like
|>mad on a CRT. Switch to LCD and it's great ... if the LCD can go that
|>low.
|
| So the problem is not refresh rate, it's trying to use
| ancient hardware. Same could be said about trying to use
| any very old low performance hardware for a modern more
| demanding use.

It's trying to use open architecture hardware that can do things that no
modern card can do ... work with independent software.


|>CRT is on the way out. LCD is on the way in (although OLED may bump
|>that in a few or so years). Old assumptions about flicker just do not
|>apply, anymore.
|
| Never said they did, but one could call the jerkiness of
| 24Hz flicker-like as well. 24 FPS is visually noticeable on
| an LCD.

Tell me which LCD monitor you have seen 24 fps video displayed on?
Manufacturer and model, please.


|>Some people _need_ high frame rates for legitimate purposes, such as
|>gaming, and watching high action video.
|
| Most people will benefit from more than 24FPS in typical
| motion video uses. Most people don't have the problem you
| suggest and attempt to solve by lowering the rate to 24.

They can benefit by less costly hardware, whether older or newer, by running
the video at 24 fps when their type of use of the computer is compatible with
that slow frame rate.

When the type of use changes to something that needs a higher frame rate, I
most certainly expect the LCD monitor to handle it. I've never suggested
that the upper limit frame rate be reduced. Although broadcast video is
limited to no more than 60 fps (50 fps in Europe), as I understand it,
double that rate looks even better for certain high action video, such as
the race cars I like to watch. So I look forward to the day when they can
cover a NASCAR race with 1920x1080 progressive video at 120 fps. This will
need about 20 MHz of spectrum with 8VSB modulation as used in over the air
TV, and about 10 MHz of spectrum with 256QAM modulation as used over many
cable systems. This would pose problems for them since they slice the
spectrum up in 6 MHz chunks. Satellite, however, may be able to do it.
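
Back-of-envelope, taking one 6 MHz ATSC channel (~19.39 Mbps of 8VSB
payload) as what 1080-line 30 Hz video fills today, and assuming bit rate
scales linearly with frame rate (a rough MPEG-2-era assumption):

    # Spectrum needed for 1920x1080 at 120 fps, scaled linearly from
    # a 30 fps baseline that fills one 19.39 Mbps ATSC channel.

    BPS_PER_6MHZ = {"8VSB (broadcast)": 19.39e6, "256QAM (cable)": 38.8e6}

    needed_bps = 19.39e6 * (120 / 30)   # ~77.6 Mbps

    for name, bps in BPS_PER_6MHZ.items():
        mhz = 6.0 * needed_bps / bps
        print(f"{name}: ~{mhz:.0f} MHz of spectrum")

That comes out near 24 MHz of 8VSB and 12 MHz of 256QAM, the same ballpark
as the figures above.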
 
| On 2 Oct 2007 12:54:39 GMT, (e-mail address removed) wrote:
|
|
|>| Of course lower is technically possible but not a practical
|>| design goal for a PC video card. Thus, video cards really
|>| don't go lower because you'd have to change the default
|>| behavior including the driver settings, which nobody in
|>| their right mind would do to output from a PC to an LCD.
|>
|>Why not? If the video card has an upper limit on pixel clock AND you
|>want a larger geometry, then slowing the horizontal (to get more pixels
|>per line) and the vertical (to get more lines per frame) is simply going
|>to be what you do.
|
| No you don't _want_ to do that, because there's no gain in
| doing it, only degradation in visual quality. This is until
| the data link bandwidth is exceeded, at which point a second
| data link is used. It is used for a reason - to preserve
| visual quality and/or allow for it, instead of degrading it
| with your proposed solution.

|>There is no second data link. A card with a second data link would cost
|>more money. The geometry can be achieved with a lower frame rate with
|>the same video card, with no degradation in quality for a great many
|>uses of graphical displays, including most business uses.

Again, the goal is not to use old junk then degrade
standards to make it work, just get a normal video card like
anyone else. Even integrated video has no problem
outputting 60Hz at HD resolution.

What you are claiming is a problem, isn't.

| Fair enough, it is the framerate but the 24Hz is a lower
| limit one doesn't want to cause, it is only due to the
| format of the video content. There is no need to cause
| this state.

|>The motion picture industry does not seem to agree with you.

Untrue, their choice has no bearing except a lower limit,
not an upper limit.

|>Of course
|>a motion picture projector would produce flicker if run at 24 shutter
|>openings per second. They figured out ages ago that by opening the
|>shutter 48 or 72 times a second, timed so that the openings were of the
|>same timing between them, where 2 or 3 openings would have the same
|>frame, that this eliminated the flicker (even though it did not change
|>the total accumulated time the shutter was open) for most eyes.

|>Were it the case that motion that changed only every 1/24 of second
|>were annoying, too, they would have changed the system to a higher frame
|>rate. They did for special cases (action sports), but not for the
|>average movie.

These things take time, nothing starts out as good as it
ends up in its final state. For example we could say if HD
resolution mattered then they'd have just done that in the
first place too on all distribution content.

People can in fact see interruption to fluid motion at
24FPS. IMO, the threshold is a little closer to 35FPS but
of course will depend on one's visual acuity and alertness.

|>What the TV world refers to as frame rate is what the computer world
|>refers to as vertical frequency. A "1080p24" HD TV signal has a frame
|>rate of 24 frames per second (or 1000/1001 of that rate if locked to
|>legacy NTSC reference clock timing), and that is 24 Hz or 23.976 Hz.
|
| This doesn't matter, it is only a lower limit. Slowing down
| one end to the slowest the other would ever require is a
| backwards concept. On a PC (thus using LCD PC monitor),
| both ends can support faster than that.

|>Making a video card that has a faster pixel clock costs more
|>money.

Wrong. Today the rate is already high enough and it would
cost more money to make an additional, special downgraded
version that could only meet your antiquated limit. Instead
the most cost effective is to make video cards with large
market appeal and capable of multiple purposes.

Even the cheapest integrated video today has no problem
outputting 60Hz at HD resolution. It doesn't get any
cheaper than that.


| Never said they did, but one could call the jerkiness of
| 24Hz flicker-like as well. 24 FPS is visually noticeable on
| an LCD.

|>Tell me which LCD monitor you have seen 24 fps video displayed on?
|>Manufacturer and model, please.

You don't seem to understand. Every single LCD monitor on
earth. The factor is not the monitor; it is that the subject
watching can discern breaks in movement at a rate as low as
24 FPS. Maybe your eyes are bad and you can't see it, but
again it would be a matter of degrading something to meet a
bare minimum threshold instead of keeping it at current
quality and striving for even better.


|>Some people _need_ high frame rates for legitimate purposes, such as
|>gaming, and watching high action video.
|
| Most people will benefit from more than 24FPS in typical
| motion video uses. Most people don't have the problem you
| suggest and attempt to solve by lowering the rate to 24.

|>They can benefit by less costly hardware, whether older or newer, by running
|>the video at 24 fps when their type of use of the computer is compatible with
|>that slow frame rate.

<sigh>

I can see you still don't get it. That you have some
ancient Matrox card incapable of current performance levels
from even the cheapest hardware made in the last few
generations/years, is no support for your argument. Such an
ancient system that video card would be found in has many
limitations besides what the video card can output including
processing performance, no DVD drive, ancient OS that may not
even be able to run the modern software for playback, and
would not typically have a modern high resolution LCD mated
with it for use.

Randomly selecting any cheap modern system there is no
problem doing what you suggest is problematic. Try it
sometime.
 
| On 3 Oct 2007 00:50:40 GMT, (e-mail address removed) wrote:
|
|>| On 2 Oct 2007 12:54:39 GMT, (e-mail address removed) wrote:
|>|
|>|
|>|>| Of course lower is technically possible but not a practical
|>|>| design goal for a PC video card. Thus, video cards really
|>|>| don't go lower because you'd have to change the default
|>|>| behavior including the driver settings, which nobody in
|>|>| their right mind would do to output from a PC to an LCD.
|>|>
|>|>Why not? If the video card has an upper limit on pixel clock AND you
|>|>want a larger geometry, then slowing the horizontal (to get more pixels
|>|>per line) and the vertical (to get more lines per frame) is simply going
|>|>to be what you do.
|>|
|>| No you don't _want_ to do that, because there's no gain in
|>| doing it, only degradation in visual quality. This is until
|>| the data link bandwidth is exceeded, at which point a second
|>| data link is used. It is used for a reason - to preserve
|>| visual quality and/or allow for it, instead of degrading it
|>| with your proposed solution.
|>
|>There is no second data link. A card with a second data link would cost
|>more money. The geometry can be achieved with a lower frame rate with
|>the same video card, with no degradation in quality for a great many
|>uses of graphical displays, including most business uses.
|
| Again, the goal is not to use old junk then degrade
| standards to make it work, just get a normal video card like
| anyone else. Even integrated video has no problem
| outputting 60Hz at HD resolution.

You are continuing to make assumptions. I do not have any such goal
to use "old junk". I use what works with all software.

Define "a normal video card". One that costs $100?

Sure, outputting at 60Hz is possible at some resolutions. But it is not
possible on all video cards at all resolutions. Video has been a difficult
problem due to this issue for years. LCDs can now solve that problem, but
only if they accept low frame rate video.

And you continue to ignore genuine 24 fps high definition video originating
in motion picture film production, which is another source of 24 fps, or in
some cases 23.976 fps, video.

The past problems have all been expensive to solve. For example a scan
doubler would have solved the video problems in the past for CRT displays.
But a scan doubler is very expensive so it is not a practical solution.

But what is NOW practical ... and very inexpensive ... is to extend the
range of scan clock on LCD. Maybe the only reason it isn't done is because
the engineers or product managers of LCDs are as closed-minded as you are
about the realities of video? Or maybe they (and you?) have a vested
interest in selling high end video cards?


|>The motion picture industry does not seem to agree with you.
|
| Untrue, their choice has no bearing except a lower limit,
| not an upper limit.

This is all about the lower limit. Maybe your position on this is because
you are getting things mixed up? Or maybe it is because you think that
what I want to do is not just lower the lower limit to at least as low as
23.976 Hz/fps, but to also lower the upper limit? No. I do NOT want to
lower the upper limit at all. If anything, I think it should be raised,
at least if it is not already capable of 120 Hz.

All that is needed in an LCD to make this work is to extend the range of
the analog to digital sampling clock. Frequency synthesizer circuits,
especially those that produce pulse or square wave output, can easily go
all the way down to near zero by just putting in a larger number in the
divider register. To extend these circuits for LCD purposes, they might
need to have 1 or 2 more bits added to the register size (basically, just
use an oscillator chip that has the wider range). Then don't impose any
such limit in the software of the display.


|>Of course
|>a motion picture projector would produce flicker if run at 24 shutter
|>openings per second. They figured out ages ago that by opening the
|>shutter 48 or 72 times a second, timed so that the openings were of the
|>same timing between them, where 2 or 3 openings would have the same
|>frame, that this eliminated the flicker (even though it did not change
|>the total accumulated time the shutter was open) for most eyes.
|>
|>Were it the case that motion that changed only every 1/24 of second
|>were annoying, too, they would have changed the system to a higher frame
|>rate. They did for special cases (action sports), but not for the
|>average movie.
|
| These things take time, nothing starts out as good as it
| ends up in its final state. For example we could say if HD
| resolution mattered then they'd have just done that in the
| first place too on all distribution content.

Things that are very expensive to do will not get done as readily. It was
well known that high definition was better as early as the 1950's when it
was first experimented with for military use. It was even deployed in the
Vietnam war. It just wasn't practical for widespread use for many reasons,
including very high prices and extreme spectral bandwidth.

So your argument is not applicable, because you are referring to something
that at the time faced very high cost resistance.


| People can in fact see interruption to fluid motion at
| 24FPS. IMO, the threshold is a little closer to 35FPS but
| of course will depend on one's visual acuity and alertness.

I do not disagree with the detection of fluid motion at 24 fps. I might
argue that the upper limit is even higher than 35 fps (if that is what you
meant), but that is another topic. But what is usually displayed on office
desktop screens is not in the "fluid motion" category. The ability
to display higher resolution is far more important than the ability to have
fluid non-jerky motion. Sure, both would be nice. But office environments
don't justify spending the extra to get both of those features since the
motion aspect is not nearly as important.

Ever wonder why widescreen computer displays are 16:10 instead of 16:9 as
has been decided for broadcast and other video programming content? The
answer is that it is the correct aspect ratio to fit TWO pages of text in
standard 8.5x11 or A4 paper sizes, side by side. But it takes really high
resolutions like 2560x1600 to make them reasonably readable.
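
The aspect arithmetic is easy to check:

    # How close each screen shape comes to two portrait pages side by side.
    screens = {"16:9": 16 / 9, "16:10": 16 / 10}
    two_pages = {"2x US letter (17x11 in)": 17 / 11,
                 "2x A4 (420x297 mm)": 420 / 297}

    for layout, ratio in two_pages.items():
        best = min(screens, key=lambda name: abs(screens[name] - ratio))
        print(f"{layout}: {ratio:.3f} -> closest to {best}")

Both two-page layouts (1.545 and 1.414) sit closer to 16:10 than to 16:9.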


|>|>What the TV world refers to as frame rate is what the computer world
|>|>refers to as vertical frequency. A "1080p24" HD TV signal has a frame
|>|>rate of 24 frames per second (or 1000/1001 of that rate if locked to
|>|>legacy NTSC reference clock timing), and that is 24 Hz or 23.976 Hz.
|>|
|>| This doesn't matter, it is only a lower limit. Slowing down
|>| one end to the slowest the other would ever require is a
|>| backwards concept. On a PC (thus using LCD PC monitor),
|>| both ends can support faster than that.
|>
|>Making a video card that has a faster pixel clock rate costs more
|>money.
|
| Wrong. Today the rate is already high enough and it would
| cost more money to make an additional, special downgraded
| version that could only meet your antiquated limit. Instead
| the most cost effective is to make video cards with large
| market appeal and capable of multiple purposes.

There are still low-end and high-end video cards on the market, where
the low-end ones still have clock speed limits. Why don't you go speak
to their engineers and ask how much it would cost to increase
the clock frequency to what the high-end cards have. It would
surely not cost as much as the high-end card (because we're leaving out
the 3D texture animation features and other motion stuff not needed in
an office environment).


| Even the cheapest integrated video today has no problem
| outputting 60Hz at HD resolution. It doesn't get any
| cheaper than that.

Identify ONE such card, which has working driver software for all major
operating systems, and open specifications to validate the software or
correct it where applicable (e.g. in open source systems). The most
modern card I know of that can be universally used is the Matrox G550.


|>| Never said they did, but one could call the jerkiness of
| 24Hz flicker-like as well. 24 FPS is visually noticeable on
|>| an LCD.
|>
|>Tell me which LCD monitor you have seen 24 fps video displayed on?
|>Manufacturer and model, please.
|
| You don't seem to understand. Every single LCD monitor on
| earth. The factor is not the monitor; it is that the subject
| watching can discern breaks in movement at a rate as low as
| 24 FPS. Maybe your eyes are bad and you can't see it, but
| again it would be a matter of degrading something to meet a
| bare minimum threshold instead of keeping it at current
| quality and striving for even better.

You don't seem to understand. This is not about movement. Sure, there
are uses for monitors where movement is a factor. But there are plenty
where it is not (office use is the big example).

I can see the 24 fps effect. It is NOT annoying to me, as I watch many
movies. Hint: even if you upconvert the movie with "3:2 pulldown" as is done
to show 24 fps films on NTSC or 60 fps HD video, that jerkiness in the
motion is still present.

All movies shot at 24 fps will look NO BETTER at 60 Hz than at 24 Hz on
an LCD monitor. In fact, they may even look a tad bit WORSE because of
the fact that every other frame lasts a bit longer in time than the frames
between them (at 60 Hz, one frame is displayed 3 times, then the next is
displayed 2 times, then 3, then 2, and so on). You could, of course, up
the 24 fps movie to 48 Hz or 72 Hz to eliminate that effect. But you
could just as easily leave it at 24 fps and display it on an LCD since
LCD eliminates the flicker issue just as if you had a scan converter.
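
The cadence that paragraph describes can be derived directly:

    # 3:2 pulldown: how many 60 Hz refreshes each 24 fps film frame gets.

    def cadence(film_fps=24, display_hz=60, frames=8):
        step = display_hz / film_fps              # 2.5 refreshes per frame
        edge = lambda k: int(k * step + 0.5)      # nearest refresh boundary
        return [edge(k + 1) - edge(k) for k in range(frames)]

    print(cadence())                 # [3, 2, 3, 2, 3, 2, 3, 2]
    print(sum(cadence(frames=24)))   # 60 refreshes carry 24 film frames

The uneven 3-then-2 hold times are exactly the unevenness described above;
at 24, 48, or 72 Hz every film frame would get the same count.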


|>|>Some people _need_ high frame rates for legitimate purposes, such as
|>|>gaming, and watching high action video.
|>|
|>| Most people will benefit from more than 24FPS in typical
|>| motion video uses. Most people don't have the problem you
|>| suggest and attempt to solve by lowering the rate to 24.
|>
|>They can benefit by less costly hardware, whether older or newer, by running
|>the video at 24 fps when their type of use of the computer is compatible with
|>that slow frame rate.
|
| <sigh>
|
| I can see you still don't get it. That you have some
| ancient Matrox card incapable of current performance levels
| from even the cheapest hardware made in the last few
| generations/years, is no support for your argument. Such an
| ancient system that video card would be found in has many
| limitations besides what the video card can output including
| processing performance, no DVD drive, ancient OS that may not
| even be able to run the modern software for playback, and
| would not typically have a modern high resolution LCD mated
| with it for use.
|
| Randomly selecting any cheap modern system there is no
| problem doing what you suggest is problematic. Try it
| sometime.

I can see you still don't get it. It's about making things work for what
is being used ... not for what _you_ happen to use video cards for (such
as maybe watching movies shot at a wasteful 60 film frames per second).

Show me ONE video card you think I should use that universally works in
all software and systems. Hint: if it has always had open programming
specifications, it will. Those that release their specs today will
qualify in about a year or two.

This is the system:

Tyan S2927A2NRF mainboard (2x Gb ether, 6 SATA-II, on board)
2x AMD Opteron (model 2222) 3.0 GHz dual-core CPU
4GB DDR-2 667 MHz ECC RAM
Matrox Millennium G550 PCIe video
2x Seagate 750GB SATA hard drives
Gentoo Linux with 2.6.22.9 kernel

But let's also talk about LOW PRICE systems that still need to be used in
an office environment. *IF* LCD monitors would do the video scanning at a
_lower_ frequency (again ... this is not eliminating the ability to scan
at a higher frequency as well, or even by default), then a HIGH
RESOLUTION office desk system can be achieved at a very low cost. Because
of that one limitation in LCD monitors, that does not cost more than about
three dollars to fix (and this would only need to be done in a few models so
_you_ would never have to pay for it), we can lower the total cost of an
office computer by a couple HUNDRED dollars and have higher resolution.

But I guess you'd rather they go ahead and get a high end computer system
with more costly video and software ... just so the secretaries won't have
to be annoyed watching jerky 24 fps when they goof off in the afternoon
watching the soap operas.
 
|>You are continuing to make assumptions. I do not have any such goal
|>to use "old junk". I use what works with all software.

yes you do insist on only considering old junk. Any video
card made by the major 2 (nVidia and ATI) in the last
several years can do 60Hz HD resolution video.


Define "a normal video card". One that costs $100?

One that costs $0 because it's integrated into a
motherboard. One that costs $0 because it was old AGP or
PCI tech and being thrown away due to being incompatible
with a new system. One that costs $10 on ebay. One that
costs $20 at a hardware surplus 'site, and certainly any of
the current generation cards you'd buy at any PC hardware
seller's website.


|>Sure, outputting at 60Hz is possible at some resolutions. But it is not
|>possible on all video cards at all resolutions.

I don't think you've ever tried.


This discussion is crazy. What you are implying is a
problem/won't work, is being done by plenty of people every
day. They aren't crying out for sub-60Hz rate because it
works already.
 
| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|
|
|>
|>You are continuing to make assumptions. I do not have any such goal
|>to use "old junk". I use what works with all software.
|
| yes you do insist on only considering old junk. Any video
| card made by the major 2 (nVidia and ATI) in the last
| several years can do 60Hz HD resolution video.

But those are not universally usable. ATI has begun to take some steps
to make _some_ of their cards _eventually_ more widely usable.

Try again.


|>
|>Define "a normal video card". One that costs $100?
|
| One that costs $0 because it's integrated into a
| motherboard. One that costs $0 because it was old AGP or
| PCI tech and being thrown away due to being incompatible
| with a new system. One that costs $10 on ebay. One that
| costs $20 at a hardware surplus 'site, and certainly any of
| the current generation cards you'd buy at any PC hardware
| seller's website.

Well that seems to include what you claim as "old junk".

If you find one that works universally, let me know. One that is limited
to a tiny set of drivers for just one class of usage is not universal.


|>Sure, outputting at 60Hz is possible at some resolutions. But it is not
|>possible on all video cards at all resolutions.
|
| I don't think you've ever tried.

I have not tried this on all video cards.

Any video card has limits. The limits keep going up (and the limits on
where the cards can be used keeps going down). There is a maximum vertical
geometry. There is a maximum horizontal geometry. A certain pixel clock
is needed to use ONE dimension to its extreme. An even higher pixel clock
is needed to use BOTH dimensions to their extreme at the same time to keep
the same frame rate. The alternative is to accept the lower frame rate. That
was never an option CRT technology could choose (without the added cost of
adding on an overscanning buffer). However, it is a very low cost option
LCD technology could choose.
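
To put numbers on pushing one dimension to its extreme versus both at once
(60 Hz, blanking ignored):

    # Pixel clock needed when stretching one axis versus both at once.
    def clock_mhz(w, h, hz=60):
        return w * h * hz / 1e6   # blanking ignored

    print("1280x1024 @ 60 Hz:", clock_mhz(1280, 1024))       # ~78.6 MHz
    print("2560x1024 @ 60 Hz:", clock_mhz(2560, 1024))       # ~157 MHz
    print("1280x2048 @ 60 Hz:", clock_mhz(1280, 2048))       # ~157 MHz
    print("2560x2048 @ 60 Hz:", clock_mhz(2560, 2048))       # ~315 MHz
    print("2560x2048 @ 15 Hz:", clock_mhz(2560, 2048, 15))   # ~78.6 MHz

Doubling both axes quadruples the required clock; quartering the frame rate
brings it right back to where the card started.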


| This discussion is crazy. What you are implying is a
| problem/won't work, is being done by plenty of people every
| day. They aren't crying out for sub-60Hz rate because it
| works already.

Actually, people are complaining at a higher level. They don't know the
details of the issue they have. They merely complain about things like Linux
not working, when in fact the real problem is that the drivers for their
video card were developed by incompetent (cheap, hired at lowest bid price)
programmers. Some of them do find out that some other video cards (not ATI
and nVidia) do work, but don't have the extreme pixel clock rates that
allow the extended geometry AND the 50 Hz frame rate at the same time.

If you want to figure out how to get things to work, be my guest. I have
identified the low cost option, which is to find an LCD monitor that does
handle 24 fps. Given that this video card mess is NOT the ONLY source of
24 fps video (TV itself is a source of 23.976 or 24 Hz at 1280x720 or at
1920x1080 for video directly encoded from motion picture films at higher
definition), there is good cause to have such LCD units somewhere.

Maybe the frequency generator chip in the monitor can go lower. Maybe it
is just software that is deciding "the vertical is below 50 Hz, so shut off
the video". Maybe they are just using leftover software that previous was
running a CRT? More plausibly, they are trying to "protect me" from having
a video mode that won't work if I switch back to a CRT (which I won't if I
get LCD to work well, and even with the minimal geometry, it does OK).
 
| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|
|
|>
|>You are continuing to make assumptions. I do not have any such goal
|>to use "old junk". I use what works with all software.
|
| yes you do insist on only considering old junk. Any video
| card made by the major 2 (nVidia and ATI) in the last
| several years can do 60Hz HD resolution video.

|>But those are not universally usable. ATI has begun to take some steps
|>to make _some_ of their cards _eventually_ more widely usable.
|>
|>Try again.

Nonsense. World plus dog manages to get the job done with
off the shelf hardware when the requirement is merely
outputting 60Hz. Nobody needs to find a 24Hz refresh rate
monitor. Maybe they need a video card made in this century,
but so do they also need a computer made in the past 10
years or so if they're wanting to use it for HD content.
 
| On 4 Oct 2007 17:41:46 GMT, (e-mail address removed) wrote:
|
|>| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|>|
|>|
|>|>
|>|>You are continuing to make assumptions. I do not have any such goal
|>|>to use "old junk". I use what works with all software.
|>|
|>| yes you do insist on only considering old junk. Any video
|>| card made by the major 2 (nVidia and ATI) in the last
|>| several years can do 60Hz HD resolution video.
|>
|>But those are not universally usable. ATI has begun to take some steps
|>to make _some_ of their cards _eventually_ more widely usable.
|>
|>Try again.
|>
|
| Nonsense. World plus dog manages to get the job done with
| off the shelf hardware when the requirement is merely
| outputting 60Hz. Nobody needs to find a 24Hz refresh rate
| monitor. Maybe they need a video card made in this century,
| but so do they also need a computer made in the past 10
| years or so if they're wanting to use it for HD content.

Nobody knows the solution exists. Spend $3 more on a better LCD design
and save as much as $200 on computer hardware. This would primarily be
for business office use, as they are where most computer usage that can
get by with 24 fps exists.

But I started thinking about this out of the box. We're doing it all wrong.

LCD effectively has a form of persistent memory that can be written to, to
change what is displayed. Thus there really is no reason, anymore, to be
sending the same picture over and over and over. What I mean is that the
whole concept of raster scan really doesn't mean anything to LCD other than
what the electronics design chooses to have it mean.

What should be done is to have a new video protocol that transmits blocks
of video that identify the portion of the screen to update, and all the
pixel values that go there. Then the video controller card can send the
image updates however it likes. It could still update everything in a
legacy raster scan. But the advantage will be realized by being able to
update only the portions that actually change, when they change. So when
you open a new window on your desktop, the act of it opening would send a
block identifying the reference point of the window, its size, and its
contents. The video card would detect this based on what was written to
video memory as it happens.

This would require a digital link; analog would be no good for it. And we
already have a protocol that does this (over a network) that could be used.
It is called VNC. A monitor could then be connected by DVI or equivalent
for the ultimate in speed, or via a network through CAT5 to any VNC based
computer or KVM switch. Other options include USB or FireWire as the link
layer (which could also carry keyboard and mouse, letting them plug into
the monitor instead of the computer, or use infra-red).
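
A minimal sketch of one such update block, loosely modeled on VNC's
framebuffer-update rectangle (the field layout here is made up for
illustration; it is not the real RFB wire format):

    import struct

    def encode_update(x, y, w, h, pixels):
        # One dirty rectangle: position/size header plus raw 24-bit RGB.
        assert len(pixels) == w * h * 3
        return struct.pack(">HHHH", x, y, w, h) + pixels

    def decode_update(blob):
        x, y, w, h = struct.unpack(">HHHH", blob[:8])
        return (x, y, w, h), blob[8:]

    # Repaint only a 64x32 region instead of rescanning the whole frame:
    message = encode_update(100, 200, 64, 32, bytes(64 * 32 * 3))
    print(decode_update(message)[0])   # (100, 200, 64, 32)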
 
| On 4 Oct 2007 17:41:46 GMT, (e-mail address removed) wrote:
|
|>| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|>|
|>|
|>|>
|>|>You are continuing to make assumptions. I do not have any such goal
|>|>to use "old junk". I use what works with all software.
|>|
|>| yes you do insist on only considering old junk. Any video
|>| card made by the major 2 (nVidia and ATI) in the last
|>| several years can do 60Hz HD resolution video.
|>
|>But those are not universally usable. ATI has begun to take some steps
|>to make _some_ of their cards _eventually_ more widely usable.
|>
|>Try again.
|>
|
| Nonsense. World plus dog manages to get the job done with
| off the shelf hardware when the requirement is merely
| outputting 60Hz. Nobody needs to find a 24Hz refresh rate
| monitor. Maybe they need a video card made in this century,
| but so do they also need a computer made in the past 10
| years or so if they're wanting to use it for HD content.

|>Nobody knows the solution exists. Spend $3 more on a better LCD design
|>and save as much as $200 on computer hardware. This would primarily be
|>for business office use, as they are where most computer usage that can
|>get by with 24 fps exists.


Where is the $200 savings? Practically free integrated
video can already do this, and it is not as though a really
old system otherwise has the processing muscle to handle HD
decompression.


|>But I started thinking about this out of the box. We're doing it all wrong.
|>
|>LCD effectively has a form of persistent memory that can be written to, to
|>change what is displayed. Thus there really is no reason, anymore, to be
|>sending the same picture over and over and over. What I mean is that the
|>whole concept of raster scan really doesn't mean anything to LCD other than
|>what the electronics design chooses to have it mean.

While it is true the image won't have to be resent so long
as it remains static, these parts are most cost effectively
made towards higher, more versatile standards. 60Hz could
display 24FPS fine but not the other way around, and since
any practical video card can also do this, I don't see why
you make it out to be something that needs change, when it
already works with even the lowest end components made
today, while anything changed would be yet again a new part
needed only to solve a problem that no longer exists.


|>What should be done is to have a new video protocol that transmits blocks
|>of video that identify the portion of the screen to update, and all the
|>pixel values that go there. Then the video controller card can send the
|>image updates however it likes. It could still update everything in a
|>legacy raster scan. But the advantage will be realized by being able to
|>update only the portions that actually change, when they change. So when
|>you open a new window on your desktop, the act of it opening would send a
|>block identifying the reference point of the window, its size, and its
|>contents. The video card would detect this based on what was written to
|>video memory as it happens.

Why change what already works?
I encourage you to hook a monitor capable of the resolution
up to any lowest-end modern video card. Try it. Plenty of
people do watch HD this way.
 
| On 4 Oct 2007 17:41:46 GMT, (e-mail address removed) wrote:
|
|>| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|>|
|>|
|>|>
|>|>You are continuing to make assumptions. I do not have any such goal
|>|>to use "old junk". I use what works with all software.
|>|
|>| yes you do insist on only considering old junk. Any video
|>| card made by the major 2 (nVidia and ATI) in the last
|>| several years can do 60Hz HD resolution video.
|>
|>But those are not universally usable. ATI has begun to take some steps
|>to make _some_ of their cards _eventually_ more widely usable.
|>
|>Try again.
|>
|
| Nonsense. World plus dog manages to get the job done with
| off the shelf hardware when the requirement is merely
| outputting 60Hz. Nobody needs to find a 24Hz refresh rate
| monitor. Maybe they need a video card made in this century,
| but so do they also need a computer made in the past 10
| years or so if they're wanting to use it for HD content.

|>Nobody knows the solution exists. Spend $3 more on a better LCD design
|>and save as much as $200 on computer hardware. This would primarily be
|>for business office use, as they are where most computer usage that can
|>get by with 24 fps exists.

But I started thinking about this out of the box. We're doing it all wrong.

LCD effectively has a form of persistent memory that can be written to, to
change what is displayed. Thus there really is no reason, anymore, to be
sending the same picture over and over and over. What I mean is that the
whole concept of raster scan really doesn't mean anything to LCD other than
what the electronics design chooses to have it mean.

What should be done is to have a new video protocol that transmits blocks
of video that identify the portion of the screen to update, and all the
pixel values that go there.

Sprites, anyone? ;-)

I think what you are proposing is a smart display, one that performs
many of the operations now done in software (ie, CPU) and video hardware
(ie, GPU). I think it's a good concept. It would really, really speed up
games. Would make humungous screens possible. 160,000x90,000? Wowser!

[...]
 
| On 5 Oct 2007 04:25:58 GMT, (e-mail address removed) wrote:
|
|>| On 4 Oct 2007 17:41:46 GMT, (e-mail address removed) wrote:
|>|
|>|>| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|>|>|
|>|>|
|>|>|>
|>|>|>You are continuing to make assumptions. I do not have any such goal
|>|>|>to use "old junk". I use what works with all software.
|>|>|
|>|>| yes you do insist on only considering old junk. Any video
|>|>| card made by the major 2 (nVidia and ATI) in the last
|>|>| several years can do 60Hz HD resolution video.
|>|>
|>|>But those are not universally usable. ATI has begun to take some steps
|>|>to make _some_ of their cards _eventually_ more widely usable.
|>|>
|>|>Try again.
|>|>
|>|
|>| Nonsense. World plus dog manages to get the job done with
|>| off the shelf hardware when the requirement is merely
|>| outputting 60Hz. Nobody needs to find a 24Hz refresh rate
|>| monitor. Maybe they need a video card made in this century,
|>| but so do they also need a computer made in the past 10
|>| years or so if they're wanting to use it for HD content.
|>
|>Nobody knows the solution exists. Spend $3 more on a better LCD design
|>and save as much as $200 on computer hardware. This would primarily be
|>for business office use, as they are where most computer usage that can
|>get by with 24 fps exists.
|
|
| Where is the $200 savings? Practically free integrated
| video can already do this, and it is not as though a really
| old system otherwise has the processing muscle to handle HD
| decompression.

I should have said save as much as $200 on computer hardware AND software.

HD decompression is not involved in an office desktop environment.
You're still thinking in terms of a home media center.


|>But I started thinking about this out of the box. We're doing it all wrong.
|>
|>LCD effectively has a form of persistent memory that can be written to, to
|>change what is displayed. Thus there really is no reason, anymore, to be
|>sending the same picture over and over and over. What I mean is that the
|>whole concept of raster scan really doesn't mean anything to LCD other than
|>what the electronics design chooses to have it mean.
|
| While it is true the image won't have to be resent so long
| as it remains static, these parts are most cost effectively
| made toward higher, more versatile standards. 60Hz could
| display 24FPS fine but not the other way around, and since
| any practical video card can also do this, I don't see why
| you make it out to be something that needs change, when it
| already works with even the lowest end components made
| today, while anything changed would be yet again a new part
| needed only to solve a problem that no longer exists.

Actually, 60 Hz degrades 24 fps video. Because 24 does not divide evenly
into 60, each film frame is alternately shown for 3 and then 2 refresh
cycles (3:2 pulldown), which makes steady motion stutter. Recording 24 fps
in a 24 fps format, or transmitting it as such over ATSC, renders motion
more correctly. For CRTs, which must upconvert 24 fps, doing so to 48 or
72 Hz instead of 60 Hz would avoid the problem, since both are integer
multiples of 24.

http://en.wikipedia.org/wiki/Telecine#Telecine_judder
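
To make the cadence problem concrete, here is a minimal sketch in Python
(the helper name is my own invention) showing how each film frame maps
onto refresh cycles at different rates:

def pulldown_pattern(film_fps, refresh_hz, frames=6):
    # How many refresh cycles each successive film frame occupies.
    # Integer math: frame i ends at refresh cycle floor(i * hz / fps).
    pattern = []
    shown = 0
    for i in range(1, frames + 1):
        end = (i * refresh_hz) // film_fps
        pattern.append(end - shown)
        shown = end
    return pattern

print(pulldown_pattern(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven cadence (judder)
print(pulldown_pattern(24, 72))  # [3, 3, 3, 3, 3, 3] -> even cadence (smooth)
print(pulldown_pattern(24, 48))  # [2, 2, 2, 2, 2, 2] -> even cadence (smooth)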


|>What should be done is to have a new video protocol that transmits blocks
|>of video that identify the portion of the screen to update, and all the
|>pixel values that go there. Then the video controller card can send the
|>image updates however it likes. It could still update everything in a
|>legacy raster scan. But the advantage will be realized by being able to
|>update only the portions that actually change, when they change. So when
|>you open a new window on your desktop, the act of it opening would send a
|>block identifying the reference point of the window, its size, and its
|>contents. The video card would detect this based on what was written to
|>video memory as it happens.
|
| Why change what already works?

It's a choice between progress and stagnation.


| I encourage you to hook a monitor capable of the resolution
| up to any lowest-end modern video card. Try it. Plenty of
| people do watch HD this way.

Watching HD TV programs is not my interest with regard to the computer.
I speak of 24 fps movies because that is an issue that should _also_ be
addressed.

Let me know of a particular lowest-end modern video card that works on all
software, if you know of one. I've not yet found one. The last one I have
seen is the Matrox G550 (not low-end, but most certainly works universally).
 
| (e-mail address removed) wrote:
|> | On 4 Oct 2007 17:41:46 GMT, (e-mail address removed) wrote:
|> |
|> |>| On 4 Oct 2007 13:42:47 GMT, (e-mail address removed) wrote:
|> |>|
|> |>|
|> |>|>
|> |>|>You are continuing to make assumptions. I do not have any such goal
|> |>|>to use "old junk". I use what works with all software.
|> |>|
|> |>| yes you do insist on only considering old junk. Any video
|> |>| card made by the major 2 (nVidia and ATI) in the last
|> |>| several years can do 60Hz HD resolution video.
|> |>
|> |>But those are not universally usable. ATI has begun to take some steps
|> |>to make _some_ of their cards _eventually_ more widely usable.
|> |>
|> |>Try again.
|> |>
|> |
|> | Nonsense. World plus dog manages to get the job done with
|> | off the shelf hardware when the requirement is merely
|> | outputting 60Hz. Nobody needs to find a 24Hz refresh rate
|> | monitor. Maybe they need a video card made in this century,
|> | but so do they also need a computer made in the past 10
|> | years or so if they're wanting to use it for HD content.
|>
|> Nobody knows the solution exists. Spend $3 more on a better LCD design
|> and save as much as $200 on computer hardware. This would primarily be
|> for business office use, as they are where most computer usage that can
|> get by with 24 fps exists.
|>
|> But I started thinking about this out of the box. We're doing it all wrong.
|>
|> LCD effectively has a form of persistent memory that can be written to, to
|> change what is displayed. Thus there really is no reason, anymore, to be
|> sending the same picture over and over and over. What I mean is that the
|> whole concept of raster scan really doesn't mean anything to LCD other than
|> what the electronics design chooses to have it mean.
|>
|> What should be done is to have a new video protocol that transmits blocks
|> of video that identify the portion of the screen to update, and all the
|> pixel values that go there.
|
| Sprites, anyone? ;-)
|
| I think what you are proposing is a smart display, one that performs
| many of the operations now done in software (ie, CPU) and video hardware
| (ie, GPU). I think it's a good concept. It would really, really speed up
| games. Would make humungous screens possible. 160,000x90,000? Wowser!

The thing is, it's not much different from what the video card does now.
The memory just moves from the card to the display. Of course, things like
3D and texture rendering will either still need to be done on the video
card and the results sent as a block to the display, or the display itself
will have to have a powerful GPU, with extensions to the protocol to send
the rendering requests.
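
As a rough sketch of what such a protocol might look like on the wire (the
packet layout below is invented for illustration; remote-desktop protocols
like VNC already send damaged rectangles in a similar spirit):

import struct

# Hypothetical partial-update message: a rectangle header followed by raw
# 24-bit RGB pixels for just that region. A legacy full-screen refresh is
# then simply one update covering the whole panel.
HEADER = struct.Struct(">HHHH")  # x, y, width, height as big-endian uint16

def encode_update(x, y, width, height, rgb_bytes):
    # Pack one damage rectangle plus its pixel payload.
    assert len(rgb_bytes) == width * height * 3
    return HEADER.pack(x, y, width, height) + rgb_bytes

def decode_update(packet):
    # The display unpacks the rectangle and writes it into its own memory.
    x, y, w, h = HEADER.unpack_from(packet, 0)
    pixels = packet[HEADER.size:HEADER.size + w * h * 3]
    return (x, y, w, h), pixels

# Opening a 640x480 window sends about 0.9 MB once, instead of resending a
# 1920x1080 frame (about 6 MB) sixty times a second while nothing changes.
packet = encode_update(100, 100, 640, 480, bytes(640 * 480 * 3))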

I would think we'll top out on resolution for personal viewing somewhere
around 10368x4320 :-)
 
|> Where is the $200 savings? Practically free integrated
|> video can already do this, and it is not as though a really
|> old system otherwise has the processing muscle to handle HD
|> decompression.

| I should have said save as much as $200 on computer hardware AND software.

So what costs $200?

| HD decompression is not involved in an office desktop environment.
| You're still thinking in terms of a home media center.

What's the point then? If the office isn't doing any HD,
it'll be fine at 60Hz with integrated video or a low end video
card (unless there is a special need besides running at 60Hz
refresh).

However, if the office desktop needs to play back HD
content on a 60Hz LCD, it can do that as well.

| Actually, 60 Hz degrades 24 fps video.

Far less than the fact that it was 24FPS in the first place
instead of a higher framerate.

| Recording 24 fps in a 24 fps format,
| or transmitting it as such over ATSC, renders motion more correctly.

More correctly than what? We're talking about a PC, we
don't have these limitations.

| For CRTs, which must upconvert 24 fps, doing so to 48 or 72 Hz
| instead of 60 Hz would avoid the problem.
|
| http://en.wikipedia.org/wiki/Telecine#Telecine_judder

The problem is the 24FPS rate, not 2X that rate or higher.
If you can't perceive the 24FPS lag, it is unreasonable to
claim a shorter duration lag is then important.

|>|What should be done is to have a new video protocol that transmits blocks
|>|of video that identify the portion of the screen to update, and all the
|>|pixel values that go there. Then the video controller card can send the
|>|image updates however it likes. It could still update everything in a
|>|legacy raster scan. But the advantage will be realized by being able to
|>|update only the portions that actually change, when they change. So when
|>|you open a new window on your desktop, the act of it opening would send a
|>|block identifying the reference point of the window, its size, and its
|>|contents. The video card would detect this based on what was written to
|>|video memory as it happens.
|>
|> Why change what already works?

| It's a choice between progress and stagnation.

Yes, going to extra effort to reduce LCD monitor refresh
rates would be stagnation. Progress is moving higher than
24FPS.


|> I encourage you to hook a monitor capable of the resolution
|> up to any lowest-end modern video card. Try it. Plenty of
|> people do watch HD this way.

| Watching HD TV programs is not my interest with regard to the computer.
| I speak of 24 fps movies because that is an issue that should _also_ be
| addressed.
|
| Let me know of a particular lowest-end modern video card that works on all
| software, if you know of one. I've not yet found one. The last one I have
| seen is the Matrox G550 (not low-end, but most certainly works universally).


What are you talking about, "software"?
A video card doesn't need to support software, besides
having an OS driver. You could get lower CPU utilization
if the video card driver plus some software titles support
hardware overlays and MPEG-2, MPEG-4 Part 2, or MPEG-4
Part 10 acceleration, but that does not keep it from
"working on all software", unless you have a very strange
definition of working.

What is it you claim makes a G550 "work"?
 
| I think what you are proposing is a smart display, one that performs
| many of the operations now done in software (ie, CPU) and video hardware
| (ie, GPU). I think it's a good concept. It would really, really speed up
| games. Would make humungous screens possible. 160,000x90,000? Wowser!
|
| [...]


Processing by CPU and video card is much higher bandwidth
than outputting the finished frames; for example, video cards
often have roughly 5 to over 30 GB/s of memory bandwidth.
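
The rough arithmetic behind that comparison (ballpark figures, sketched in
Python):

# Link bandwidth needed just to output finished 1080p frames at 60 Hz,
# versus the on-card memory bandwidth quoted above.
width, height, bytes_per_pixel, refresh_hz = 1920, 1080, 3, 60
output_rate = width * height * bytes_per_pixel * refresh_hz
print(f"1080p @ 60 Hz output: {output_rate / 1e9:.2f} GB/s")  # ~0.37 GB/s
print(f"Headroom at 30 GB/s internal: {30e9 / output_rate:.0f}x")  # ~80x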
 
kony said:
|> I think what you are proposing is a smart display, one that performs
|> many of the operations now done in software (ie, CPU) and video hardware
|> (ie, GPU). I think it's a good concept. It would really, really speed up
|> games. Would make humungous screens possible. 160,000x90,000? Wowser!
|>
|> [...]
|
| Processing by CPU and video card is much higher bandwidth
| than outputting the finished frames; for example, video cards
| often have roughly 5 to over 30 GB/s of memory bandwidth.


I take it you mean that putting a GPU in the monitor doesn't promise
enough of an advantage to make it worthwhile. I agree that's so for
current HD standards and monitors. But suppose a 120 x 67.5 inch monitor
with a .25mm pixel pitch. That would be 12192 x 6858 resolution at 16:9.
Call it Ultra-HD. It would just fit at one end of my living room....

And I could fit a Cinemascope-proportioned screen (18242 x 6858) on the
long wall of the basement rec room....
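
For anyone checking the numbers, the arithmetic is just physical size over
pixel pitch (a quick sketch; the Cinemascope width in inches is back-derived
from the pixel count):

MM_PER_INCH = 25.4

def pixels(inches, pitch_mm):
    # Pixel count along one axis for a given physical size and pitch.
    return round(inches * MM_PER_INCH / pitch_mm)

print(pixels(120, 0.25), "x", pixels(67.5, 0.25))     # 12192 x 6858 (16:9)
print(pixels(179.55, 0.25), "x", pixels(67.5, 0.25))  # 18242 x 6858 (~2.66:1)

# Feeding 12192x6858 the classic way (every frame, 60 Hz, 24-bit) would
# take roughly 12192 * 6858 * 3 * 60 bytes/s, i.e. about 15 GB/s -- which
# is where the partial-update idea above would earn its keep.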

Sigh.
 