Observations on a UPS - follow-up to a previous post

  • Thread starter: Doc
In comp.sys.ibm.pc.hardware.misc, William Sommerwerck wrote:




Very much so, but mostly in the same sense that a glowing
piece of metal is a quantum device....

Arno

No, not at all in the same *sense*, but to the same *degree*. The
fluorescents use electron transitions in Hg to generate a line
spectrum, and then a fluorescent coating inside the bulb to convert the
lines in question into new lines and bands. This is in no way
comparable to black-body radiation, which is a continuum. This, of
course, is an *opinion* :-)

However, both are still quantum-mechanical devices at bottom. In fact,
it has been said that it was the attempt to solve the black-body
radiation problem that led Planck to the discovery of his constant: he
took the limit of something as some differential went to zero and it
didn't work. But he got the right answer when he set the differential
to a finite value, around 6.27E-27, IIRC.

Close (sort of): Google gives me "Planck's constant = 6.626068 × 10^-34
m^2 kg / s", so I left out the second 6 - but I am used to it in cgs,
rather than mks, so the exponent is correct. That *would* be more
believable if I had expressed the units, erg-sec, above :-)
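
For anyone checking the arithmetic, the two figures are the same constant in
different unit systems; a quick conversion (standard physics, nothing beyond
1 J = 10^7 erg assumed):

    % Planck's constant in mks (SI) and in cgs units.
    % Since 1 J = 10^7 erg, the exponent shifts from -34 to -27.
    h = 6.626 \times 10^{-34}\ \mathrm{J\,s}
      = 6.626 \times 10^{-34}\ \mathrm{kg\,m^2/s}
      = 6.626 \times 10^{-27}\ \mathrm{erg\,s}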
 
Virtually all CFLs are low power factor; high power factor (HPF) would add to
the cost, and people wouldn't buy them.

In the US, domestic electric meters measure true power, so the power factor
doesn't make any difference in the bill, though it does strain the
distribution system more.
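
To put numbers on the real-versus-apparent-power point, here is a minimal
Python sketch; the 15 W lamp and 0.55 power factor are assumed illustrative
values, not measurements:

    # A low-power-factor CFL draws more apparent power (VA) than real power (W).
    # The residential meter bills the watts; the house wiring and the utility's
    # distribution gear still have to carry the current implied by the VA.

    real_power_w = 15.0      # assumed real power of one CFL, watts
    power_factor = 0.55      # assumed power factor for a cheap low-PF CFL
    line_voltage = 120.0     # US nominal line voltage

    apparent_power_va = real_power_w / power_factor
    line_current_a = apparent_power_va / line_voltage

    print(f"Real power:     {real_power_w:.1f} W  (what the meter bills)")
    print(f"Apparent power: {apparent_power_va:.1f} VA (what the wiring carries)")
    print(f"Line current:   {line_current_a:.3f} A")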
 
kony said:
Many people in their daily use cannot see any lag or
ghosting from 19" and smaller LCD computer monitors.

If you can't actually see it, does it matter if it exists?
I can play 50 FPS video or games running at over 50 FPS on a
19" LCD computer monitor and not see any problems except the
obvious lack of contrast (though with CRTs I am spoiled in this
respect, having bought Diamondtron tube-based monitors as the
last few I used myself before switching to primarily LCD
usage).


I sure can; maybe my eyes are just better than average. There are those
"golden ear" audiophools I always thought were nuts, but maybe some of them
aren't as nutty as I thought. I've got a high-end 20" flat panel on my desk
at work, and it looks really good, but still not as good as the 22" flat
Trinitron CRT I have at home. The geometry is flawless, but the picture
doesn't look as smooth and clean as on the CRT; it looks more "digital".
 
James Sweet said:
I sure can; maybe my eyes are just better than average. There are those
"golden ear" audiophools I always thought were nuts, but maybe some of them
aren't as nutty as I thought. I've got a high-end 20" flat panel on my desk
at work, and it looks really good, but still not as good as the 22" flat
Trinitron CRT I have at home. The geometry is flawless, but the picture
doesn't look as smooth and clean as on the CRT; it looks more "digital".

I did not write "some LCD"; I wrote about the current
generation of 19" and smaller.

It doesn't matter if you see ghosting on 20"+ panels, for the
purpose of the discussion, which is whether smaller panels at
comparable resolutions exhibit it.

If we were talking about resolutions higher than native to a
19", then CRTs lose on another front, because their refresh
rate suffers and their pixel boundaries get so blurred that it
is no longer an accurate output.

Looking more "digital" is not necessarily a flaw. A video
card does not transmit an infinitely high-res, flawless
image; it transmits pixels. Accurately representing those
pixels is the monitor's job, not blurring them so they look
more lifelike.
 
James Sweet said:
I sure can; maybe my eyes are just better than average. There are those
"golden ear" audiophools I always thought were nuts, but maybe some of
them aren't as nutty as I thought. I've got a high-end 20" flat panel on
my desk at work, and it looks really good, but still not as good as the 22"
flat Trinitron CRT I have at home. The geometry is flawless, but the picture
doesn't look as smooth and clean as on the CRT; it looks more "digital".
Hi James, how goes it? Yes, that about says it. Perhaps it is just the
level of discernment, and it *is* just us, but that doesn't explain how my
wife thinks that the pictures are 'fuzzy' when anything is moving on them,
yet makes no such comment when watching our 34" CRT Tosh TV, or at the cinema
when it's projected film stock rather than a DLP video projector. She has no
technical axe to grind, as it were, and is interested in the picture only
for its entertainment value. Since I have had this high-end HP widescreen
LCD on the computer, which she also uses, she has made little comment other
than that it looks "nice", which is true of the typically stationary pictures
that are normally displayed on it. I have, however, heard her comment that
the pictures on my son's (equally high-end) HP LCD are "out of focus",
typically when he is playing a game. Being non-technical, "out of focus" is
the best description she can come up with for 'motion blur'.

Arfa
 
Rheostats? Triacs?

Something solid-state and cheap; probably triacs.
I'm not familiar with the current technology. (X-10 is
triac-controlled, I believe.)

Something like triacs, I'm sure.
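
For what it's worth, the dimming itself is plain phase control: the triac
fires partway through each half-cycle, and the later it fires, the less power
the lamp gets. A small Python sketch of the standard resistive-load formula
(illustrative only; the firing angles are arbitrary):

    import math

    def power_fraction(firing_angle_deg):
        """Fraction of full power a phase-controlled triac delivers to a
        resistive load when it fires at the given angle after each zero
        crossing (0 deg = full on, 180 deg = fully off)."""
        a = math.radians(firing_angle_deg)
        return 1 - a / math.pi + math.sin(2 * a) / (2 * math.pi)

    for deg in (0, 45, 90, 135, 180):
        print(f"{deg:3d} deg firing angle -> {power_fraction(deg):.2f} of full power")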

BTW, I've found that there is now a middle ground in lighting, now that DMX
hardware is so deadly cheap. I've paid as little as $39.95 (on sale) for a
quad dimmer pack that can handle the full load of a residential lighting
circuit. There's a lot to be said for low-voltage copper control wiring.
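
Low-voltage control really is simple at the protocol level: a DMX512 universe
is a zero start code followed by up to 512 channel bytes. A minimal Python
sketch of building such a frame (the 4-channel patch is assumed, and the
RS-485 transmission itself is left out):

    def build_dmx_frame(levels, channels=512):
        """Build a DMX512 data packet: start code 0x00 followed by one
        byte (0-255) per channel. 'levels' maps 1-based channel numbers
        to levels."""
        frame = bytearray([0x00]) + bytearray(channels)
        for channel, level in levels.items():
            if not 1 <= channel <= channels:
                raise ValueError(f"channel {channel} out of range")
            frame[channel] = max(0, min(255, level))
        return bytes(frame)

    # Example: a 4-channel dimmer pack patched at channels 1-4,
    # with channels 1 and 2 at half brightness and channel 4 full on.
    frame = build_dmx_frame({1: 128, 2: 128, 3: 0, 4: 255})
    print(len(frame), frame[:5].hex())
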
Actually, I didn't say that, but you'd expect it to be so, given that a
fluorescent lamp is a quantum device.
Exactly.
That's what I noted with the Home Despot lamps.

The extended warm-up can be a bit disconcerting. Some bulbs start out
pretty dim; they are lit OK, but well below full brightness.
It was startling at first to
see a fluorescent lamp come on faster than an incandescent.

Try a CFL on a really cold Michigan morning, say in an unheated garage. ;-(
 
While they start pretty much on a dime, they do get about ...
The extended warm-up can be a bit disconcerting. Some bulbs start out
pretty dim; they are lit OK, but well below full brightness.

My first CFLs were Philips, and they took "forever" to come to a reasonable
brightness, let alone full. But the Home Despot cheapies are quite bright
from the moment they're energized, and take only about a minute to reach
full brilliance.

Try a CFL on a really cold Michigan morning, say in an unheated
garage. ;-(

If I don't find a job soon, I might very well be living in an unheated
garage, come Christmas.
 
Recently I asked for suggestions regarding a UPS. I ended up
getting an 875 VA / 525 W "Geek Squad" model from Best Buy - yeah,
yeah, everyone says Geek Squad stuff is overhyped junk, but at $69 on
sale, the price seemed right.

It seems to handle my two computers fine - a P4 at 2.4 GHz and a PIII at
933 MHz sharing a monitor. With both machines and the monitor on, the
onboard readout shows them well below the unit's maximum capacity, drawing
about 0.250-0.260 kW (which translates to 250-260 watts), with an estimated
run time of 9 minutes with both computers. More than enough to get me
through short outages with both machines running.

Interesting to note how much of a difference the monitor makes.
Without the monitor - a 17" MAG CRT - the draw for both computers
drops under 200 watts, and the estimated run time for the two computers
goes from 9 minutes to 15 minutes; over 20 minutes with just one computer
running and no monitor.

Since this thing has a built-in wattage meter, is there any reason I
couldn't hook it up to, say, a refrigerator or TV to check how much
wattage they're using?
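
For reference, turning a kW readout like that into watts, or into a rough
monthly cost, is simple arithmetic; a Python sketch with assumed figures
(the 0.255 kW is roughly the reading quoted above, while the duty cycle and
electricity rate are made up):

    # Convert the UPS's kW readout into watts and a rough monthly cost.
    # The 0.255 kW reading is taken from the post above; the 24 h/day duty
    # cycle and the $0.12/kWh rate are assumptions for illustration only.

    reading_kw = 0.255            # UPS display, kilowatts
    hours_per_day = 24.0          # assumed duty cycle
    rate_per_kwh = 0.12           # assumed electricity rate, $/kWh

    watts = reading_kw * 1000.0
    kwh_per_month = reading_kw * hours_per_day * 30.0
    cost_per_month = kwh_per_month * rate_per_kwh

    print(f"{watts:.0f} W -> {kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")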

I've a 600 VA unit and it will also run two PCs with their monitors.
When the battery went dead I connected a 92 Ah lead-acid battery as a
replacement, and it has been fine since. I didn't like the idea of doing a
test discharge to see how long it would run, but it is surely much longer
than with the original 7 Ah battery. The UPS keeps a float charge at around
13.5 V, which is fine for this battery, but I may some day apply an
equalizing charge for better maintenance. Running just one PC, the
inverter transistors don't get too hot, so I think it could hold out a long
time; I've got the power guaranteed :)
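
As a rough sanity check on how much longer a 92 Ah battery should last than
the original 7 Ah pack, here is a back-of-the-envelope Python sketch; the
12 V nominal, 200 W load and 85% inverter efficiency are assumptions, and
real lead-acid capacity falls at high discharge rates, so treat the absolute
numbers as optimistic:

    # Idealized UPS runtime estimate from battery capacity.
    # All figures are assumptions for illustration; real lead-acid capacity
    # drops at high discharge rates (Peukert effect), so actual runtimes will
    # be shorter, but the ratio between the two batteries still holds.

    def runtime_minutes(capacity_ah, battery_v=12.0, load_w=200.0, inverter_eff=0.85):
        """Runtime as battery energy (Wh) divided by the DC power drawn."""
        energy_wh = capacity_ah * battery_v
        dc_load_w = load_w / inverter_eff
        return 60.0 * energy_wh / dc_load_w

    for ah in (7, 92):
        print(f"{ah:3d} Ah -> about {runtime_minutes(ah):.0f} minutes at a 200 W load")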
 
kony said:
I did not write "some LCD"; I wrote about the current
generation of 19" and smaller.

It doesn't matter if you see ghosting on 20"+ panels, for the
purpose of the discussion, which is whether smaller panels at
comparable resolutions exhibit it.

If we were talking about resolutions higher than native to a
19", then CRTs lose on another front, because their refresh
rate suffers and their pixel boundaries get so blurred that it
is no longer an accurate output.

Looking more "digital" is not necessarily a flaw. A video
card does not transmit an infinitely high-res, flawless
image; it transmits pixels. Accurately representing those
pixels is the monitor's job, not blurring them so they look
more lifelike.

Only if it has a DVI output, and you are making use of it. Many video cards
still in common use output three analogue waveforms, created by high-speed
DACs with at least 16-bit inputs, via the VGA output socket, which the
monitor, CRT or LCD, displays via pixels made up either from phosphor
triads or LC cells. As we live in an analogue world, I fail to see how you
can contend that something which looks "more digital" is not flawed. If the
display looks any different from how the real world looks, then it is
an inaccurate representation, which by definition makes it flawed. If the
CRT display does anything to make the picture look closer to reality, then
that must make it more accurate, and thus less flawed.

I'm not too sure why you feel that a CRT monitor's refresh rate has any
impact on the accuracy of the displayed rendition of the input data. High
refresh rates are a necessity to facilitate high resolutions. The response
times of the phosphors are plenty short enough for this to not represent a
problem. I do not understand what you mean by a CRT's pixel boundaries (?)
getting blurred, and how that fits in with refresh rate.

The last thing that you say is a very odd statement. If the CRT monitor does
anything to make the image more lifelike, how do you make that out to be a
bad thing? By logical deduction, if any display technology reproduces the
data being sent to it more accurately than any other, and this actually
looks less lifelike than reality, then the data being sent must be
inaccurate, and thus flawed ...

Arfa
 
Virtually all CFLs are low power factor; high power factor (HPF) would add to
the cost, and people wouldn't buy them.

In the US, domestic electric meters measure true power, so the power factor
doesn't make any difference in the bill, though it does strain the
distribution system more.

I was a little startled - you answered my post, clipped my (admittedly
silly) remark, and went on to actually answer the previous post.

I've never made a mistake like that (you can be forgiven for not
believing that!).
 
kony said:
I did not write "some LCD"; I wrote about the current
generation of 19" and smaller.

It doesn't matter if you see ghosting on 20"+ panels, for the
purpose of the discussion, which is whether smaller panels at
comparable resolutions exhibit it.

If we were talking about resolutions higher than native to a
19", then CRTs lose on another front, because their refresh
rate suffers and their pixel boundaries get so blurred that it
is no longer an accurate output.

Looking more "digital" is not necessarily a flaw. A video
card does not transmit an infinitely high-res, flawless
image; it transmits pixels. Accurately representing those
pixels is the monitor's job, not blurring them so they look
more lifelike.

Well, the LCD I have at work runs 1600x1200 native, the same as I run my CRT
at. This whole discussion is really moot: the CRT looks better to *me*, and
that's all that matters; I don't care what the specs say or what others
claim. *I* see and notice the disadvantages of LCD panels, they bother *me*,
and therefore *I* prefer a good CRT. If you prefer a flat panel, then get
one, but this is a personal preference.

I want the image to look lifelike, and the CRT does a good job of that; what
do I care if that's not the "monitor's job"?
 
I was a little startled - you answered my post, clipped my (admittedly
silly) remark, and went on to actually answer the previous post.

I've never made a mistake like that (you can be forgiven for not believing
that!).

I clipped the bottom, answered the part I was interested in, then simply
forgot to clip the top as well. So what?
 
My first CFLs were Philips, and they took "forever" to come to a reasonable
brightness, let alone full. But the Home Despot cheapies are quite bright
from the moment they're energized, and take only about a minute to reach
full brilliance.


Depending on how long ago you tried those "first" CFLs, it
may be an inappropriate comparison. Even the generic
off-brands were poor at first and evolved over time.
 
Only if it has a DVI output, and you are making use of it.

False; while DVI is certainly better the higher the
resolution, it is a separate factor.

Many video cards still in common use output three analogue waveforms,
created by high-speed DACs with at least 16-bit inputs, via the VGA output
socket, which the monitor, CRT or LCD, displays via pixels made up either
from phosphor triads or LC cells. As we live in an analogue world, I fail to
see how you can contend that something which looks "more digital" is not
flawed.

It's pretty easy to understand once you realize that the
picture the video card is attempting to display, which was
generated by the OS, IS DIGITAL. Anyone knows that
conversion back and forth between digital and analog causes
loss (to whatever extent; it must be a large extent if you
deem the conversion to change the image enough that you
feel it's better somehow).

If the display looks any different from how the real world looks, then it is
an inaccurate representation,

WRONG. An accurate representation preserves as much of the
input information as possible; it does not blur it so that it
becomes in some way closer to smooth while simultaneously
losing information in the process, becoming less detailed.

If all you want is blurry, smear some bacon grease on your
screen!

Sorry but you are 100% wrong.
 
My first CFLs were Philips, and they took "forever" to come ...
Depending on how long ago you tried those "first" CFLs, it
may be an inappropriate comparison. Even the generic
off-brands were poor at first and evolved over time.

It wasn't intended as a comparison, but as a contrast. (Ask any English
teacher.) And the Philips were indeed early CFLs.
 
kony said:
False; while DVI is certainly better the higher the
resolution, it is a separate factor.


It's pretty easy to understand once you realize that the
picture the video card is attempting to display, which was
generated by the OS, IS DIGITAL. Anyone knows that
conversion back and forth between digital and analog causes
loss (to whatever extent; it must be a large extent if you
deem the conversion to change the image enough that you
feel it's better somehow).


WRONG. An accurate representation preserves as much of the
input information as possible; it does not blur it so that it
becomes in some way closer to smooth while simultaneously
losing information in the process, becoming less detailed.

If all you want is blurry, smear some bacon grease on your
screen!

Sorry but you are 100% wrong.

Well I'm sorry too, but it is you who is wrong. You would be right if we
were talking about a signal that was being converted back and forth between
types or standards, but in the case of a computer-generated picture, we are
not. We are talking about a digitally created image of something that needs
to be an analogue one for our eyes to see. Whether the conversion from
digital to analogue takes place at the video card, or at the face of the
monitor, it is still a necessity that it takes place. The ultimate goal is to
make it look as lifelike as possible. If you think that making it look
sharper, or in some way different (or, in your opinion, better) than real
life, counts as accuracy, then you have a very odd understanding of what the
word 'accuracy' means in this context.

Bacon grease?? What a silly thing to throw into a discussion.

And what does your declaration of "false" about DVI mean? If you want to
talk about card-output 'pixels' then you need to be talking digital, which is
what a DVI output is. Otherwise, it's analogue, as close as doesn't matter,
from the VGA socket.

And there's no need to shout by capitalization. I am neither deaf nor stupid
.... d;~}

Arfa
 
Arfa said:
And there's no need to shout by capitalization. I am neither deaf nor stupid
... d;~}


Just grouchy, some days? (Like the rest of us.) ;-)


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
 
Well I'm sorry too, but it is you who is wrong. You would be right if we
were talking about a signal that was being converted back and forth between
types or standards, but in the case of a computer-generated picture, we are
not. We are talking about a digitally created image of something that needs
to be an analogue one for our eyes to see. Whether the conversion from
digital to analogue takes place at the video card, or at the face of the
monitor, it is still a necessity that it takes place. The ultimate goal is to
make it look as lifelike as possible. If you think that making it look
sharper, or in some way different (or, in your opinion, better) than real
life, counts as accuracy, then you have a very odd understanding of what the
word 'accuracy' means in this context.

Bacon grease?? What a silly thing to throw into a discussion.


It is your goal to blur the information, which is what the
grease would do.

Pixel data is output by a computer to a video card. Since
human vision has far higher granularity, it is not expected
to look like reality except to the depth of granularity
possible with that pixel data, i.e. its resolution. If the
pixel data is not preserved but rather smoothed to reduce
your perception of the pixels, then "data" is also being
removed from the image, and it is less accurate than the
output was intended to be. Monitor manufacturers strive to
accurately reproduce the image, not make it aesthetically
pleasing.

The goal is accuracy, not "lifelike". Lifelike and accurate
can coexist, but that will come from higher resolution, not
from degradation of the signal on output as you propose.
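
The information-loss argument is easy to demonstrate with a toy blur:
single-pixel detail loses most of its contrast, and nothing in the blurred
output lets you reconstruct the original. A minimal Python sketch (the pixel
values are arbitrary):

    # Toy demonstration that smoothing pixel data discards information:
    # a row of full-contrast single-pixel detail (0 / 255) comes out of a
    # box blur with its swing squeezed to roughly 85-170, and the lost
    # contrast cannot be recovered from the blurred values alone.

    def box_blur(row):
        """Simple 3-tap box blur with edge clamping."""
        out = []
        for i in range(len(row)):
            window = row[max(0, i - 1):i + 2]
            out.append(sum(window) / len(window))
        return out

    sharp = [0, 255, 0, 255, 0, 255]   # arbitrary alternating detail
    print(sharp)
    print(box_blur(sharp))             # detail flattened toward mid-grey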
 
kony said:
Capitalization is also used in text for emphasis, not just
shouting.

More commonly, on Usenet, leading and trailing asterisks indicate what
would be italicized for emphasis.
 