Power management for computer monitors

  • Thread starter: Halfgaar
Hi,

I've got a question about power management for monitors.

Turning a device on is harmful because of the power surge it creates,
right? If that is so, is it such a good idea to use power management,
for computer monitors for example?

TIA

Halfgaar
 

It does not matter. How many times do you turn your TV off and on in a year?
Has it gone out yet?
 
Ralph said:
It does not matter. How many times do you turn your TV off and on in a year?
Has it gone out yet?

True. But I've been told that switching on a device is equal to 10 hours of
usage. If that's true, I don't like switching it on and off all the time.

Halfgaar
 
To understand 'power cycling versus leaving power on',
first consult manufacturer datasheets. For example, a power
switch has a life expectancy of (typically) 100,000 cycles.
Clearly power cycling a switch is far more destructive than
leaving it on. Let's see: power cycling seven times every day
for ... 39 years.

Another device that has a particularly small 'power cycle'
life expectancy is one particular IBM hard drive - 40,000
cycles. That is seven times every day for ... 15 years.
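The arithmetic behind those figures is easy to check; a minimal sketch:

```python
def cycle_life_years(rated_cycles, cycles_per_day):
    # Years until a part exhausts its rated power-cycle count.
    return rated_cycles / (cycles_per_day * 365)

print(cycle_life_years(100_000, 7))  # power switch: ~39 years
print(cycle_life_years(40_000, 7))   # that IBM drive: ~15.7 years
```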

The idea that power cycling shortens life expectancy is
correct - until we apply engineering numbers and put those
numbers in perspective. Then those power cycling worries
belong in the myth category. Some devices can have a shortened
life expectancy, such as that power switch and that disk
drive. But who cares? Once numbers are applied, reality
takes on a whole different perspective.

Some components, such as the CPU, are power cycled most severely
in normal operation. Did they forget to mention that?
If power cycling were so destructive to a computer, it would
also be destructive to a TV. You have numbers from
datasheets. If power cycling shortens a computer's life
expectancy by a factor of ten, well, who cares if the computer
is still working 150 years from now.

As Ralph Mowery has so accurately noted - it does not matter
- once engineering numbers and facts are applied.
 
w_tom said:
To understand 'power cycling versus leaving power on',
first consult manufacturer datasheets. For example, a power
switch has a life expectancy of (typically) 100,000 cycles.
Clearly power cycling a switch is far more destructive than
leaving it on. Let's see: power cycling seven times every day
for ... 39 years.

Another device that has a particularly small 'power cycle'
life expectancy is one particular IBM hard drive - 40,000
cycles. That is seven times every day for ... 15 years.

The idea that power cycling shortens life expectancy is
correct - until we apply engineering numbers and put those
numbers in perspective. Then those power cycling worries
belong in the myth category. Some devices can have a shortened
life expectancy, such as that power switch and that disk
drive. But who cares? Once numbers are applied, reality
takes on a whole different perspective.

Some components, such as the CPU, are power cycled most severely
in normal operation. Did they forget to mention that?
If power cycling were so destructive to a computer, it would
also be destructive to a TV. You have numbers from
datasheets. If power cycling shortens a computer's life
expectancy by a factor of ten, well, who cares if the computer
is still working 150 years from now.

As Ralph Mowery has so accurately noted - it does not matter
- once engineering numbers and facts are applied.

Hi w_tom, remember me from the surge protection topic on
sci.electronics.misc? Once Tube2ic said you were a fraud, you kept silent,
so don't blame me for ignoring you now, even if it's unjustified.

Halfgaar
 
w_tom said:
Some components, such as the CPU, are power cycled most severely
in normal operation. Did they forget to mention that?
If power cycling were so destructive to a computer, it would
also be destructive to a TV. You have numbers from
datasheets. If power cycling shortens a computer's life
expectancy by a factor of ten, well, who cares if the computer
is still working 150 years from now.

(Read my other message first). I can't resist replying to this.

Parts of a CPU are indeed power cycling very fast (in mine, 1,500,000,000
times per second). But there is a difference. When you power on a monitor,
the power surge it creates is so large that all the lights in the room dim
for a moment. That is bound to damage equipment. And when power supplies
turn on, there's a gigantic surge through all the equipment connected to
them until all the caps have fully charged. I believe this is one reason
why computer power supplies nowadays don't fully shut down but keep charge
in the caps.
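For what it's worth, that charging surge can be estimated from the supply's bulk capacitor and its inrush limiter. A back-of-envelope sketch, with hypothetical figures (not from any particular PSU's datasheet):

```python
import math

# Hypothetical figures -- check the actual PSU datasheet.
V_MAINS_RMS = 230.0    # European mains voltage
C_BULK = 470e-6        # bulk electrolytic capacitor, farads
R_NTC_COLD = 5.0       # cold resistance of the NTC inrush limiter, ohms

v_peak = math.sqrt(2) * V_MAINS_RMS   # ~325 V at the rectifier output
i_peak = v_peak / R_NTC_COLD          # worst-case first-half-cycle inrush
tau_ms = R_NTC_COLD * C_BULK * 1e3    # RC charging time constant, ms

print(f"peak inrush ~{i_peak:.0f} A, decaying over a few x {tau_ms:.2f} ms")
```

With these assumed values, even the worst-case inrush of tens of amps lasts only a few milliseconds, which is why supplies include an inrush limiter in the first place.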

Halfgaar
 
1) You are speculating as to how destructive a power-on
transient must be. Can you even describe how a component is
being damaged by a change in power? You must, to avoid junk
science reasoning. The speculation: dimming lights must be a
symptom of destruction. Why? What is the damage?

2) I remember you quite specifically. I considered posting a
comment in this thread about your ability to go blind when
numbers are provided. You prefer simple junk science 'sound
bites' instead of numbers. Confronted by a tidal wave of
facts, you chose to ignore them all for a response that was
salesman friendly and fact devoid. Do you again take an
ostrich position? Feel free to tell Ralph Mowery that he too
does not know what he is talking about.

3) Just because lights dim does not justify this wild
conclusion:
"That is bound to damage equipment"
In fact, dimming lights suggest a building wiring problem.
Computers just don't draw that much power on power-up.

Learn about electrolytic capacitors, inrush current
limiters, and power supply design. You have not described
anything destructive, but have immediately seized upon
something 'different'; therefore it must be destructive.

Those light bulbs also dimmed; starved for voltage. Is that
also destructive to light bulbs?

Same junk science reasoning also claims that power cycling
light bulbs is destructive - shortens bulb life expectancy.
"Must be", they claim, "because a bulb suffers a large
temperature change". But again, when manufacturer data sheets
are consulted, even power cycling a light bulb is not
destructive - in direct contradiction to junk science
reasoning.

I can see your eyes glazing over, which means again I am
probably wasting good newsgroup bandwidth on someone who fears
numbers and the underlying details. You can take a horse to
water, but you cannot make him drink. Here is your opportunity
to choose junk science reasoning or to learn by first using
numbers. Which will you choose? Learn how individual
components respond to power-up, or simply conclude that dimming
lights are proof that power-up damage 'must' be happening. Go
with the numbers or use junk science speculation. Which way
do you think?
 
w_tom said:
Same junk science reasoning also claims that power cycling
light bulbs is destructive - shortens bulb life expectancy.
"Must be", they claim, "because a bulb suffers a large
temperature change". But again, when manufacturer data sheets
are consulted, even power cycling a light bulb is not
destructive - in direct contradiction to junk science
reasoning.

Actually, power-cycling a light bulb does VERY significantly shorten
its lifespan. However, that's not quite the situation with computer
components, since we don't have any parts experiencing that extreme
and rapid a temperature change. I would tend to agree that the wear from
cycling _most_ components is insignificant, as that wear is so slight
that it's not going to be the initial failure point or the reason to
abandon the equipment (due to age, faster technology, etc.).

Power-cycling a hard drive is the biggest potential for problems. A
few times a day isn't significant, but if it spins down every 5
minutes it's just an unnecessary risk. "Average" failure rates mean
little to the small yet still significant percentage of people who
will suffer a premature failure due to excessive power-cycling of
the hard drive.


Dave
 
Let's return to a simple problem. Computers don't draw
massive power on power up. But if lights dim, then a serious
problem may exist in the household wiring. For example, many
outlets are wired using that stablock connector rather than
wrapping the wire around a screw. It is sufficient for things
that don't require continuous or stable power such as lights.
But that intermittent nature of stablocks is terrible for
things dependent on continuous power such as computers.

The problem would probably be related to an outlet or wire
between the computer's receptacle and the breaker box circuit
breaker. If the computer is dimming the light, then simply plug
the light into receptacles on the same circuit that are closer
to the breaker box. Once the light stops dimming, the wire
between those two receptacles must be visually checked for a
stablock - or, even worse, some homeowner's handyman fix on a wire.

Again, a computer can never draw enough power on a 15 or 20
amp circuit to dim incandescent bulbs. Numbers make that
obvious. Knowing the theory is significant information
necessary to avoid failure - failure so avoidable as to be
directly traceable to a human. That dimming light bulb suggests
an electrical problem on that circuit. Maybe not a serious
problem, but one that could potentially be serious enough to be
worth investigating - for human safety reasons.

I always use my computers during thunderstorms, in part to
follow those storms. I never suffer damage and never need to
worry. But then I have learned the theory and have significant
experience applying those numbers. Again, the numbers are
important. Without numbers, one might assume the computer is
drawing massive power on power-up when in reality a human
safety threat may exist in the wire inside the walls.
 
w_tom said:
Let's return to a simple problem. Computers don't draw
massive power on power up. But if lights dim, then a serious
problem may exist in the household wiring. For example, many
outlets are wired using that stablock connector rather than
wrapping the wire around a screw. It is sufficient for things
that don't require continuous or stable power such as lights.
But that intermittent nature of stablocks is terrible for
things dependent on continuous power such as computers.

Please enlighten me, what's stablock?

The room the computer is in was built a few years ago and the wall sockets
are connected by wires around screws. And besides the dimming of the lights
when I power on the monitor (it's not the computer, it's the monitor),
there are no problems with power stability that I can see.
The problem would probably be related to an outlet or wire
between the computer's receptacle and the breaker box circuit
breaker. If the computer is dimming the light, then simply plug
the light into receptacles on the same circuit that are closer
to the breaker box. Once the light stops dimming, the wire
between those two receptacles must be visually checked for a
stablock - or, even worse, some homeowner's handyman fix on a wire.

I'll test that, with the light closer to the breaker box.

Halfgaar
 
Stablock (which might not be the exact term) is a hole in
the back of a receptacle (easier to see at Lowes and Home
Depot than in the wall). Wire is stripped of insulation and
pushed into this hole. However, some outlets create
intermittent or higher-resistance connections. In the late
1960s, an electrician once explained why he did not use this
option - why he used the screws. When (not just if) one
outlet did not make a complete connection, the time (and
money) required to find that outlet cost more than the time
saved by using stablock connections.

This was before the PC. Since then, stablock can create problems
for any device that cannot withstand a 15 millisecond power
interruption - such as PCs. If your circuit was wired using
only receptacle screws AND the circuit breaker for that
circuit has a good connection to the breaker box bus, then dimming
lights would be from something unique. However, if lights are
dimming, the offending location should be moderately but
noticeably warmer when conducting a significant load - i.e., a
hair dryer or something else on the order of 1000 watts.
 
Actually, power-cycling a light bulb does VERY significantly shorten
its lifespan. However, that's not quite the situation with computer
components, since we don't have any parts experiencing that extreme
and rapid a temperature change. I would tend to agree that the wear from
cycling _most_ components is insignificant, as that wear is so slight
that it's not going to be the initial failure point or the reason to
abandon the equipment (due to age, faster technology, etc.).

You just made the correct statement. One has to look at the overall cost.
I saw a breakdown of the cost of light bulbs versus the time off and on. If
all costs are factored in (cost of bulbs and power to run them), there is a
certain amount of time that the bulb has to be off (when not in actual need
of light) to make it worthwhile to switch it off and on. It is not worth the
cost to leave a bulb on in a closet all the time, but it might be in the
living room of a house. You need the light of a closet for about 10 minutes
out of 24 hours, but depending on the house you might need the light in the
living room for the biggest part of your waking hours. Don't switch it off
and on each time you enter and leave the room unless you are going to be out
of it for, say, half an hour to an hour.
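That break-even can be put into numbers; the figures below are hypothetical, not from the breakdown mentioned above:

```python
# Hypothetical figures -- substitute your own bulb and tariff.
WATTS = 60.0                 # bulb power
RATE_PER_KWH = 0.10          # electricity price, $/kWh
BULB_PRICE = 0.50            # replacement bulb, $
RATED_HOURS = 1000.0         # rated bulb life
HOURS_LOST_PER_CYCLE = 2.0   # life wear charged to each on/off cycle

run_cost_per_hour = WATTS / 1000 * RATE_PER_KWH               # $ while on
cycle_cost = BULB_PRICE / RATED_HOURS * HOURS_LOST_PER_CYCLE  # $ per cycle

# Off-time beyond which the energy saved pays for the cycle wear:
break_even_minutes = cycle_cost / run_cost_per_hour * 60
print(f"worth switching off beyond ~{break_even_minutes:.0f} minutes")
```

With these assumed numbers the break-even is around ten minutes, which is consistent with the closet-versus-living-room rule of thumb above.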

All electrical devices are the same. When you are not actually using them,
they are still drawing power and building up heat. The power is costing you,
and the heat is not good either. One has to hit the happy medium of how long
to leave a device on when not in use. I usually use a two-hour timeframe on
the computer at home. I have no real reason for this, but it seems to me
that if I am not using it for a period of around two hours, then it is just
wasting power. Sure, the computer will die on me one day, but more than
likely I will have replaced it before the on/off cycles cause a failure.
The last two computers I had did not give me any problems, and I gave them
to my son; while I don't know what he did as far as off/on, they lasted him
until I gave him the next one. Who cares if the off/on cycles kill a
computer in 5 years versus 10 years if left on all the time? They will
probably be replaced before then anyway.
 
Either your house's electrics are seriously messed up or you're using some
big old or cheap monitor! I've never seen the lights dim just from turning
my monitor (about 70W) on...

<snip>

Most monitors degauss when first powered on; this (possibly depending
on the size of the tube) likely draws several hundred watts or more.


Dave
 
Ralph said:
All electrical devices are the same. When you are not actually using them,
they are still drawing power and building up heat. The power is costing you,
and the heat is not good either. One has to hit the happy medium of how long
to leave a device on when not in use. I usually use a two-hour timeframe on
the computer at home. I have no real reason for this, but it seems to me
that if I am not using it for a period of around two hours, then it is just
wasting power. Sure, the computer will die on me one day, but more than
likely I will have replaced it before the on/off cycles cause a failure.
The last two computers I had did not give me any problems, and I gave them
to my son; while I don't know what he did as far as off/on, they lasted him
until I gave him the next one. Who cares if the off/on cycles kill a
computer in 5 years versus 10 years if left on all the time? They will
probably be replaced before then anyway.

If you're lucky. Perhaps the computer will die after 3 years instead of 10,
which is more annoying. And my monitor (Eizo T766 19") will probably be in
use longer than 5 years.

Halfgaar
 
Zilog said:
And have you ever thought of the REASON for power management on computers?
Something most of you Americans (I don't know if you are one, but there
are many in this NG) don't know the MEANING of - ENERGY CONSERVATION!
There is a point in turning something off when you're not using it, even
if turning it back on may shorten its life to an extent. You don't leave
all your light bulbs on all day just because turning them off and on again
may shorten their life by a couple of hours. It's just COMMON SENSE. The
amount of money you save from doing this over the life of the lightbulb
will more than repay you for another goddamn bulb, anyway! Well, it
might...

I'm not American, but I agree with you that most Americans don't know how
to be conservative with resources.

But anyway, I know what energy conservation is and what it's for. But if
I have to throw away my monitor, for example, after a couple of years,
that's not very environmentally friendly. And my monitor is not exactly cheap.

Halfgaar
 
kony said:
Most monitors degauss when first powered on; this (possibly depending
on the size of the tube) likely draws several hundred watts or more.


Dave

Exactly! When I degauss, the lights dim as well. And on power-up, the lights
dim at the moment the electromagnets engage (the "dong" sound you hear). My
monitor is 19" in size, BTW. My 15" doesn't cause the lights to dim.

Halfgaar
 