Computer life

eingram

I have seen various discussions about whether money or power is saved by
turning the computer off periodically as opposed to just leaving it on all
the time.
Obviously it will use less power when off. My question concerns how this
practice affects the lifespan of the hardware. I know that in any
electronic device, when first powered up, there is an "inrush current"
required to charge capacitors (such as the filter capacitors in the various
power supplies). Have there been any studies to determine the optimum on-off
rate? In other words, if it is only turned on once a day and off at night,
is this better than turning it on and off, say, 4 or 6 times in a 24-hour
period? Any discussion along this line would be helpful.
 
eingram said:
I have seen various discussions about whether money or power is saved by
turning the computer off periodically as opposed to just leaving it on all
the time.
Obviously it will use less power when off. My question concerns how this
practice affects the lifespan of the hardware. I know that in any
electronic device, when first powered up, there is an "inrush current"
required to charge capacitors (such as the filter capacitors in the various
power supplies). Have there been any studies to determine the optimum on-off
rate? In other words, if it is only turned on once a day and off at night,
is this better than turning it on and off, say, 4 or 6 times in a 24-hour
period? Any discussion along this line would be helpful.

All the hardware-related discussions I've read revolved around the current
drawn when hard drives spin up, not the life expectancy of the PC itself.
This also occurs when waking from hibernation, since the hard drives are
powered down during hibernation. It does not occur in standby.

I still have, as other people do, a PC a decade old with an AT power supply.
When I use it, I turn it on first. When I'm done, I turn it off. There is
no hibernate or standby use. It still works. I don't need a study to
tell me what that means. How it would compare if I set up the BIOS or the
OS to use standby doesn't matter. If it ain't broke, don't fix it.
Dave
 
I don't know about studies, but I did spend a few years in the labs of a
major US-based TV manufacturer some time back, and we used to run long-term
life tests where some sets were run continuously and others were
power-cycled. The cycled sets failed sooner. The reason was thermal
expansion and contraction. The parts in any electronic device will expand
and contract as they heat and cool. They will, naturally, expand and
contract more if the thermal cycles are more extreme.
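
A rough way to picture the scale is a Coffin-Manson-style rule of thumb:
cycles-to-failure falls off as a power of the temperature swing. The numbers
below are assumed for illustration, not taken from those lab tests:

# Illustrative sketch only: a Coffin-Manson-style rule of thumb for thermal
# fatigue, where cycles-to-failure N_f is proportional to (delta_T) ** -k.
# The exponent k = 2 is an assumed ballpark often quoted for solder joints,
# not data from these tests.

def relative_cycle_life(delta_t, reference_delta_t, k=2.0):
    """Cycles-to-failure for a part swinging delta_t degrees, relative to
    one swinging reference_delta_t degrees."""
    return (reference_delta_t / delta_t) ** k

# A part that swings 40 C each power cycle vs. one that swings only 20 C:
print(relative_cycle_life(40, 20))  # 0.25 -> roughly 4x fewer cycles to failure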

In an office environment the best choice for users' systems seems to be to
turn it on in the morning and turn it off when you leave. That's only two
thermal cycles a day, and lots of energy saved. A good balance. At home,
we're not talking about lots of PCs and lots of power, so energy isn't much
of an issue, unless you're running a green home. This is a wild guess, but
toasting a bagel uses about the same amount of energy your PC consumes in a
day or two. (Someone can figure this out, I'm too lazy.)

-John O
 
John

Your comments on thermal expansion and contraction are interesting. I
suspect there have been studies on light bulbs illustrating what you say.

Doesn't obsolescence occur before many computer components cease to be
functional?

Leaving computers on also has an effect on system performance, as programmes
with memory leaks do not release memory.

--
Regards.

Gerry
~~~~
FCA
Stourport, England
Enquire, plan and execute
~~~~~~~~~~~~~~~~~~~
 
I'm sure many companies have studied their own products, but a lot of that
data is proprietary and we won't get to see it, unfortunately. Such a test
is pretty easy: you need one of those timer-type AC switches
(http://tinyurl.com/2jh3ag) from Walmart, a control device that is left on
continuously, and a few weeks. As far as TVs are concerned, they
accelerated the tests by increasing the line voltage to 132 VAC and the
set's B+ (high-voltage source) by 25% or something like that. Also, they
jacked up the ambient temperature in the room to about 85 F, sometimes a lot
higher if they were in a hurry.

-John O
 
John

Thanks, John, but that's way over my head.


--
Regards.

Gerry
~~~~
FCA
Stourport, England
Enquire, plan and execute
~~~~~~~~~~~~~~~~~~~
 
Incorrect calculation: Assume S3, hibernation, and off are all about the
same energy drain (< 5 watts). Assume the power when on is about 175 watts,
and off time is weekends + 12 hours per day. With these assumptions, the
value of turning the computer off (or hibernating, or using S3 suspend)
is about 1 MEGAWATT-HOUR PER YEAR. Look at your electric bill for rates and
I think you will find this saves hundreds of dollars per year per
computer. Your bagel toasting, while more taste-enhancing than
computation, is rather inexpensive in comparison to a few days' extra up
time.
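
Spelled out, the arithmetic goes like this (reading "off time" as full
weekends plus 12 hours on each weekday):

# The assumptions above, spelled out. "Off time" is read as full weekends
# plus 12 hours on each of the 5 weekdays -- an interpretation, not a quote.
on_watts = 175
sleep_watts = 5
saved_watts = on_watts - sleep_watts            # 170 W avoided while off or suspended

off_hours_per_week = 48 + 5 * 12                # 108 hours per week
kwh_saved_per_year = saved_watts * off_hours_per_week * 52 / 1000
print(kwh_saved_per_year)                       # ~955 kWh, i.e. roughly 1 MWh per year

# Multiply by your local $/kWh rate to get the dollar savings.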

-- Jeff Barnett
 
Well, I'm not confident that a home computer uses a lot of energy,
relatively speaking. The toaster may have been a poor example: it's a lot of
power and carbs, but not much time.
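
Here's the rough check anyway, with the wattages below assumed as round
numbers rather than measurements:

# Rough comparison: toasting one bagel vs. a PC's daily energy use.
# The wattages are assumed round numbers, not measurements.

toaster_watts = 1000                 # a typical two-slot toaster
toast_minutes = 3
toaster_kwh = toaster_watts * (toast_minutes / 60) / 1000    # ~0.05 kWh

pc_watts = 100                       # a modest desktop under light load
pc_kwh_per_day = pc_watts * 24 / 1000                        # ~2.4 kWh

print(toaster_kwh, pc_kwh_per_day)
# On these numbers the bagel is closer to half an hour of PC time than to a
# day or two, so take that comparison as a figure of speech.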

A university site says basically the same thing
(http://ets.fhda.edu/call_center/greencomputing):

"How a user operates the computer also factors into energy costs. First let's
take the worst case scenario, continuous operation. Assuming you operate a
200 watt PC system day and night everyday, direct annual electrical costs
would be over $125 (at $0.075/kWh). In contrast, if you operate your system
just during normal business hours, say 40 hours per week, the direct annual
energy cost would be about $30 - plus, of course, the cost of providing
additional cooling."

Consider that 40 hours a week is a bit high for home use, then consider that
most of our computers draw *a lot* less than 200 watts, and the electric
costs in each home come out pretty low.
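
And the quoted numbers do check out against the stated assumptions
(200 watts, $0.075/kWh):

# Sanity check of the figures quoted above, using the wattage and rate
# stated in the quote itself.
watts = 200
rate_dollars_per_kwh = 0.075

always_on_kwh = watts * 24 * 365 / 1000          # 1752 kWh per year
business_hours_kwh = watts * 40 * 52 / 1000      # 416 kWh per year

print(always_on_kwh * rate_dollars_per_kwh)       # ~$131 per year -- "over $125"
print(business_hours_kwh * rate_dollars_per_kwh)  # ~$31 per year -- "about $30"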

I have an electric stove, microwave, air conditioning, pool filter, water
heater, dehumidifier, coffeemaker, and clothes dryer. Computer electrical
costs are way down on the "worry" list for me.

Here's another good and very basic discussion:
http://www.humboldt.edu/~mrd26/home_energy.htm

-John O
 
eingram said:
I have seen various discussions about whether money or power is saved by
turning the computer off periodically as opposed to just leaving it on all
the time.
Obviously it will use less power when off. My question concerns how this
practice affects the lifespan of the hardware. I know that in any
electronic device, when first powered up, there is an "inrush current"
required to charge capacitors (such as the filter capacitors in the various
power supplies). Have there been any studies to determine the optimum on-off
rate? In other words, if it is only turned on once a day and off at night,
is this better than turning it on and off, say, 4 or 6 times in a 24-hour
period? Any discussion along this line would be helpful.


Although there is some stress on components during power-up...
it's best to turn the machine off when not in use (unless you are going to
get back to it shortly)


Here is *one* story, though there are many:

A few years back I built a machine for a professional photographer
who had a lot of data he needed to keep available.
That was back before there were 1TB drives...so his machine had at least 5
or 6 drives total...
and he kept the machine on 24/7

I have never seen so many HD failures in my life...
I think that sooner or later they all died (and were replaced)


I advised him to keep the machine on only when in use...
he never lost a drive again...

that was quite a few years ago.
 
To Bob I:
I'm not concerned about power usage. As I said, I want to know how much
turning it on and off affects its life.
 
To JohnO:
As a former TV and VCR repairman, I would like to know exactly what method
they used to increase the B+. Was it a switching supply or linear? Also, the
B+ is not the highest voltage in a set; the 2nd anode voltage is. This is
the high voltage (20,000-30,000+ volts) used in the CRT. Since this voltage
is derived from the B+, did they take precautions not to let it rise also?
That would have caused excess radiation from the CRT. Very dangerous.
Also, it's not just the thermal expansion and contraction. Electrons, though
small, cause heat from the friction of their passage. Since the inrush
current is much greater than the run current, turning a set on and off has
to reduce the life. The question is: how much? That's the reason I was
asking about studies. I would rather see empirical data than math.
 
It seems that electricity in your part of the country is much less
expensive than it is here - Southern California. By our calculations and
peeking at the utility bill, we are saving $15-20 per month per computer
using S3 instead of S1 suspend mode. That's quite a bit of energy.

-- Jeff Barnett
 
That would be virtually impossible to tell, given all the variables of the
individual components in the computer. In practice the PC itself will
outlive its usefulness: its ability to run the software you want to acquire
gives out long before the hardware does. On the other hand, optical drives
fail with amazing frequency no matter what you do with the PC. So in a
nutshell, do what you wish as far as operating cycles; the luck of the draw
will determine component failure.
 
This was the engineering department of the manufacturer, a few hundred
people. There were a *lot* of very smart dudes there, and nothing was left
to chance. If we made a mistake in this lab and some engineer looked bad as
a result, we heard about it loud and clear. :-) While my recollections of
details may be fuzzy, this system flat out worked. Unfortunately, the name
went on even if the quality wasn't in there. ;-)

I don't recall exactly how they changed the B+; the engineers who designed
the TV circuits were on the floor above us, so modifications were simple
enough. Tubes were tested at the plant where they were manufactured, and I
don't remember seeing extra-bright images. It's likely the 2nd anode was
turned back to normal levels. At the time (mid-80s) all the sets used linear
supplies, and I don't know if they ever got to switching before being
swallowed up by the Japanese companies in the mid-90s.

Every model also went through other tests. For example, thermocouples were
attached to every active component and many other parts, maybe 100 in each
set. Then the set was run under controlled conditions and temps recorded.

I assure you that at one time there were piles of data and analysis, but
it's either in a landfill or in Japan now.
 
eingram said:
To JohnO:
As a former TV and VCR repairman, I would like to know exactly what method
they used to increase the B+. Was it a switching supply or linear? Also, the
B+ is not the highest voltage in a set; the 2nd anode voltage is. This is
the high voltage (20,000-30,000+ volts) used in the CRT. Since this voltage
is derived from the B+, did they take precautions not to let it rise also?
That would have caused excess radiation from the CRT. Very dangerous.
Also, it's not just the thermal expansion and contraction. Electrons, though
small, cause heat from the friction of their passage. Since the inrush
current is much greater than the run current, turning a set on and off has
to reduce the life. The question is: how much? That's the reason I was
asking about studies. I would rather see empirical data than math.

Empirical data?
I've had 8 computers since 1983. One failed. Five became obsolete. Two
are still working -- two and five years old respectively.

I turn off my computers when I am not using them.

Bill
 
Jeff Barnett said:
Incorrect calculation: Assume S3, hibernation, and off are all about the
same energy drain (< 5 watts). Assume the power when on is about 175 watts,
and off time is weekends + 12 hours per day. With these assumptions, the
value of turning the computer off (or hibernating, or using S3 suspend) is
about 1 MEGAWATT-HOUR PER YEAR. Look at your electric bill for rates and I
think you will find this saves hundreds of dollars per year per computer.
Your bagel toasting, while more taste-enhancing than computation, is rather
inexpensive in comparison to a few days' extra up time.

This is something that we looked at when the environmentalists were bandying
figures around about how a TV and DVD player together cost around £35 a year
to run when left in standby. I was very suspicious (normal for
environmentalist-sourced figures - they are always exaggerated), so out came
the wattmeter, and I measured the standby power of a 42 inch LCD TV and a
Blu-ray player (near enough the same as a DVD player in standby). Running
the numbers, I got the 35 tossed in by the environmentalists almost exactly.
Only I got 35 *pence* per annum for the 0.6 watts consumed, not 35 pounds -
not worth worrying about. A PC power supply consumes a similar amount when
in standby.

It should be remembered that during the winter months, the small amount of
heat dissipated by appliances in standby is offset by a corresponding (but
equally small) reduction in your heating bills.

By the way, 1 megawatt-hour per year is only a thousand units and will cost
substantially less than several hundred dollars per year. Even at UK
electricity prices it isn't much more than £80 a year. In reality the TV
and DVD on standby use just 5 kWh per annum - probably not enough to
overcome the stiction in the electric meter.
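
For reference, the arithmetic, with the pence-per-kWh rates below assumed as
rough UK prices of the time:

# Checking the standby arithmetic. The pence-per-kWh rates are assumed
# ballpark UK prices of the time, not figures from the thread.
standby_watts = 0.6
standby_kwh_per_year = standby_watts * 24 * 365 / 1000     # ~5.3 kWh per year

for rate_pence_per_kwh in (7, 8):
    standby_pence = standby_kwh_per_year * rate_pence_per_kwh   # ~37-42p per year
    one_mwh_pounds = 1000 * rate_pence_per_kwh / 100            # £70-80 per year
    print(rate_pence_per_kwh, round(standby_pence), round(one_mwh_pounds))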
 
Jeff Barnett said:
It seems that electricity in your part of the country is much less
expensive than it is here - Southern California. By our calculations and
peeking at the utility bill, we are saving $15-20 per month per computer
using S3 instead of S1 suspend mode. That's quite a bit of energy.

He's got the decimal point in the wrong place. I doubt that anywhere on the
planet pays what he claimed for electricity. Even here in the UK (where
electricity is not considered cheap), it is somewhat less than that.
 
M.I.5¾ said:
This is something that we looked at when the environmentalists were bandying
figures around about how a TV and DVD player together cost around £35 a year
to run when left in standby. I was very suspicious (normal for
environmentalist-sourced figures - they are always exaggerated), so out came
the wattmeter, and I measured the standby power of a 42 inch LCD TV and a
Blu-ray player (near enough the same as a DVD player in standby). Running
the numbers, I got the 35 tossed in by the environmentalists almost exactly.
Only I got 35 *pence* per annum for the 0.6 watts consumed, not 35 pounds -
not worth worrying about. A PC power supply consumes a similar amount when
in standby.

It should be remembered that during the winter months, the small amount of
heat dissipated by appliances in standby is offset by a corresponding (but
equally small) reduction in your heating bills.

By the way, 1 megawatt-hour per year is only a thousand units and will cost
substantially less than several hundred dollars per year. Even at UK
electricity prices it isn't much more than £80 a year. In reality the TV
and DVD on standby use just 5 kWh per annum - probably not enough to
overcome the stiction in the electric meter.

How very true. It seems that the "e-nuts" focus so narrowly on one
aspect of an issue that they miss the point that the real overall effect
doesn't amount to anything when the rest of the "system" is taken into
consideration.
 