eingram
I have seen various discussions about whether money or power is saved
by turning the computer off periodically as opposed to just leaving
it on all the time.
Obviously it will use less power when off. My question concerns how this
practice affects the lifespan of the hardware. I know that in any
electronic device, when first powered up, there is an "inrush current"
required to charge capacitors (such as filter capacitors in the various power
supplies). Have there been any studies to determine the optimum on-off rate?
In other words, if it is only turned on once a day and off at night, is this
better than turning it on and off, say, 4 or 6 times in a 24-hour period? Any
discussion along this line would be helpful.
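
To put rough numbers on the power side of it, here is a quick back-of-envelope
sketch. All of the figures (a 150 W average draw, $0.12/kWh, 10 hours of use per
day, 2000 uF of filter capacitance at 170 V) are assumptions for illustration
only, not measurements:

# Back-of-envelope comparison: always-on vs. powered off when idle.
# All figures below are assumed values, purely for illustration.

AVG_POWER_W = 150.0        # assumed average draw of the computer while on
PRICE_PER_KWH = 0.12       # assumed electricity price, $/kWh
HOURS_ON_PER_DAY = 10.0    # assumed daily use if shut off at night
DAYS_PER_YEAR = 365

def annual_cost(hours_per_day: float) -> float:
    """Annual electricity cost for the given daily on-time."""
    kwh_per_year = AVG_POWER_W / 1000.0 * hours_per_day * DAYS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH

always_on = annual_cost(24.0)
nightly_off = annual_cost(HOURS_ON_PER_DAY)
print(f"Always on:     ${always_on:7.2f}/year")
print(f"Off when idle: ${nightly_off:7.2f}/year")
print(f"Savings:       ${always_on - nightly_off:7.2f}/year")

# The inrush energy that recharges the supply's filter capacitors at each
# power-up is tiny by comparison: E = 1/2 * C * V^2, e.g. 2000 uF at 170 V.
inrush_joules = 0.5 * 2000e-6 * 170.0**2   # about 29 J, roughly 0.000008 kWh
print(f"Energy per power-up (caps only): {inrush_joules:.1f} J")

So on those assumed numbers the electricity savings are on the order of $90 a
year, while the capacitor-charging energy per power-up is negligible; my real
question is about the wear-and-tear side rather than the electricity.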