Paul
M.I.5¾ said: This is something that we looked at when the environmentalists
were bandying figures around about how a TV and DVD player together cost
around £35 a year to run when left in standby. I was very suspicious
(normal for environmentalist-sourced figures - they are always
exaggerated), so out came the wattmeter, and I measured the standby power
of a 42 inch LCD TV and a Blu-ray player (near enough the same as a DVD
player in standby). Running the numbers, I got the 35 tossed in by the
environmentalists almost exactly. Only I got 35 *pence* per annum for the
0.6 watts consumed, not 35 pounds - not worth worrying about. A PC power
supply consumes a similar amount when in standby.
It should be remembered that during the winter months, the small amount of
heat dissipated by appliances in standby is offset by a corresponding (but
equally small) reduction in your heating bills.
By the way, 1 megawatt hour per year is only a thousand units and will
cost substantially less than several hundred dollars. Even at UK
electricity prices it isn't much more than £80 a year. In reality the TV
and DVD player on standby use just 5 kWh per annum - probably not enough
to overcome the stiction in the electric meter.
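Running those numbers as a quick sanity check (a minimal sketch; the 7p/kWh and 8p/unit prices are my assumptions, roughly UK pricing of the period, not figures from the post):

```python
# Checking the standby-cost arithmetic quoted above. The unit prices
# here are assumed, not from the original post.

standby_watts = 0.6            # measured TV + Blu-ray standby draw
hours_per_year = 24 * 365      # 8760 hours

kwh_per_year = standby_watts * hours_per_year / 1000.0   # ~5.3 kWh
price_per_kwh_pence = 7.0                                # assumed unit price
cost_pence = kwh_per_year * price_per_kwh_pence          # ~37p per annum

# And the megawatt-hour figure: 1 MWh/year is 1000 units; at an
# assumed 8p/unit that is about £80, as stated.
mwh_cost_pounds = 1000 * 8.0 / 100.0

print(f"{kwh_per_year:.2f} kWh/year, about {cost_pence:.0f}p per annum")
print(f"1 MWh/year at 8p/unit: £{mwh_cost_pounds:.0f}")
```

Pence rather than pounds, in line with the 35p figure.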
Something you have to be careful of is that a lot of high-tech
electronics use switching power supplies. When devices are in standby,
the load drops (that is good), but the current waveform becomes highly
non-sinusoidal. This can make it difficult for some current- or
power-measuring devices to get a correct reading.
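A numeric sketch of the problem (mine, not from the post): many cheap "average-responding" meters rectify the waveform, average it, and scale by the sine form factor pi/(2*sqrt(2)) ~= 1.11. That scaling is only correct for a sine wave, so the narrow current pulses a switching supply draws read well below their true RMS value.

```python
import math

N = 10000  # samples across one mains cycle
sine = [math.sin(2 * math.pi * n / N) for n in range(N)]
# Crude model of SMPS input current: the rectifier conducts only
# near the voltage peaks.
pulsed = [s if abs(s) > 0.95 else 0.0 for s in sine]

def true_rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def avg_responding(x):
    # what an average-responding meter displays: mean(|i|) scaled
    # by the sine form factor
    return sum(abs(v) for v in x) / len(x) * (math.pi / (2 * math.sqrt(2)))

print(true_rms(sine), avg_responding(sine))      # agree, ~0.707 each
print(true_rms(pulsed), avg_responding(pulsed))  # ~0.44 vs ~0.22
```

For the sine the two readings agree; for the pulses the average-responding figure comes out at roughly half the true RMS.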
For example, I have a $300 clamp-on ammeter, suitable for measuring DC
or AC current. It does a lousy job of measuring standby power on a
computer. I cannot count on it for that kind of measurement.
This thread mentions a technique I've used, to try to come to
grips with the inaccuracy. (Post #4 here.)
http://forum.ecomodder.com/showthread.php?t=610
1) Plug the Kill A Watt into the wall.
2) Plug a power strip into the Kill A Watt's measurement socket.
3) Plug a purely resistive load into the power strip. You
want something which is stable with temperature. A light
bulb might suffice (but power draw will vary with instantaneous
line voltage, so if your voltage wanders all over the place,
that isn't going to help - my line voltage is pretty good).
4) Measure with the light bulb on (and leave the light bulb running).
5) Plug in computer. Boot and make your active measurement. Subtract
the light bulb amount from the reading. That is the active waste.
6) Now put the computer in standby. Make another measurement.
Subtract the light bulb amount from the reading. The reason
you have to boot the computer and then go to standby, is so
that any devices programmed to be active in standby, get
properly set up. Just plugging in the computer, without booting
it, will not give the same answer (as none of the standby power
consumers have been set up yet).
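The subtraction in steps 4-6 is simple baseline arithmetic; sketched below with made-up meter readings (all the numbers here are hypothetical, for illustration only):

```python
# Baseline-subtraction from the procedure above. Readings are invented.
bulb_only_w = 60.2          # step 4: resistive bulb alone
bulb_plus_active_w = 145.7  # step 5: bulb + computer booted and active
bulb_plus_standby_w = 63.1  # step 6: bulb + computer in standby

active_w = bulb_plus_active_w - bulb_only_w    # active consumption
standby_w = bulb_plus_standby_w - bulb_only_w  # standby consumption

print(f"active: {active_w:.1f} W, standby: {standby_w:.1f} W")
```

The bulb keeps the meter up in a range where it reads accurately, and its contribution cancels out in the subtraction.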
I believe when I was having trouble with my ammeter measuring an AC
current accurately, I added an electric kettle to the circuit. (Not a
particularly good choice, but it was handy at the time.) The sum total
of kettle plus non-sinusoidal load seemed to be closer to the suspected
amount than measuring the non-sinusoidal load alone. Obviously, there
isn't a lot of accuracy in measuring that way, but it does avoid "order
of magnitude" type errors (my meter was *way* off). You can be a lot
further off than the 2% accuracy the product claims.
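One way to see why the kettle helps, under my assumption that the misbehaving meter is average-responding: the kettle's large sinusoidal current dominates the sum, so the meter's sine calibration is nearly right for the combined reading, even though it badly misreads the pulses alone.

```python
# Toy model: 8 A resistive kettle plus small SMPS-style current pulses.
import math

N = 10000
sine = [math.sin(2 * math.pi * n / N) for n in range(N)]
pulsed = [s if abs(s) > 0.95 else 0.0 for s in sine]     # SMPS pulses
combined = [8.0 * s + p for s, p in zip(sine, pulsed)]   # kettle + pulses

def true_rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def meter(x):
    # average-responding meter: mean(|i|) times the sine form factor
    return sum(abs(v) for v in x) / len(x) * (math.pi / (2 * math.sqrt(2)))

print(meter(pulsed) / true_rms(pulsed))      # pulses alone: ~50% low
print(meter(combined) / true_rms(combined))  # with kettle: ~1% low
```

The combined reading is close to the true total, which is why the sum looked about right; the price is that the small load is now a tiny difference between two big numbers, hence the limited accuracy.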
This isn't going to make a big difference in your $ calculations, but I
just wanted to put on the record that you cannot believe everything you
see (or measure).
Paul