Cost of running a PC?

  • Thread starter: terry smith

terry smith

How much does it cost power wise?

In the UK it is about 6-8p per kWh (I think) so....
going on 7p, and with my PC's 90 watt (don't laugh) power supply
(well, call it 100 watts for simplicity):

In one day I use 24 x 100 = 2.4 kWh, so about 17p, so in a year
that's... about £62!

Now these days 300 watts is more typical, so it's about £184, which is
quite a lot! (Are my figures correct? It seems quite high, more than
I pay for my internet connection (£150 p.a. dial-up).)

Note I have not included the monitor at all!!

Another point is that my PC will probably not be using the full
90 watts, or will it? How much will it use on average?
How much will it use when it is idle?
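
In case anyone wants to play with the arithmetic, here is a rough sketch in
Python (the wattage, hours and tariff are just the example figures above,
not measurements):

    # Rough annual running cost, assuming a constant average draw.
    def annual_cost_pounds(avg_watts, pence_per_kwh=7.0, hours_per_day=24.0):
        kwh_per_year = avg_watts / 1000.0 * hours_per_day * 365
        return kwh_per_year * pence_per_kwh / 100.0

    print(annual_cost_pounds(100))  # ~61 pounds/year at 100 W, 24/7
    print(annual_cost_pounds(300))  # ~184 pounds/year at 300 W, 24/7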
 
How much does it cost power wise?

That's an interesting question. I'd like to know myself.

I remember there was a big controversy about power usage when there
were brownouts in California. Some "experts" claimed the huge upsurge in PC
and high-tech usage in the '90s, combined with the lack of increases in
generation capacity, caused a big problem. Then suddenly there was
a claim going around that some guys had debunked that myth. They
claimed PCs really don't use that much power. I don't remember the
figures, but they claimed it was quite small compared to most other
things.

I'd like to know how much it does use.
 
terry said:
How much does it cost power wise?

Switching power supplies are only something like 2/3 efficient. So a
300W power supply at full capacity is using around 450W.
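
As a quick sketch of that calculation (the 2/3 efficiency is just the rough
figure above; real supplies vary):

    # Wall draw for a given DC load, assuming a fixed conversion efficiency.
    def wall_draw_watts(dc_load_watts, efficiency=2.0 / 3.0):
        return dc_load_watts / efficiency

    print(wall_draw_watts(300))  # ~450 W at the wall for a 300 W DC load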
 
How much does it cost power wise?

In the UK it is about 6-8p per kWh (I think) so....
going on 7p, and with my PC's 90 watt (don't laugh) power supply
(well, call it 100 watts for simplicity):

In one day I use 24 x 100 = 2.4 kWh, so about 17p, so in a year
that's... about £62!


Nope, if the power supply is 90W, you might be using closer
to 70W, in use, and a lot less when idle, particularly if
it's modern enough to use power management to spin down
drives, go into sleep mode, and use ACPI to halt-idle the
CPU.
Now these days 300 watts is more typical, so it's about £184, which is
quite a lot! (Are my figures correct? It seems quite high, more than
I pay for my internet connection (£150 p.a. dial-up).)

Power supply wattage rating is its capability, not its
energy usage. The typical PC running from a 300W PSU uses
less than 200W, sometimes a LOT less (yours could've had a
300W PSU in it instead, for example, but the power usage
would be almost identical, ignoring differences in
efficiency from one PSU to the next).

Note I have not included the monitor at all!!

If you have a 17-19" CRT, it's probably using as much as the
whole computer, until power management kicks in.

Another point is that my PC will probably not be using the full
90 watts, or will it? How much will it use on average?
How much will it use when it is idle?

Depends on the specific system. Nailing down an exact
figure isn't really necessary, a ballpark figure will do, but
if it bothers you that much then don't run it 24/7. In
wintertime it may supplement your heater, so with proper
environmental isolation your heater itself will run less.
In summer it's the opposite if you have AC: the heat is an
additional expense. Those watts used all end up as heat.
 
That's an interesting question. I'd like to know myself.

I remember there was a big controversy about power usage when there
were brownouts in California. Some "experts" claimed the huge upsurge in PC
and high-tech usage in the '90s, combined with the lack of increases in
generation capacity, caused a big problem. Then suddenly there was
a claim going around that some guys had debunked that myth. They
claimed PCs really don't use that much power. I don't remember the
figures, but they claimed it was quite small compared to most other
things.

I'd like to know how much it does use.

Seems like the trend towards multiple and larger television
sets might've had just as much of an impact. A lot of people
do turn off their systems when they're done with them, which
is all day at work and overnight, leaving only a 10-hour or
so window of opportunity.
 
Matt said:
Switching power supplies are only something like 2/3 efficient. So a
300W power supply at full capacity is using around 450W.

That's a pretty frightening figure, it's half an electric fire!
 
kony said:
Seems like the trend towards multiple and larger television
sets might've had just as much of an impact. A lot of people
do turn off their systems when they're done with them, which
is all day at work and overnight, leaving only a 10-hour or
so window of opportunity.

My PC tends to be on a lot, although I do use power management
(sometimes).
It is too much hassle to turn it off during the day in case I
might want to use it quickly.

If you were hosting your own web site you might
want it on 24/7.
 
kony said:
Nope, if the power supply is 90W, you might be using closer
to 70W, in use, and a lot less when idle, particularly if
it's modern enough to use power management to spin down
drives, go into sleep mode, and use ACPI to halt-idle the
CPU.


Power supply wattage rating is its capability, not its
energy usage. The typical PC running from a 300W PSU uses
less than 200W, sometimes a LOT less (yours could've had a
300W PSU in it instead, for example, but the power usage
would be almost identical, ignoring differences in
efficiency from one PSU to the next).



If you have a 17-19" CRT, it's probably using as much as the
whole computer, until power management kicks in.



Depends on the specific system. Nailing down an exact
figure isn't really necessary, a ballpark figure will do, but
if it bothers you that much then don't run it 24/7. In
wintertime it may supplement your heater, so with proper
environmental isolation your heater itself will run less.
In summer it's the opposite if you have AC: the heat is an
additional expense. Those watts used all end up as heat.

Yes, I realise that it would be a source of heat, like a couple
of light bulbs left on 24/7 (I have the low-energy ones now).
But it is very expensive heat compared to gas.
Well over 4 times more expensive.
 
That's an interesting question. I'd like to know myself.

I remember there was a big controversy about power usage when there
were brownouts in California. Some "experts" claimed the huge upsurge in PC
and high-tech usage in the '90s, combined with the lack of increases in
generation capacity, caused a big problem. Then suddenly there was
a claim going around that some guys had debunked that myth. They
claimed PCs really don't use that much power. I don't remember the
figures, but they claimed it was quite small compared to most other
things.

I'd like to know how much it does use.

Well, quite a bit if they are always on.
Big unnoticed uses of power are fridges and deep freezes.
I remember when I was young my mother was concerned about how
big the electricity bill was; she thought it was me always leaving lights
on.

Anyway, one day we turned all the electrical appliances off and looked at the
meter, which was still spinning pretty rapidly. We were a bit puzzled until
we realised that the fridge and deep freeze were still on.

A small fridge is about 100 watts (I think) and costs about £40 (or more)
a year to run.
So a computer using about 300 watts will cost well over £100 if used 24/7.
I guess the only way to find out is to do what we did and switch off
all your electrical appliances bar your computer, then look at the
electricity meter.

A while back I was trying to find out how much power my computer used
and this is obviously the easiest and most accurate method.
The other method involved loads of amp and volt meters!!

Be interesting if people with various computers switched off all
other devices and took a meter reading so we could compare the results.
 
Yes, good point about the LCD monitor; they do cost more,
but when you take into account the cost of power, which most
people ignore, they are probably pretty good value over their
average lifetime.

Just a comment on those benchmark tests: quite a lot of
power is used when the system is 'idle', 65%-75% is typical.

Those Pentium 4s are pretty hungry beasts!!

It would be interesting to know how long you need to run
a PC before you have spent more on power than it cost to
buy the PC!!
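
As a rough stab at that last question, here is a sketch using the 7p tariff
from earlier; the £500 purchase price is a made-up example figure:

    # Years of 24/7 running before electricity cost equals the purchase price.
    # The £500 price and 7p/kWh tariff are example figures, not real data.
    def payback_years(price_pounds, avg_watts, pence_per_kwh=7.0):
        annual_cost = avg_watts / 1000.0 * 24 * 365 * pence_per_kwh / 100.0
        return price_pounds / annual_cost

    print(round(payback_years(500, 100), 1))  # ~8.2 years at 100 W average draw
    print(round(payback_years(500, 300), 1))  # ~2.7 years at 300 W average draw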
 
That's a pretty frightening figure, it's half an electric fire!

Even the high-end modern systems with the fastest video cards
out there don't use 300W at peak... it'd take a large server
or SMP workstation to do that.
 
Yes, I realise that it would be a source of heat, like a couple
of light bulbs left on 24/7 (I have the low-energy ones now).
But it is very expensive heat compared to gas.
Well over 4 times more expensive.

Yes, gas is often cheaper... until your furnace goes out!
Last time I had to replace a circuit board for ignition, it
cost about $250 for a board that should've cost $10. If I
had to do it again I'd build one or repair it, but that's
not always possible during winter; you're in a bit of a rush
to get it working again.
 
terry said:
.... snip ...

A while back I was trying to find out how much power my computer
used and this is obviously the easiest and most accurate method.
The other method involved loads of amp and volt meters!!

Be interesting if people with various computers switched off all
other devices and took a meter reading so we could compare the
results.

I did this at a time when the refrigerator, water heater, and
furnace were all cycled off, and as far as I know did not come on
during the run. The technique is to time one turn of the 'motor'
wheel of the power meter, with a known minimum load.

I estimate my minimum load to be 82 watts, from a variety of
compact fluorescent bulbs. At any rate, the results:

computer    TV     Time (secs)
on          on      64
off         on      89
off         off    240
on          off    115
pwr down    off    157
From which I deduce total draw to be (19680 / secs) watts. This
gives power drain (watts):

TV 139
comp 87 to 91
comp in pwr down 43

The power down condition is after the screen has blanked and all
disks have spun down. This is on a '486 system. My estimated
error is 10 to 15%. The TV off condition is not zero power,
because it has a keep-alive system.
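
For anyone who wants to repeat this, here's a small sketch of the same method
in Python; the 82 W baseline and its 240 s revolution are the figures above,
which is where the 19680 constant comes from (82 x 240):

    # Meter-timing method: power in watts = K / seconds-per-revolution,
    # where K is calibrated from a known (or estimated) baseline load.
    K = 82 * 240  # watt-seconds per meter revolution (= 19680)

    def total_watts(seconds_per_rev):
        return K / seconds_per_rev

    # A device's draw is the reading with it on minus the reading with it off.
    computer_on = total_watts(115)  # baseline plus computer
    baseline    = total_watts(240)  # baseline only
    print(round(computer_on - baseline))  # ~89 W for the computer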
 
CBFalconer said:
I did this at a time when the refrigerator, water heater, and
furnace were all cycled off, and as far as I know did not come on
during the run. The technique is to time one turn of the 'motor'
wheel of the power meter, with a known minimum load.

I estimate my minimum load to be 82 watts, from a variety of
compact fluorescent bulbs. At any rate, the results:

computer    TV     Time (secs)
on          on      64
off         on      89
off         off    240
on          off    115
pwr down    off    157

gives power drain (watts):

TV 139
comp 87 to 91
comp in pwr down 43

The power down condition is after the screen has blanked and all
disks have spun down. This is on a '486 system. My estimated
error is 10 to 15%. The TV off condition is not zero power,
because it has a keep-alive system.
Yes, a problem with my method is it is a lot of hassle turning many devices
off because you will often wipe the memory, which can mean clocks
losing the time, TV devices losing their tuning and other stuff (VCR
settings for programs to record etc...) and answering machines losing their
stuff (pre-recorded messages, numbers etc...).
Hence I didn't take my own reading, because I was too worried about having
to reprogram stuff. I know some of my devices will retain their settings
without power, but I am not sure which ones.

So your method would be better, with a known minimum load.

Basically I would just need to turn the fridge off, as everything else will
be pretty constant.
I will try and post a reading later!!
 
terry said:
Yes, a problem with my method is it is a lot of hassle turning
many devices off because you will often wipe the memory, which can
mean clocks losing the time, TV devices losing their tuning and
other stuff (VCR settings for programs to record etc...) and
answering machines losing their stuff (pre-recorded messages,
numbers etc...). Hence I didn't take my own reading because I
was too worried about having to reprogram stuff. I know some of
my devices will retain their settings without power but I am not
sure which ones.

With my technique you can calibrate your meter with one known DELTA
value, provided that is an appreciable fraction of the total power
drain. You don't really need to know the minimum.
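
A sketch of that delta calibration, reusing the 240 s and 115 s revolution
times from the earlier post; the 100 W lamp and the 108 s reading after
switching it on are made-up example figures:

    # Calibrate the meter constant K from a known change in load:
    # delta_watts = K/secs_after - K/secs_before.
    def meter_constant(delta_watts, secs_before, secs_after):
        return delta_watts / (1.0 / secs_after - 1.0 / secs_before)

    def watts(k, seconds_per_rev):
        return k / seconds_per_rev

    # Switch on a 100 W lamp (hypothetical) and the revolution time
    # drops from 240 s to about 108 s.
    k = meter_constant(100, 240, 108)
    print(round(watts(k, 240)))                  # baseline load, ~82 W
    print(round(watts(k, 115) - watts(k, 240)))  # computer's share, ~89 W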
 
How much does it cost power wise?

In the UK it is about 6-8p per kWh (I think) so....
going on 7p, and with my PC's 90 watt (don't laugh) power supply
(well, call it 100 watts for simplicity):

In one day I use 24 x 100 = 2.4 kWh, so about 17p, so in a year
that's... about £62!

Now these days 300 watts is more typical, so it's about £184, which is
quite a lot! (Are my figures correct? It seems quite high, more than
I pay for my internet connection (£150 p.a. dial-up).)

Note I have not included the monitor at all!!

Another point is that my PC will probably not be using the full
90 watts, or will it? How much will it use on average?
How much will it use when it is idle?

A regular desktop PC will consume somewhere around 30W-60W.
A gaming rig might eat 75-100W. Hard drives are 8-10W each
(which adds up if you have multiples). A 19" monitor eats 140W
or so (LCD displays are down around 40W). One of these days,
I'll hook the power meter up and get numbers off my systems.

LCD displays still aren't quite cheap enough to be worth
it based on power savings. Figure 2000 hours of use per
year (that's a 40 hour workweek, plus 2 weeks vacation),
and you'll find that it takes 5 years or so to make up
the cost difference.

What might tip the balance in favor of LCDs is cooling
needs, as they throw off less heat, but that's a real
tough number to guess. Or if you're space-limited, LCDs
are worth the premium.
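
Putting some numbers on that payback estimate: the 100 W saving and 2000
hours/year are the figures above, while the tariff and the price differences
are made-up examples:

    # Payback time for an LCD's power saving over a CRT.
    # 100 W saving and 2000 hours/year are from the post above;
    # the 7p/kWh tariff and the price differences are example figures.
    def lcd_payback_years(price_diff_pounds, watts_saved=100,
                          hours_per_year=2000, pence_per_kwh=7.0):
        saving_per_year = watts_saved / 1000.0 * hours_per_year * pence_per_kwh / 100.0
        return price_diff_pounds / saving_per_year

    print(round(lcd_payback_years(70), 1))   # ~5 years, roughly the estimate above
    print(round(lcd_payback_years(150), 1))  # ~10.7 years with a bigger price gap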
 
How much does it cost power wise?
=A regular desktop PC will consume somewhere around 30W-
=60W. Gaming rig might eat 75-100W. Hard drives are 8-
=10W (if you have multiples). A 19" monitor eats 140W or
=so (LCD displays are down around 40W). One of these
=days, I'll hook the power meter up and get numbers off
=my systems.

=LCD displays still aren't quite cheap enough to be worth
=it based on power savings. Figure 2000 hours of use per
=year (that's a 40 hour workweek, plus 2 weeks vacation),
=and you'll find that it takes 5 years or so to make up
=the cost difference.

(Firstly, sorry about the indent problem.)
Well, I reckon I would probably hit the 40 hours a week,
and my current monitor is over 5 years old, so
maybe it would be worth it for me.
Also, what is the lifetime of an LCD monitor? I would assume
a very long time, or until you drop or knock it over.

=What might tip the balance in favor of LCDs is cooling
=needs, as they throw off less heat, but that's a real
=tough number to guess. Or if you're space-limited, LCDs
=are worth the premium.

Yeah, I think I could make do with more space on my desktop too.

What puts me off is the big 'upfront' spend; it would be like money
in the bank for me, paying a little interest back each year.

Also, if LCD monitors do have a very long lifetime the resale
value might be considerable.
I would not expect a second-hand vacuum tube monitor to
last too long.

LCD screens do tend to be a bit duller in my opinion, so I
will have to see if the picture quality is good enough for me.
 
kony said:
Yes, gas is often cheaper... until your furnace goes out!
Last time I had to replace a circuit board for ignition, it
cost about $250 for a board that should've cost $10. If I
had to do it again I'd build one or repair it, but that's
not always possible during winter; you're in a bit of a rush
to get it working again.

Yep, sounds like you were fleeced, but even so it's still cheaper in
the long run.
Fortunately my brother is a qualified gas service engineer trained
by British Gas, so I would imagine he would be able to sort me
out something cheaper should the worst come to the worst.
Probably not much to those circuit boards anyway.
 
Toshi1873 said:
.... snip ...

LCD displays still aren't quite cheap enough to be worth
it based on power savings. Figure 2000 hours of use per
year (that's a 40 hour workweek, plus 2 weeks vacation),
and you'll find that it takes 5 years or so to make up
the cost difference.

What might tip the balance in favor of LCDs is cooling
needs, as they throw off less heat, but that's a real
tough number to guess. Or if you're space-limited, LCDs
are worth the premium.

If you are air-conditioning, it takes at least 2 watts to get rid
of one watt of heat. So, for cost estimates, at least triple the
display power usage.
 